Abstract
Time series are ubiquitous in the real world, and many of them are long time series, such as weather records and industrial production records. The inherent long-term dependencies of long time series place extremely high demands on a model's feature-extraction ability, and the sheer sequence length directly incurs high computational cost, requiring the model to be efficient. This paper proposes Concatenation-Informer, which contains a Pre-distilling operation and a Concatenation-Attention operation, to forecast long time series. The Pre-distilling operation reduces the length of the series and effectively extracts context-related features. The Concatenation-Attention operation concatenates the input and output of the attention mechanism to improve parameter efficiency. The total space complexity of Concatenation-Informer is lower than that of the Informer.
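The Concatenation-Attention operation described in the abstract can be illustrated with a minimal sketch: standard scaled dot-product self-attention whose output is concatenated with its input along the feature axis. This is a hedged reading of the abstract only, not the paper's implementation; the function name `concat_attention`, the single-head formulation, and all dimensions are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def concat_attention(x, wq, wk, wv):
    """Sketch of Concatenation-Attention (assumed form): single-head
    scaled dot-product self-attention whose output is concatenated
    with its input along the feature dimension."""
    q, k, v = x @ wq, x @ wk, x @ wv
    d = q.shape[-1]
    scores = softmax(q @ k.T / np.sqrt(d))   # (L, L) attention weights
    attn_out = scores @ v                    # (L, d) attended values
    # concatenating input and output doubles the feature width,
    # letting later layers reuse the raw input alongside the attention result
    return np.concatenate([x, attn_out], axis=-1)  # (L, 2*d)

# toy example: sequence length 8, model width 4
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))
wq, wk, wv = (rng.normal(size=(4, 4)) for _ in range(3))
y = concat_attention(x, wq, wk, wv)
print(y.shape)  # (8, 8): input and attention output side by side
```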
Original language | English |
---|---|
Title of host publication | ICAC 2023 - 28th International Conference on Automation and Computing |
Publisher | IEEE |
ISBN (Electronic) | 979-8-3503-3585-9 |
ISBN (Print) | 979-8-3503-3586-6 |
DOIs | |
Publication status | Published - 16 Oct 2023 |
Event | 2023 28th International Conference on Automation and Computing (ICAC) - Birmingham, United Kingdom Duration: 30 Aug 2023 → 1 Sept 2023 |
Publication series
Name | 2023 28th International Conference on Automation and Computing (ICAC) |
---|---|
Publisher | IEEE |
Conference
Conference | 2023 28th International Conference on Automation and Computing (ICAC) |
---|---|
Country/Territory | United Kingdom |
City | Birmingham |
Period | 30/08/23 → 1/09/23 |
Bibliographical note
Funding Information: This work was supported by the National Natural Science Foundation of China [grant number U21B6002].