Eprint first made available on ORBilu (E-prints, Working papers and Research blog)
Transformer Multivariate Forecasting: Less is More?
XU, Jingjing; WU, Caesar (ming-wei); Li, Yuan-Fang et al.
2023
 

Files


Full Text
AAAI_AI4TS_2024_final_JX.pdf
Author preprint (9.34 MB)

All documents in ORBilu are protected by a user license.




Details



Abstract :
[en] In the domain of multivariate forecasting, transformer models stand out as powerful tools, displaying exceptional capabilities in handling messy datasets from real-world contexts. However, the inherent complexity of these datasets, characterized by numerous variables and lengthy temporal sequences, poses challenges, including increased noise and extended model runtime. This paper focuses on reducing redundant information to elevate forecasting accuracy while optimizing runtime efficiency. We propose a novel transformer forecasting framework enhanced by Principal Component Analysis (PCA) to tackle this challenge. The framework is evaluated with five state-of-the-art (SOTA) models on four diverse real-world datasets. Our experimental results demonstrate the framework's ability to minimize prediction errors across all models and datasets while significantly reducing runtime. From the model perspective, one of the PCA-enhanced models, PCA+Crossformer, reduces mean squared error (MSE) by 33.3% and decreases runtime by 49.2% on average. From the dataset perspective, the framework delivers a 14.3% MSE reduction and a 76.6% runtime reduction on the Electricity dataset, as well as a 4.8% MSE reduction and an 86.9% runtime reduction on the Traffic dataset. This study aims to advance various SOTA models and enhance transformer-based time series forecasting for intricate data.
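To make the pipeline described in the abstract concrete, the following is a minimal sketch (Python, using scikit-learn) of the general idea: project the many input variables onto a small number of principal components before fitting a transformer forecaster. The component count, the toy data shapes, and the TransformerForecaster name are illustrative assumptions, not details taken from the paper.

# Minimal illustrative sketch (not the authors' implementation): apply PCA to a
# multivariate series before handing the reduced channels to a transformer model.
import numpy as np
from sklearn.decomposition import PCA

def pca_reduce(series: np.ndarray, n_components: int = 8) -> np.ndarray:
    """Project a (time_steps, variables) series onto its leading principal components."""
    pca = PCA(n_components=n_components)
    return pca.fit_transform(series)  # shape: (time_steps, n_components)

# Toy usage: a wide multivariate series reduced to 8 channels.
raw_series = np.random.randn(10_000, 321)                # placeholder data, not a real benchmark
reduced_series = pca_reduce(raw_series, n_components=8)
# forecaster = TransformerForecaster(input_dim=reduced_series.shape[1])  # hypothetical model class
# forecaster.fit(reduced_series)                                         # train on the reduced series

Shrinking the input dimensionality in this way reduces redundant channels fed to the model, which is consistent with the accuracy and runtime improvements reported in the abstract.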
Disciplines :
Computer science
Author, co-author :
XU, Jingjing  ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
WU, Caesar (ming-wei)  ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > PCOG
Li, Yuan-Fang;  Monash University [AU]
BOUVRY, Pascal ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Language :
English
Title :
Transformer Multivariate Forecasting: Less is More?
Publication date :
2023
FnR Project :
FNR15748747 - Investigating Graph Neural Networks For Open-domain Question Answering, 2021 (01/06/2021-31/05/2025) - Jingjing Xu
Funders :
FNR - Luxembourg National Research Fund
Funding text :
This work was funded by the Luxembourg National Research Fund (Fonds National de la Recherche - FNR), Grant ID 15748747 and Grant ID C21/IS/16221483/CBD.
Available on ORBilu :
since 19 January 2024

Statistics


Number of views
107 (11 by Unilu)
Number of downloads
85 (5 by Unilu)
