Keywords :
[en] machine learning lightness; power consumption; forecasting; transformer
Abstract :
[en] Energy demand forecasting is one of the most challenging tasks for grid operators.
Many approaches have been suggested over the years to tackle it. Yet, these approaches remain too expensive to train in terms of both time and computational resources, which hinders their adoption as customer behaviors continuously evolve.
We introduce Transplit, a new lightweight transformer-based model, which significantly reduces this cost by exploiting the seasonality property and learning typical days of power demand. We show that Transplit runs efficiently on CPU and is several hundred times faster than state-of-the-art predictive models, while performing just as well.
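The abstract's core idea, exploiting daily seasonality by learning "typical days" of demand, can be illustrated with a minimal sketch. This is a hypothetical illustration only, not the authors' Transplit implementation: it splits a synthetic hourly demand series into day-long segments and extracts an average 24-hour profile, the kind of daily-periodic structure such a model can exploit.

```python
import numpy as np

# Hypothetical sketch (NOT the Transplit implementation): exploit the daily
# seasonality of power demand by reshaping an hourly series into day-long
# segments and extracting a "typical day" profile.

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)

# Synthetic hourly demand: baseline + daily sinusoidal cycle + noise.
demand = 100 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# One row per day: shape (365, 24).
days = demand.reshape(-1, 24)

# Average 24-hour profile across all days: shape (24,).
typical_day = days.mean(axis=0)

print(days.shape)
print(typical_day.shape)
```

A model operating on these compact daily profiles processes far shorter sequences than one attending over the raw hourly series, which is one plausible route to the large speed-ups the abstract reports.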
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Security Design and Validation Research Group (SerVal)
Disciplines :
Computer science
Author, co-author :
Bernier, Fabien ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
Jimenez, Matthieu ; University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
Cordy, Maxime ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
Le Traon, Yves ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
External co-authors :
no
Language :
English
Title :
Faster and Cheaper Energy Demand Forecasting at Scale
Publication date :
02 December 2022
Event name :
Has it Trained Yet? Workshop at the Conference on Neural Information Processing Systems
Event date :
02-12-2022
Audience :
International
Main work title :
Has it Trained Yet? Workshop at the Conference on Neural Information Processing Systems