Reference : Faster and Cheaper Energy Demand Forecasting at Scale
Scientific congresses, symposiums and conference proceedings : Paper published in a book
Engineering, computing & technology : Computer science
Computational Sciences
http://hdl.handle.net/10993/53152
Faster and Cheaper Energy Demand Forecasting at Scale
English
Bernier, Fabien
Jimenez, Matthieu
Cordy, Maxime
Le Traon, Yves
2-Dec-2022
Has it Trained Yet? Workshop at the Conference on Neural Information Processing Systems
Yes
International
[en] machine learning lightness ; power consumption ; forecasting ; transformer
[en] Energy demand forecasting is one of the most challenging tasks for grid operators. Many approaches have been suggested over the years to tackle it, yet they remain too expensive to train, in terms of both time and computational resources, which hinders their adoption as customer behaviors continuously evolve. We introduce Transplit, a new lightweight transformer-based model that significantly reduces this cost by exploiting the seasonality property and learning typical days of power demand. We show that Transplit runs efficiently on CPU and is several hundred times faster than state-of-the-art predictive models, while performing just as well.
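The abstract describes the approach only at a high level: segmenting a load series by its daily seasonality so the model reasons over typical days rather than individual time steps. The sketch below is a minimal, illustrative PyTorch example of that general idea; it is not the authors' Transplit implementation, and the class name, hourly resolution, and hyperparameters are assumptions made for the example.

```python
# Illustrative sketch only (not the authors' released code): a load series is split
# into day-long segments, each day is embedded as one token, and a small transformer
# encoder attends over days. All names and hyperparameters are assumptions.
import torch
import torch.nn as nn

HOURS_PER_DAY = 24  # assumes hourly readings

class DaySegmentForecaster(nn.Module):
    def __init__(self, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Embed each day (24 hourly readings) as a single token.
        self.day_embed = nn.Linear(HOURS_PER_DAY, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        # Decode the last day token into a 24-hour forecast.
        self.head = nn.Linear(d_model, HOURS_PER_DAY)

    def forward(self, series):
        # series: (batch, n_days * 24) -> (batch, n_days, 24)
        days = series.view(series.size(0), -1, HOURS_PER_DAY)
        tokens = self.day_embed(days)            # (batch, n_days, d_model)
        encoded = self.encoder(tokens)           # attention over days, not hours
        return self.head(encoded[:, -1])         # next-day forecast: (batch, 24)

# Toy usage: one week of hourly load for a batch of 8 customers.
x = torch.randn(8, 7 * HOURS_PER_DAY)
print(DaySegmentForecaster()(x).shape)  # torch.Size([8, 24])
```

Because attention is computed over a handful of day tokens instead of hundreds of hourly steps, a model of this shape stays small enough to train and run on CPU, which is the kind of cost reduction the abstract claims for Transplit.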
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Security Design and Validation Research Group (SerVal)
Secure, reliable and predictable smart grid
Researchers