Keywords :
explainable artificial intelligence; energy time-series forecasting; model interpretability; load/price/generation prediction; domain-specific XAI
Abstract :
[en] Despite the growing use of Explainable Artificial Intelligence (XAI) in energy time-series forecasting, systematic evaluation of explanation quality remains limited. This systematic review analyzes 50 peer-reviewed studies (2020–2025) applying XAI to load, price, or renewable generation forecasting. Using a PRISMA-inspired protocol, we introduce a dual-axis taxonomy and a four-factor framework — covering global transparency, local fidelity, user relevance, and operational viability — to structure our qualitative synthesis. Our analysis reveals that XAI application is not uniform but follows three distinct, domain-specific paradigms: a user-centric approach in load forecasting, a risk-management approach in price forecasting, and a physics-informed approach in generation forecasting. Post hoc methods, particularly SHAP, dominate the literature (62% of studies), while rigorous testing of explanation robustness and the reporting of computational overhead (reported in only 23% of studies) remain critical gaps. We identify key research directions, including the need for standardized robustness testing and human-centered design, and provide actionable guidelines for practitioners.
FRANK, Raphaël ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > Ubiquitous and Intelligent Systems (UBI-X)
External co-authors :
No
Language :
English
Title :
A Four-Dimensional Analysis of Explainable AI in Energy Forecasting: A Domain-Specific Systematic Review
Funding :
This research was funded by the Luxembourg National Research Fund (FNR) under the National Centre of Excellence in Research (NCER) programme, as part of the D2ET (Data-Driven Energy Transition) grant, number 38/44D2ET.