Abstract:
Despite the growing use of Explainable Artificial Intelligence (XAI) in energy time-series forecasting, systematic evaluation of explanation quality remains limited. This systematic review analyzes 50 peer-reviewed studies (2020–2025) applying XAI to load, price, or renewable generation forecasting. Using a PRISMA-inspired protocol, we introduce a dual-axis taxonomy and a four-factor framework covering global transparency, local fidelity, user relevance, and operational viability to structure our qualitative synthesis. Our analysis reveals that XAI application is not uniform but follows three distinct, domain-specific paradigms: a user-centric approach in load forecasting, a risk-management approach in price forecasting, and a physics-informed approach in generation forecasting. Post hoc methods, particularly SHAP, dominate the literature (62% of studies), while rigorous testing of explanation robustness and the reporting of computational overhead (addressed in only 23% of studies) remain critical gaps. We identify key research directions, including the need for standardized robustness testing and human-centered design, and provide actionable guidelines for practitioners.
Funding:
This research was funded by the Luxembourg National Research Fund (FNR), under the National Center of Excellence in Research (NCER) program as part of the D2ET (Data-Driven Energy Transition) grant number 38/44D2ET.