Dual-model approach for concurrent forecasting of electricity prices and loads in smart grids: Comparison of sparse encoder NAR and GA-optimized LSTM



Abstract

Accurate forecasting of electricity prices and loads in smart grids is challenging because of the strong interdependence between load and price. To address this, we propose two deep recurrent neural network models that forecast load and price concurrently. The first, the Sparse Encoder Nonlinear Autoregressive Network with Exogenous Inputs (SENARX), introduces a sparse encoder for enhanced feature extraction, combined with nonlinear autoregression on exogenous inputs. The second, GA-LSTM, integrates a Long Short-Term Memory network with genetic-algorithm-based optimization to improve forecasting accuracy and robustness. Both models were evaluated on ISO New England data and outperformed benchmark models: SENARX achieved MAPE values of 0.03 for load and 0.08 for price forecasting, while GA-LSTM achieved MAPE values of 1.53 and 1.91, respectively. Both models show promising potential for real-time forecasting in smart grids. The paper also presents a comparative study of SENARX and GA-LSTM against traditional methods such as ARIMA, SVM, and Bayesian networks, using market data from EPEX (Europe), IEX (India), and JEPX (Japan). SENARX achieved a MAPE of 3.82% (EPEX) and 4.13% (IEX), while GA-LSTM reached an RMSE of 27.02 MW (EPEX) and 29.33 MW (JEPX). Compared with ARIMA (MAPE: 6.57%-7.21%; RMSE: up to 48.74 MW), the proposed models improved accuracy by more than 40%. SENARX also trained faster (2385 s versus 3100 s for ARIMA). GA-LSTM showed faster convergence and lower error rates, and SENARX was robust to data noise. These characteristics make both models suitable for short-term load forecasting in dynamic and uncertain markets. Future work will test their performance under extreme events such as peak demand and climate anomalies.
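The abstract quotes accuracy as MAPE (in percent) and RMSE (in MW). For reference, a minimal sketch of these standard error metrics as they are conventionally defined (the paper's exact evaluation code is not shown here, so this is an assumption of the usual formulas):

```python
import math

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent.

    Assumes all actual values are nonzero (typical for load/price series).
    """
    n = len(actual)
    return 100.0 * sum(abs(a - f) / abs(a) for a, f in zip(actual, forecast)) / n

def rmse(actual, forecast):
    """Root Mean Squared Error, in the units of the series (e.g. MW)."""
    n = len(actual)
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / n)

# Illustrative values only, not data from the paper:
loads_actual = [100.0, 200.0, 300.0]
loads_pred = [110.0, 190.0, 330.0]
print(round(mape(loads_actual, loads_pred), 2))  # percentage error
print(round(rmse(loads_actual, loads_pred), 2))  # error in MW
```

Because MAPE is scale-free while RMSE carries the units of the series, the two figures quoted for SENARX and GA-LSTM are not directly comparable with each other, only against the same metric for a benchmark such as ARIMA.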
