A novel parametric scaled exponential linear unit activation function for deep residual networks in short-term load forecasting

Abstract

Short-Term Load Forecasting (STLF) is essential for ensuring the stability and operational efficiency of modern power systems. Deep Residual Networks (DRNs) have recently demonstrated promising performance in this domain, enabling more effective training of deeper architectures. However, existing activation functions such as the Scaled Exponential Linear Unit (SELU) rely on fixed parameters and strict initialization, which may limit their adaptability to seasonally varying load–weather conditions. This study introduces a modified activation function, the Parametric Scaled Exponential Linear Unit (PSELU), which incorporates a small tunable parameter γ in the negative region, extending the formulation of SELU while preserving its self-normalizing characteristics. Experiments conducted within the DRN framework on two benchmark datasets—ISO-NE (temperate climate) and Malaysia (tropical climate)—demonstrate that the DRN model employing the proposed PSELU (γ = 0.02) achieves modest yet consistent improvements in forecasting accuracy compared with the DRN model using SELU. Specifically, the Mean Absolute Percentage Error (MAPE) decreased from 1.718% to 1.662% for ISO-NE and from 5.251% to 5.012% for Malaysia. Although the improvements are moderate, they were statistically validated through 10,000-iteration Bootstrap resampling at the 95% confidence level. These results suggest that the limited parameterization of SELU enhances consistency and adaptability in forecasting performance across different climatic and seasonal conditions. Future work will expand the evaluation to a wider range of datasets and model architectures to further examine the generalizability and practical applicability of PSELU in diverse forecasting contexts.
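To make the modification concrete, the sketch below contrasts standard SELU with one plausible PSELU formulation. The SELU constants are the standard self-normalizing values; the placement of γ as an additional linear term on the negative branch is an assumption for illustration, since the abstract does not state the exact equation, and the paper's formulation may differ.

```python
import math

# Standard SELU self-normalizing constants (Klambauer et al., 2017)
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x: float) -> float:
    """Standard SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise."""
    return LAMBDA * x if x > 0 else LAMBDA * ALPHA * (math.exp(x) - 1.0)

def pselu(x: float, gamma: float = 0.02) -> float:
    """Hypothetical PSELU sketch: adds a small tunable term gamma * x to the
    negative branch of SELU (assumed form; the paper's exact definition may differ)."""
    if x > 0:
        return LAMBDA * x
    return LAMBDA * (ALPHA * (math.exp(x) - 1.0) + gamma * x)
```

With γ = 0 this reduces exactly to SELU on both branches, and the positive branch is untouched, which is consistent with the abstract's claim that PSELU "extends the formulation of SELU while preserving its self-normalizing characteristics."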
