Abstract
The relentless scaling of integrated circuits (ICs) into the nanoscale regime has intensified critical reliability challenges, such as Negative Bias Temperature Instability (NBTI), which manifests primarily as a progressive shift in the transistor threshold voltage ($V_{th}$). This study focuses on the prognostics of digital ICs in which Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs) serve as the fundamental building blocks. Accurate prognostics of device degradation are paramount for predicting circuit lifetime, yet traditional models struggle with the nonlinearity of the degradation process. To address this challenge, a novel fusion architecture is proposed that synergistically combines the Sparrow Search Algorithm (SSA) with a Long Short-Term Memory (LSTM) network. The resulting SSA-LSTM model automates the optimization of crucial LSTM hyperparameters, including the number of hidden units, the learning rate, and the iteration count, thereby enhancing the capability to learn complex temporal degradation patterns. Empirical Mode Decomposition (EMD) is further integrated as a pre-processing step to denoise the temporal data. The proposed model significantly reduces the mean absolute error of $V_{th}$ prediction, outperforming benchmark models such as PSO and EMD-PSO. It accurately captures the nonlinear trajectory of degradation curves and exhibits high sensitivity for early anomaly detection, providing a high-precision, data-driven solution for the reliability evaluation and prediction of digital ICs.
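The SSA-driven hyperparameter search summarized above can be sketched as follows. This is a simplified, illustrative implementation only: the search-space bounds, the toy objective (a stand-in for the LSTM's validation error on $V_{th}$ degradation data, since training a real network here would be impractical), and the exact producer/scrounger update rules are assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical search space: (hidden units, learning rate, iteration count).
# Bounds are illustrative assumptions, not values from the paper.
LOWER = np.array([8.0, 1e-4, 50.0])
UPPER = np.array([256.0, 1e-1, 500.0])


def objective(x):
    """Stand-in for the LSTM validation error at hyperparameters x.

    A real SSA-LSTM would train an LSTM and return its prediction error;
    this toy surface just penalizes normalized distance from an arbitrary
    'good' configuration so the sketch runs quickly.
    """
    target = np.array([64.0, 1e-2, 200.0])
    return float(np.sum(((x - target) / (UPPER - LOWER)) ** 2))


def ssa_optimize(n_sparrows=20, n_iter=50, producer_frac=0.2, safety=0.8):
    """Simplified Sparrow Search Algorithm over the bounded space."""
    dim = LOWER.size
    pop = LOWER + rng.random((n_sparrows, dim)) * (UPPER - LOWER)
    fitness = np.array([objective(x) for x in pop])
    n_prod = max(1, int(producer_frac * n_sparrows))

    for _ in range(n_iter):
        order = np.argsort(fitness)
        pop, fitness = pop[order], fitness[order]
        best, worst = pop[0].copy(), pop[-1].copy()
        alarm = rng.random()  # danger signal shared by the flock
        for i in range(n_sparrows):
            if i < n_prod:
                # Producers: search widely while the alarm value is low.
                if alarm < safety:
                    pop[i] *= np.exp(-i / (rng.random() * n_iter + 1e-9))
                else:
                    pop[i] += rng.normal(size=dim) * (UPPER - LOWER) * 0.1
            elif i > n_sparrows // 2:
                # Worst scroungers: relocate relative to the worst sparrow.
                pop[i] = rng.normal() * np.exp((worst - pop[i]) / (i ** 2))
            else:
                # Remaining scroungers: move toward the current best.
                pop[i] = best + np.abs(pop[i] - best) * rng.choice([-1, 1], dim)
            pop[i] = np.clip(pop[i], LOWER, UPPER)
            fitness[i] = objective(pop[i])

    best_idx = int(np.argmin(fitness))
    return pop[best_idx], fitness[best_idx]


best_x, best_f = ssa_optimize()
print("best hyperparameters:", best_x, "fitness:", best_f)
```

In the full method, `objective` would train the LSTM with the candidate hyperparameters on the EMD-denoised $V_{th}$ series and return the resulting prediction error, so the SSA loop directly minimizes the quantity the paper reports.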