Abstract
Neural network (NN) models often struggle to remain accurate when the underlying system drifts from training conditions, necessitating costly and time-consuming model maintenance. Traditional approaches, such as full retraining or fine-tuning, are either computationally intensive or prone to generalization loss, and both require extensive hyperparameter tuning. To address these limitations, we propose the Subset Extended Kalman Filter (SEKF), an online learning method that updates only a strategically selected subset of network parameters to maintain model fidelity as system dynamics evolve. SEKF identifies the NN parameters with the greatest impact on prediction error by analyzing the gradient of the training loss function, then applies Kalman filtering to update only these critical parameters when new data become available. This targeted adaptation preserves the network's structure while enabling efficient, real-time updates with predictable computational overhead and minimal manual effort. We validate SEKF through four case studies, ranging from synthetic systems to complex industrial processes, including a fluid catalytic cracker. Across all scenarios, SEKF outperforms conventional retraining and fine-tuning methods in both accuracy and efficiency, reducing the computational time per iteration. The approach offers a practical path toward adaptive neural network deployment in industrial settings, where maintaining accuracy as the underlying physical system evolves is essential.
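To make the two-stage idea in the abstract concrete, here is a minimal sketch of a subset Kalman filter update: select the parameters with the largest training-loss gradient, then apply an EKF-style measurement update to that subset only. The linear "network" f(theta, x) = theta @ x, the helper names, and all numerical settings are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def select_subset(theta, X, y, k):
    """Indices of the k parameters with the largest |dL/dtheta|
    for squared loss L over the batch (X, y)."""
    resid = y - X @ theta
    grad = -2.0 * X.T @ resid / len(y)
    return np.argsort(np.abs(grad))[-k:]

def sekf_update(theta, P, x, y, idx, R=1e-3):
    """One EKF measurement update restricted to the subset idx.
    For this linear model the Jacobian row is simply x[idx]."""
    H = x[idx].reshape(1, -1)                  # 1 x k Jacobian of f w.r.t. subset
    S = (H @ P @ H.T).item() + R               # innovation variance
    K = (P @ H.T) / S                          # k x 1 Kalman gain
    theta = theta.copy()
    theta[idx] += (K * (y - theta @ x)).ravel()  # correct subset parameters
    P = (np.eye(len(idx)) - K @ H) @ P         # covariance update for the subset
    return theta, P

# Toy drift scenario: two of four parameters changed after training.
rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0, 0.5, 3.0])
theta = theta_true + np.array([0.0, 0.0, 1.0, -1.0])  # drifted model

X = rng.standard_normal((200, 4))
y = X @ theta_true
idx = select_subset(theta, X, y, k=2)          # gradient flags the drifted pair

P = np.eye(2)                                  # covariance over the subset only
for x_t, y_t in zip(X[:50], y[:50]):
    theta, P = sekf_update(theta, P, x_t, y_t, idx)
```

Because the filter state and covariance cover only the k selected parameters, each update costs O(k^2) rather than scaling with the full network size, which is the source of the predictable per-iteration overhead the abstract refers to.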