Efficient Channel Attention-Gated Graph Transformer for Aero-Engine Remaining Useful Life Prediction



Abstract

Reliable estimation of the remaining useful life (RUL) of aero-engines plays a vital role in guaranteeing flight safety, increasing system dependability, and minimizing maintenance expenditure. Despite progress, many current deep learning techniques still encounter difficulties in effectively modeling the local sequential dependencies across multisource sensor readings and in capturing the progressive degradation behavior over extended operational periods. To overcome these drawbacks, this study introduces a new predictive framework termed the efficient channel attention (ECA)-gated graph transformer (EGG-Transformer), which synergistically combines graph convolutional networks (GCNs), an adaptive feature fusion mechanism based on ECA, and a transformer-based temporal encoder. The GCN facilitates learning of local time-step structures to boost the local sequential feature interpretation. The ECA-gated fusion module dynamically merges the raw input signals with structural information, leading to more expressive representations and reduced signal decay. Thereafter, the transformer encodes long-range temporal patterns across the engine lifespan. Validation on the benchmark C-MAPSS data set confirms the superiority of the proposed approach, which delivers consistently improved accuracy across all subdata sets, achieving a 22.85% reduction in mean RMSE and a 10.21% decrease in the Score index compared to recent leading methods, highlighting its robustness and practical applicability in RUL forecasting.
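To make the ECA gating idea concrete, the sketch below shows the standard efficient channel attention pattern (squeeze by global average pooling, local cross-channel 1-D convolution, sigmoid excitation) applied to a multisensor window. This is a minimal NumPy illustration of the general ECA mechanism, not the authors' implementation: the kernel weights are fixed placeholders here, whereas in the actual model they are learned, and the channel/window sizes are arbitrary assumptions.

```python
import numpy as np

def eca_gate(x, k=3):
    """Efficient Channel Attention gate (illustrative sketch).

    x : array of shape (channels, time) -- one multisensor window.
    k : size of the local cross-channel 1-D convolution kernel.
    Returns x reweighted per channel by a sigmoid attention gate.
    """
    c, t = x.shape
    # 1) Squeeze: global average pooling over the time axis.
    pooled = x.mean(axis=1)                        # shape (c,)
    # 2) Local cross-channel interaction: 1-D conv of width k over
    #    the channel descriptor (uniform weights here for
    #    illustration; these are learned in the real model).
    w = np.ones(k) / k                             # placeholder kernel
    pad = k // 2
    padded = np.pad(pooled, pad, mode="edge")
    scores = np.convolve(padded, w, mode="valid")  # shape (c,)
    # 3) Excite: sigmoid gate in (0, 1), broadcast over time.
    gate = 1.0 / (1.0 + np.exp(-scores))           # shape (c,)
    return x * gate[:, None]

# Example: 14 sensor channels, 30-step window (C-MAPSS-like shapes,
# chosen here only for illustration).
rng = np.random.default_rng(0)
x = rng.normal(size=(14, 30))
y = eca_gate(x)
print(y.shape)  # (14, 30)
```

Because the gate lies strictly in (0, 1), each channel is attenuated rather than amplified; in the EGG-Transformer this gating is used inside the fusion module to weight structural features against the raw signals before the transformer encoder.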
