Remaining Useful Life (RUL) prediction is a crucial task in predictive maintenance. Gated recurrent networks, hybrid models, and attention-enhanced models currently used for predictive maintenance struggle to balance prediction accuracy against model lightweighting when extracting complex degradation features, which hinders their deployment on resource-constrained edge devices. To address this issue, we propose TBiGNet, a lightweight Transformer-based classification network for RUL prediction. TBiGNet adopts an encoder-decoder architecture that outperforms traditional Transformer models, achieving over 15% higher accuracy while reducing computational load, memory access, and parameter count by more than 98%. In the encoder, we optimize the attention mechanism by fusing the separate linear mappings of queries, keys, and values into a single efficient operation, cutting memory access overhead by 60%. In addition, an adaptive feature pruning module dynamically selects critical features according to their importance, reducing redundancy and improving accuracy by 6%. The decoder innovatively fuses two different types of features and leverages a BiGRU to compensate for the attention mechanism's limitations in capturing degradation features, yielding a further 7% accuracy improvement. Extensive experiments on the C-MAPSS dataset demonstrate that TBiGNet surpasses existing methods in prediction accuracy, model size, and memory access, showing significant technical advantages and application potential.
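The two encoder-side ideas in the abstract, fusing the separate query/key/value projections into one operation and pruning feature channels by importance, can be sketched roughly as below. The dimensions, the single-head attention, and the importance score (mean absolute activation) are illustrative assumptions for the sketch, not the paper's exact design.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper does not fix these in the abstract.
seq_len, d_model = 30, 64
x = rng.standard_normal((seq_len, d_model))

# Fused QKV projection: instead of three separate weight matrices, one
# (d_model, 3*d_model) matrix produces Q, K, V in a single matmul, so the
# input activations are read from memory once rather than three times.
w_qkv = rng.standard_normal((d_model, 3 * d_model)) / np.sqrt(d_model)
q, k, v = np.split(x @ w_qkv, 3, axis=-1)

# Scaled dot-product attention (single head for brevity).
scores = q @ k.T / np.sqrt(d_model)
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
attn_out = weights @ v                      # (seq_len, d_model)

# Adaptive feature pruning (sketch): score each feature channel by its
# mean absolute activation and keep only the k_keep most important ones.
k_keep = 32
importance = np.abs(attn_out).mean(axis=0)  # (d_model,)
top_idx = np.sort(np.argsort(importance)[-k_keep:])
pruned = attn_out[:, top_idx]               # (seq_len, k_keep)

print(pruned.shape)  # (30, 32)
```

The fused projection changes nothing mathematically; it only restructures three matmuls as one, which is where the memory-access saving comes from.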
A Lightweight Transformer Edge Intelligence Model for RUL Prediction Classification.
| Journal: | Sensors | Impact factor: | 3.500 |
| Year: | 2025 | Issue/pages: | 2025 Jul 6; 25(13):4224 |
| DOI: | 10.3390/s25134224 | | |
