SBCS-Net: Sparse Bayesian and Deep Learning Framework for Compressed Sensing in Sensor Networks


Abstract

Compressed sensing (CS) is widely used in modern resource-constrained sensor networks, but achieving high-quality, robust signal reconstruction under low sampling rates and noise interference remains challenging. Traditional CS methods offer limited performance, and although the many deep learning-based CS models proposed in response show strong fitting capability, they often cannot handle the complex noise present in sensor networks, which undermines the stability of their performance. To address these challenges, this paper proposes SBCS-Net, a framework that unrolls the iterative process of sparse Bayesian compressed sensing into convolutional neural network and Transformer modules. The core of SBCS-Net is end-to-end learning of key sparse Bayesian learning (SBL) parameters, which adaptively improves signal sparsity and models measurement noise probabilistically, while fully exploiting the feature extraction and global context modeling capabilities of the deep learning modules. To evaluate its performance comprehensively, we conduct systematic experiments on multiple public benchmark datasets, including comparisons with advanced and traditional CS methods, noise robustness tests, ablation studies of key components, computational complexity analysis, and rigorous statistical significance tests. The results consistently show that SBCS-Net outperforms many mainstream methods in both reconstruction accuracy and visual quality, and that it is especially robust under challenging conditions such as extremely low sampling rates and strong noise. SBCS-Net therefore provides an effective solution for high-fidelity, robust signal recovery in sensor networks and related fields.
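The abstract does not spell out the exact unrolled updates SBCS-Net learns, but the iteration it builds on is the classical SBL (EM) loop for the measurement model y = Φx + n. The following is a minimal NumPy sketch of that baseline iteration for reference; the function name `sbl_em`, the fixed noise precision `beta`, and the iteration count are illustrative assumptions, not details from the paper. The per-coefficient precisions `alpha` are the kind of hyperparameters an unrolled network would learn to update.

```python
import numpy as np

def sbl_em(Phi, y, beta=1e2, n_iter=50):
    """Classical sparse Bayesian learning via EM for y = Phi @ x + noise.

    Hand-rolled reference iteration (not the paper's network): alternates
    a posterior update over x with an EM update of the per-coefficient
    precisions alpha. beta is the (assumed fixed) noise precision.
    """
    m, n = Phi.shape
    alpha = np.ones(n)  # per-coefficient prior precision (1/variance)
    for _ in range(n_iter):
        # Posterior covariance and mean of x given current hyperparameters
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ y
        # EM update: alpha_i = 1 / (mu_i^2 + Sigma_ii); large alpha
        # drives a coefficient to zero, which is the sparsity mechanism
        alpha = 1.0 / (mu**2 + np.diag(Sigma))
        alpha = np.minimum(alpha, 1e12)  # cap to avoid numerical overflow
    return mu

# Toy example: recover a 3-sparse signal from 40 noisy random measurements
rng = np.random.default_rng(0)
n, m = 100, 40
x_true = np.zeros(n)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
y = Phi @ x_true + 0.01 * rng.standard_normal(m)
x_hat = sbl_em(Phi, y)
```

An unrolled network in the spirit of SBCS-Net would replace the fixed EM update of `alpha` with a learned, data-driven update (e.g. CNN/Transformer layers), trained end to end on reconstruction quality.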
