CHARMS: A CNN-Transformer Hybrid with Attention Regularization for MRI Super-Resolution

Abstract

Magnetic resonance imaging (MRI) super-resolution (SR) enables high-resolution reconstruction from low-resolution acquisitions, reducing scan time and easing hardware demands. However, most deep learning-based SR models are large and computationally heavy, limiting deployment in clinical workstations, real-time pipelines, and resource-restricted platforms such as low-field and portable MRI. We introduce CHARMS, a lightweight convolutional-Transformer hybrid with attention regularization optimized for MRI SR. CHARMS employs a Reverse Residual Attention Fusion backbone for hierarchical local feature extraction, Pixel-Channel and Enhanced Spatial Attention for fine-grained feature calibration, and a Multi-Depthwise Dilated Transformer Attention block for efficient long-range dependency modeling. A novel attention regularization suppresses redundant activations, stabilizes training, and enhances generalization across contrasts and field strengths. Across IXI, Human Connectome Project Young Adult, and paired 3T/7T datasets, CHARMS (~1.9M parameters; ~30 GFLOPs for a 256 × 256 input) surpasses leading lightweight and hybrid baselines (EDSR, PAN, W2AMSN-S, and FMEN) by 0.1-0.6 dB PSNR and up to 1% SSIM at ×2/×4 upscaling, while reducing inference time by ~40%. Cross-field fine-tuning yields 7T-like reconstructions from 3T inputs, with gains of ~6 dB PSNR and 0.12 SSIM over native 3T. With near-real-time performance (~11 ms/slice, ~1.6-1.9 s per 3D volume on an RTX 4090), CHARMS offers a compelling fidelity-efficiency balance for clinical workflows, accelerated protocols, and portable MRI.
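
The abstract does not give the regularizer's exact form. As a rough illustration only, the sketch below implements one plausible reading: an L1 sparsity penalty on the network's attention maps, added to a pixel-wise reconstruction loss to suppress redundant activations. The function name, the choice of L1 penalty, and the weight `lambda_attn` are hypothetical, not the authors' published formulation.

```python
# Minimal sketch (assumption): attention regularization as an L1 sparsity
# penalty on attention maps, combined with an L1 reconstruction loss.
# All names and the 1e-4 weight are illustrative, not taken from the paper.
import torch
import torch.nn.functional as F

def sr_loss_with_attention_reg(sr, hr, attention_maps, lambda_attn=1e-4):
    """Reconstruction loss plus a penalty that discourages redundant
    (uniformly high) attention activations.

    sr, hr         : (B, 1, H, W) super-resolved and ground-truth slices
    attention_maps : attention tensors collected from the network's blocks
    lambda_attn    : regularization weight (hypothetical value)
    """
    recon = F.l1_loss(sr, hr)  # pixel-wise fidelity term
    # Mean absolute activation over all attention maps: small values mean
    # sparse, selective attention; large values mean redundant responses.
    reg = sum(a.abs().mean() for a in attention_maps) / len(attention_maps)
    return recon + lambda_attn * reg

# Usage with dummy tensors at the 256 x 256 slice size quoted above:
sr = torch.rand(4, 1, 256, 256)
hr = torch.rand(4, 1, 256, 256)
attn = [torch.rand(4, 64, 128, 128), torch.rand(4, 64, 64, 64)]
loss = sr_loss_with_attention_reg(sr, hr, attn)
```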
