CMAP-Fusion: A cross-modal feature selection and model pruning framework for laboratory and imaging data

Abstract

Cross-modal fusion of medical imaging and laboratory data is a key pathway to accurate disease diagnosis, yet it is constrained by the modality heterogeneity gap, accumulated feature redundancy, and an imbalance between accuracy and efficiency. Existing methods struggle to balance precision with clinical adaptability, and some rely on simulated data, which limits their generalization. To address these challenges, we propose the Cross-Modal Alignment-Pruning Fusion model (CMAP-Fusion), which optimizes through the modular pipeline "encoding alignment → redundancy pruning → fusion prediction": ViT-B/16 extracts imaging features and aligns their dimensions, a SmartTrim dynamic pruning module screens key features to reduce redundancy, and a Cross-Modal Transformer (CMT) mines deep associations between the two modalities. Experiments on the COVID-19 Radiography, ISIC Skin Cancer, and ChestX-ray14 datasets show accuracies of 95.3%, 89.7%, and 93.6% respectively, an improvement of 3.1% to 4.1% over the best baselines. Meanwhile, the parameter count is reduced by 44.2%, computational complexity by more than 43%, and cross-modal similarity and feature sparsity are significantly better than those of the baselines. The model thus achieves joint optimization of precision, efficiency, and generalization, providing an efficient solution for medical cross-modal fusion. In future work, we will extend to multi-source modalities and multi-disease scenarios, strengthen multi-center clinical validation, further improve the model's interpretability and clinical acceptance, and facilitate lightweight deployment of medical AI.
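To make the three-stage pipeline concrete, below is a minimal numpy sketch of its data flow: project both modalities into a shared embedding space, prune low-importance imaging tokens (a SmartTrim-style idea, scored here by token norm rather than the learned scores the paper uses), then let laboratory features cross-attend over the kept imaging tokens. All dimensions, scores, and the classification head are illustrative assumptions, not the paper's implementation (ViT-B/16 actually emits 768-dimensional tokens).

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_topk(tokens, scores, keep_ratio=0.5):
    """Keep the highest-scoring fraction of feature tokens (SmartTrim-style pruning)."""
    k = max(1, int(len(tokens) * keep_ratio))
    idx = np.sort(np.argsort(scores)[-k:])  # preserve original token order
    return tokens[idx]

def cross_attention(q_tokens, kv_tokens):
    """Single-head scaled dot-product cross-attention (core op of a cross-modal Transformer)."""
    d = q_tokens.shape[-1]
    logits = q_tokens @ kv_tokens.T / np.sqrt(d)
    logits -= logits.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(logits)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ kv_tokens

D = 64  # shared embedding dimension (shrunk for the demo)

# Stage 1 (encoding alignment): stand-ins for encoder outputs projected to D dims —
# 16 imaging tokens (ViT patches) and 8 laboratory-feature tokens.
img_tokens = rng.normal(size=(16, 128)) @ (rng.normal(size=(128, D)) * 0.1)
lab_tokens = rng.normal(size=(8, 20)) @ (rng.normal(size=(20, D)) * 0.1)

# Stage 2 (redundancy pruning): drop half the imaging tokens by importance score.
scores = np.linalg.norm(img_tokens, axis=-1)
img_kept = prune_topk(img_tokens, scores, keep_ratio=0.5)

# Stage 3 (fusion prediction): lab tokens attend over pruned imaging tokens,
# then mean-pool and classify with a hypothetical 2-class linear head.
fused = cross_attention(lab_tokens, img_kept)
pooled = fused.mean(axis=0)
class_logits = pooled @ (rng.normal(size=(D, 2)) * 0.1)

print(img_tokens.shape, img_kept.shape, fused.shape, class_logits.shape)
```

Note how pruning happens before fusion: the cross-attention cost scales with the number of imaging tokens, so discarding redundant tokens first is what yields the parameter and FLOP savings the abstract reports.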
