Multi-modal low-dose medical imaging through instruction-guided unified AI


Abstract

BACKGROUND: Ionizing radiation from PET/CT warrants dose reduction, but lowering the dose can degrade image quality and compromise diagnosis. Many machine-learning restoration approaches exist; most, however, are built for a single task and are difficult to deploy across multi-modal workflows. We sought to develop and evaluate a unified model that handles common restoration tasks across modalities.

METHODS: We developed the Multi-modal Instruction-guided Restoration Architecture (MIRA-Net), a U-Net-based framework with an adaptive guidance module. The module estimates modality and degradation indicators from the input and produces a low-dimensional instruction that modulates feature processing throughout the network, selecting task-appropriate pathways within a single model. Performance was assessed on CT denoising, PET synthesis, and MRI super-resolution, and a double-blind reader study was conducted with board-certified radiologists.

RESULTS: Trained on individual tasks, MIRA-Net matched or exceeded strong task-specific baselines. Trained as a single unified model across CT, PET, and MRI, it maintained comparable performance, with no meaningful drop relative to single-task training. Validation on a local clinical dataset showed robust generalization with consistent performance metrics. In the reader study, MIRA-Net outputs were more often judged diagnostic and received higher scores for anatomical clarity, lesion conspicuity, and noise control.

CONCLUSION: MIRA-Net provides a high-fidelity solution for multi-modal medical image restoration. Its instruction-guided architecture mitigates task interference, demonstrating an effective pathway to reducing radiation exposure without sacrificing diagnostic quality.
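The guidance mechanism described in the abstract, a low-dimensional instruction vector that modulates feature processing, resembles FiLM-style feature-wise conditioning. The sketch below is illustrative only: MIRA-Net's actual implementation is not given here, and all function names, weight matrices, and shapes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_instruction(image, n_dims=8):
    """Hypothetical guidance module: map simple degradation statistics
    of the input (mean, std) to a low-dimensional instruction vector."""
    stats = np.array([image.mean(), image.std()])
    W = rng.standard_normal((n_dims, stats.size)) * 0.1
    return W @ stats  # shape (n_dims,)

def modulate(features, instruction, W_gamma, W_beta):
    """FiLM-style modulation: per-channel scale and shift derived from
    the instruction. A zero instruction leaves the features unchanged."""
    gamma = 1.0 + W_gamma @ instruction  # per-channel scale, near identity
    beta = W_beta @ instruction          # per-channel shift
    return features * gamma[:, None, None] + beta[:, None, None]

# Toy usage: a 4-channel feature map conditioned on a 32x32 "low-dose" input.
image = rng.standard_normal((32, 32))
features = rng.standard_normal((4, 16, 16))
instr = make_instruction(image)
W_gamma = rng.standard_normal((4, 8)) * 0.01
W_beta = rng.standard_normal((4, 8)) * 0.01
out = modulate(features, instr, W_gamma, W_beta)
print(out.shape)  # (4, 16, 16)
```

Because the same modulation weights are shared across tasks while the instruction varies with the input, one network can route CT, PET, and MRI inputs through task-appropriate behavior, which is the interference-mitigation idea the abstract attributes to the architecture.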
