Multiscale attention generative adversarial networks for lesion synthesis in chest X-ray images


Abstract

Recent advancements in deep learning have led to significant improvements in pneumoconiosis diagnosis from chest X-rays (CXRs). However, these models typically require large training datasets, which are challenging to collect due to the rarity of the disease and strict data-sharing limitations. In addition, the process of annotating medical images is labor-intensive and requires highly skilled radiologists, which further limits data availability. To address this data scarcity, we propose a novel approach to generating synthetic pathology in CXR images, thus augmenting existing datasets and improving model training. While bidirectional generative adversarial networks (GANs), such as CycleGAN, can perform image translation between domains without paired samples, these methods often struggle to maintain structural and pathological consistency, especially in fine details. This study introduces a multiscale attention-based GAN (MSA-GAN) that enhances CycleGAN with a multiscale attention generator, local-global discriminators, and a structural similarity (SSIM) loss, ensuring greater fidelity in preserving structural and pathological details during translation. We use MSA-GAN-generated synthetic CXR images to train CNN models for pneumoconiosis classification and lung segmentation. Experimental results indicate that CNNs trained on MSA-GAN-generated images outperform existing CNN-based methods, showing improved accuracy and consistency in both qualitative and quantitative assessments across classification and segmentation tasks.
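The abstract combines CycleGAN's cycle-consistency objective with an SSIM term that penalises structural divergence between an input image and its reconstruction. The sketch below illustrates that combination in plain NumPy; it is not the authors' implementation. The single-window SSIM (no sliding Gaussian window), the weights `lam_cyc` and `lam_ssim`, and the function names are all illustrative assumptions.

```python
import numpy as np

def ssim_global(x, y, data_range=1.0):
    """Simplified SSIM computed over the whole image as one window.
    (The standard formulation averages SSIM over local Gaussian windows.)"""
    c1 = (0.01 * data_range) ** 2  # stabilising constants from the SSIM paper
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

def cycle_ssim_loss(x, x_reconstructed, lam_cyc=10.0, lam_ssim=1.0):
    """Cycle-consistency (L1) plus an SSIM penalty on the reconstruction
    x -> G(F(x)). Weights are hypothetical, not taken from the paper."""
    l1 = np.abs(x - x_reconstructed).mean()          # CycleGAN cycle loss
    ssim_term = 1.0 - ssim_global(x, x_reconstructed)  # 0 when structure is preserved
    return lam_cyc * l1 + lam_ssim * ssim_term
```

A perfect reconstruction gives SSIM of 1 and a combined loss of 0, so the SSIM term only activates when translation distorts the image's structural statistics — the failure mode the abstract attributes to vanilla CycleGAN.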
