FFTMed: leveraging the fast Fourier transform for a lightweight and adversarial-resilient medical image segmentation framework


Abstract

Accurate and reliable medical image segmentation is essential for computer-aided diagnosis and formulating appropriate treatment plans. However, noise often significantly reduces diagnostic accuracy and complicates treatment planning. Therefore, noise reduction in medical imaging is paramount, as it not only improves diagnostic accuracy but also contributes to enhanced treatment efficacy and minimizes patient risk. Prior methods have explored frequency-domain approaches to accelerate convolutional operations or to combine frequency-based features with spatial convolutions. However, most only partially integrate Fourier-based processing and thus fail to fully exploit its advantages. We propose a novel neural architecture, FFTMed, that operates directly in the frequency domain, harnessing its resilience to noise and uneven brightness while also reducing computational overhead. Notably, FFTMed requires no additional noise augmentation during training yet remains resilient when confronted with noisy test images, demonstrating its effectiveness in real-world medical image segmentation tasks. Additionally, we propose a new benchmark incorporating various levels of noise to assess susceptibility to noise attacks. The experimental results demonstrate that FFTMed not only effectively eliminates noise and consistently achieves accurate image segmentation but also shows robust resistance to imperceptible adversarial attacks compared to other baseline models. The datasets generated and analysed during this study have been deposited in the Zenodo repository and are openly accessible at https://zenodo.org/records/15310397. The source code to reproduce all experiments is publicly available at https://github.com/HySonLab/LightMed .
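The abstract's core intuition, that operating in the frequency domain confers resilience to additive noise, can be illustrated with a minimal sketch. The snippet below is not FFTMed's actual architecture (the paper's layers are not described here); it only demonstrates the underlying principle: white noise spreads its energy across all frequencies, so keeping only a low-frequency band of the 2D FFT discards most of the noise while preserving a smooth underlying signal. The function name `fft_lowpass` and the `keep_fraction` parameter are illustrative choices, not names from the paper.

```python
import numpy as np

def fft_lowpass(image, keep_fraction=0.25):
    """Zero all FFT coefficients outside a centered low-frequency square.

    A toy stand-in for frequency-domain denoising: additive white noise
    is spread uniformly over the spectrum, so discarding high frequencies
    removes most noise energy while keeping a smooth signal largely intact.
    """
    # Shift the zero-frequency component to the center of the spectrum.
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    cy, cx = h // 2, w // 2
    kh, kw = int(h * keep_fraction), int(w * keep_fraction)
    # Boolean mask selecting only the central (low-frequency) band.
    mask = np.zeros_like(spectrum, dtype=bool)
    mask[cy - kh:cy + kh, cx - kw:cx + kw] = True
    # Invert the transform; the result of ifft2 on a real signal has
    # negligible imaginary residue, which we drop.
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

# Demo: a smooth horizontal gradient corrupted by Gaussian noise.
rng = np.random.default_rng(0)
clean = np.linspace(0.0, 1.0, 64)[None, :] * np.ones((64, 1))
noisy = clean + 0.3 * rng.standard_normal((64, 64))
denoised = fft_lowpass(noisy)

err_noisy = np.mean((noisy - clean) ** 2)
err_denoised = np.mean((denoised - clean) ** 2)
```

Because the gradient image is almost entirely low-frequency, `err_denoised` comes out well below `err_noisy`: the low-pass mask keeps the signal but removes roughly the fraction of noise energy proportional to the discarded spectrum area. A learned frequency-domain model can be seen as replacing this fixed mask with trainable spectral weights.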
