This multimodal magnetic resonance imaging (MRI) fusion study aims to automate the staging of early endometrial cancer (EC) using deep learning and to compare the diagnostic performance of deep learning with that of radiologists. The study retrospectively investigated 122 patients with pathologically confirmed early EC from January 1, 2025 to December 31, 2021. Of these patients, 68 were in International Federation of Gynecology and Obstetrics (FIGO) stage IA and 54 were in FIGO stage IB. Based on the Swin Transformer model and its shifted-window multi-head self-attention (SW-MSA) module, MRI images in each of the three planes (sagittal, coronal, and transverse) were cropped, enhanced, and classified, and fusion experiments across the three planes were performed. When a single plane was used, the accuracy of IA versus IB classification was 0.988 in the sagittal plane, 0.96 in the coronal plane, and 0.94 in the transverse plane; after fusion of the three planes, classification accuracy reached 1. Overall, the automatic classification method based on the Swin Transformer achieved an accuracy of 1, a recall of 1, and a specificity of 1 for early EC classification. In this study, the multimodal fusion approach accurately classified early EC; its performance was comparable to that of a radiologist, and it was simpler and more precise than previous methods that required segmentation followed by staging.
Multimodal MRI Image Fusion for Early Automatic Staging of Endometrial Cancer.
Authors: Zheng Ziyu, Liu Ye, Feng Longxiang, Liu Peizhong, Song Haisheng, Wang Lin, Huang Fang
| Journal: | Sensors | Impact factor: | 3.500 |
| Year: | 2025 | Citation: | 2025 May 6; 25(9):2932 |
| DOI: | 10.3390/s25092932 | | |
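
As a rough illustration of the approach summarized in the abstract, the following is a minimal sketch of a three-plane Swin Transformer fusion classifier in PyTorch, assuming the timm library for the backbone. The class name, late-fusion head, and input shapes are illustrative assumptions, not the authors' published implementation.

```python
# Hypothetical sketch: three-plane Swin-based fusion classifier for FIGO IA vs. IB.
# A shared Swin Transformer backbone (from timm) extracts a feature vector per
# plane; the sagittal, coronal, and transverse features are concatenated and
# passed to a small classification head. Names and shapes are illustrative only.
import torch
import torch.nn as nn
import timm


class ThreePlaneSwinFusion(nn.Module):
    def __init__(self, backbone_name: str = "swin_tiny_patch4_window7_224",
                 num_classes: int = 2):
        super().__init__()
        # num_classes=0 makes timm return pooled features instead of logits.
        self.backbone = timm.create_model(backbone_name, pretrained=False,
                                          num_classes=0)
        feat_dim = self.backbone.num_features
        # Late-fusion head over the concatenated per-plane features.
        self.head = nn.Sequential(
            nn.Linear(3 * feat_dim, 256),
            nn.ReLU(inplace=True),
            nn.Dropout(0.3),
            nn.Linear(256, num_classes),
        )

    def forward(self, sagittal, coronal, transverse):
        # Each input: (batch, 3, 224, 224) cropped and normalized MRI slices.
        feats = [self.backbone(x) for x in (sagittal, coronal, transverse)]
        fused = torch.cat(feats, dim=1)
        return self.head(fused)


if __name__ == "__main__":
    model = ThreePlaneSwinFusion()
    dummy = torch.randn(2, 3, 224, 224)
    logits = model(dummy, dummy, dummy)  # (2, 2): scores for IA vs. IB
    print(logits.shape)
```

Late fusion of per-plane features is only one plausible reading of the abstract's "fusion of the three planes"; the authors' actual fusion strategy may differ.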
