Cascaded atrous convolution and spatial pyramid pooling for more accurate tumor target segmentation for rectal cancer radiotherapy


Abstract

Convolutional neural networks (CNNs) have become the state-of-the-art method for medical image segmentation. However, repeated pooling and striding operations reduce feature resolution, causing loss of detailed information. Additionally, tumor sizes vary widely across patients, so small tumors may be ignored while large tumors may exceed the receptive fields of the convolutions. The purpose of this study is to further improve segmentation accuracy using a novel CNN (named CAC-SPP) with cascaded atrous convolution (CAC) and a spatial pyramid pooling (SPP) module. This work is the first attempt at applying SPP to segmentation in radiotherapy. The network is built on ResNet-101, gaining accuracy from its greatly increased depth. We added CAC to extract a high-resolution feature map while maintaining large receptive fields, and adopted a parallel SPP module with different atrous rates to capture multi-scale features. The performance was compared with the widely adopted U-Net and ResNet-101 on independent rectal-tumor segmentation for two image sets: (1) 70 T2-weighted MR images and (2) 100 planning CT images. The results show that the proposed CAC-SPP outperformed U-Net and ResNet-101 on both image sets. The Dice similarity coefficient values of CAC-SPP were 0.78 ± 0.08 and 0.85 ± 0.03, respectively, higher than those of U-Net (0.70 ± 0.11 and 0.82 ± 0.04) and ResNet-101 (0.76 ± 0.10 and 0.84 ± 0.03). The segmentation speed of CAC-SPP was comparable with ResNet-101 and about 36% faster than U-Net. In conclusion, the proposed CAC-SPP, which extracts high-resolution features with large receptive fields and captures multi-scale context, improves segmentation accuracy for rectal tumors.
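The two building blocks the abstract relies on can be sketched in a few lines of NumPy: an atrous (dilated) convolution, whose dilation rate enlarges the receptive field without adding parameters, and the Dice similarity coefficient used to score the segmentations. This is an illustrative 1-D sketch under my own function names, not the authors' implementation:

```python
import numpy as np

def dilated_conv1d(x, w, rate):
    """1-D atrous (dilated) convolution with 'valid' padding.
    A kernel of size k at dilation rate r covers a receptive field
    of k + (k - 1) * (r - 1) samples using the same k weights."""
    k = len(w)
    span = (k - 1) * rate  # distance spanned by the dilated kernel
    return np.array([
        sum(w[j] * x[i + j * rate] for j in range(k))
        for i in range(len(x) - span)
    ])

def dice(pred, target, eps=1e-7):
    """Dice similarity coefficient between two binary masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    inter = np.logical_and(pred, target).sum()
    return 2.0 * inter / (pred.sum() + target.sum() + eps)

# An SPP-style module, in this picture, runs several dilated branches
# in parallel (e.g. rates 1, 2, 4) over the same feature map and
# fuses them, so each branch sees context at a different scale.
x = np.arange(10, dtype=float)
w = np.array([1.0, 1.0, 1.0])
branches = [dilated_conv1d(x, w, r) for r in (1, 2, 4)]
```

At rate 2 the 3-tap kernel above sums `x[i] + x[i+2] + x[i+4]`, i.e. a 5-sample receptive field from 3 weights; the Dice function returns 0.5 for masks `[1,1,0,0]` and `[1,0,0,1]` (one overlapping voxel out of four foreground voxels).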
