Active Learning for Efficient Segmentation of Liver with Convolutional Neural Network-Corrected Labeling in Magnetic Resonance Imaging-Derived Proton Density Fat Fraction


Abstract

This study aimed to propose an efficient method for automated segmentation of the liver on magnetic resonance imaging-derived proton density fat fraction (MRI-PDFF) through deep active learning. We developed an active learning framework for liver segmentation using labeled and unlabeled MRI-PDFF data. A total of 77 liver samples on MRI-PDFF were obtained from patients with nonalcoholic fatty liver disease. For training, tuning, and testing of the liver segmentation, the ground truth of 71 internal (training) and 6 external (testing) MRI-PDFF scans was verified by an expert reviewer. For 100 randomly selected slices, manual and deep learning (DL) segmentations were visually classified on a scale ranging from very accurate to mostly accurate. The Dice similarity coefficients for each step were 0.69 ± 0.21, 0.85 ± 0.12, and 0.94 ± 0.01, respectively (paired t-test: p = 0.1389 between the first and second steps; p = 0.0144 between the first and third steps), indicating that active learning provides superior performance compared with non-active learning. The biases in the Bland-Altman plots for each step were -24.22% (-82.76 to -2.70), -21.29% (-59.52 to 3.06), and -0.67% (-10.43 to 4.06). Additionally, applying active learning reduced the required annotation time fivefold (2 min with, and 13 min without, active learning in the first step). The number of very accurate slices for DL segmentation (46 slices) was greater than that for manual segmentation (6 slices). Deep active learning enables efficient liver segmentation on limited MRI-PDFF data.
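The Dice similarity coefficient reported above measures the overlap between a predicted segmentation and the ground truth. A minimal sketch of how it is computed for binary masks is shown below; the function name and the flat-list mask representation are illustrative assumptions, not the authors' implementation:

```python
def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks.

    mask_a, mask_b: equal-length sequences of 0/1 voxel labels
    (a flattened segmentation mask). Returns 2|A∩B| / (|A| + |B|).
    """
    if len(mask_a) != len(mask_b):
        raise ValueError("masks must have the same number of voxels")
    intersection = sum(a & b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    # Convention: two empty masks are treated as a perfect match.
    return 2.0 * intersection / total if total else 1.0


# Illustrative example: half of the labeled voxels overlap.
pred = [1, 1, 0, 0]
truth = [1, 0, 1, 0]
print(dice_coefficient(pred, truth))  # → 0.5
```

A value of 1.0 indicates perfect overlap and 0.0 no overlap, so the rise from 0.69 to 0.94 across the active learning steps reflects segmentations converging toward the expert-verified ground truth.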
