Training high-performance deep learning classifier for diagnosis in oral cytology using diverse annotations


Abstract

When deep learning models are applied to medical images, uncertainty in the true labels caused by variability across professionals hinders diagnosis. We used deep learning to obtain an optimal convolutional neural network (CNN) for oral exfoliative cytology by annotating the data appropriately with labels from multiple oral pathologists. Six whole-slide images were segmented into tiles using QuPath. The tiles were labeled by three oral pathologists, yielding 14,535 images with the corresponding annotations. Tiles for which all three pathologists gave the same diagnosis were labeled as ground truth (GT) and used for testing. We investigated six models trained using the annotations of (1) pathologist A, (2) pathologist B, (3) pathologist C, (4) the GT, (5) majority voting, and (6) a probabilistic model. We split the data by per-slide cross-validation and examined the classification performance of the CNN with a ResNet50 baseline. Statistical evaluation was performed repeatedly and independently, using each slide as test data 10 times. For the area under the curve, the probabilistic model achieved the highest values in three cases (0.861, 0.955, and 0.991). For accuracy, it achieved the highest values in two cases (0.988 and 0.967). For the models trained with individual pathologists' annotations and the GT, many slides showed very low accuracy and large variation across tests. Hence, the classifier trained with probabilistic labels provided the optimal CNN for oral exfoliative cytology that accounts for diagnoses from multiple pathologists. These results may lead to trusted medical artificial intelligence solutions that reflect the diverse diagnoses of various professionals.
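The three label-aggregation strategies compared in the abstract can be sketched in a few lines. Below is a minimal, hypothetical example (the annotation matrix and class count are illustrative, not the paper's data) showing how unanimous-agreement GT labels, majority-vote hard labels, and probabilistic (soft) labels might be derived from three pathologists' annotations; the soft labels could then serve as targets for cross-entropy training in place of one-hot labels.

```python
import numpy as np

# Hypothetical annotations from three pathologists (A, B, C) for 5 tiles.
# Classes: 0 = negative, 1 = positive. Values are illustrative only.
annotations = np.array([
    [0, 0, 0],
    [1, 1, 1],
    [0, 1, 1],
    [1, 0, 1],
    [0, 0, 1],
])
n_classes = 2

# (4) GT subset: tiles on which all three pathologists agree.
unanimous = (annotations == annotations[:, [0]]).all(axis=1)

# (5) Majority-vote hard labels: most frequent class per tile.
majority = np.array(
    [np.bincount(row, minlength=n_classes).argmax() for row in annotations]
)

# (6) Probabilistic (soft) labels: per-class vote fractions per tile,
# usable as soft targets for cross-entropy training.
soft = np.stack(
    [np.bincount(row, minlength=n_classes) / annotations.shape[1]
     for row in annotations]
)
```

Here `unanimous` marks tiles eligible for the GT test set, `majority` collapses disagreements to a single label, and `soft` preserves the disagreement as a probability distribution over classes.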
