Deep learning based on co-registered ultrasound and photoacoustic imaging improves the assessment of rectal cancer treatment response


Abstract

Identifying complete response (CR) after preoperative treatment of rectal cancer is critical to deciding subsequent management. Imaging techniques, including endorectal ultrasound and MRI, have been investigated but have low negative predictive values. Because photoacoustic microscopy can image post-treatment vascular normalization, we hypothesize that co-registered ultrasound and photoacoustic imaging will better identify complete responders. In this study, we used in vivo data from 21 patients to develop a robust deep learning model (US-PAM DenseNet) based on co-registered dual-modality ultrasound (US) and photoacoustic microscopy (PAM) images and individualized normal reference images. We tested the model's accuracy in differentiating malignant from non-cancer tissue. Compared to models based on US alone (classification accuracy 82.9 ± 1.3%, AUC 0.917 (95% CI: 0.897–0.937)), the addition of PAM and normal reference images improved model performance significantly (accuracy 92.4 ± 0.6%, AUC 0.968 (95% CI: 0.960–0.976)) without increasing model complexity. Additionally, while US-only models could not reliably differentiate images of cancer from those of normalized tissue with complete treatment response, US-PAM DenseNet made accurate predictions from these images. For use in clinical settings, US-PAM DenseNet was extended to classify entire US-PAM B-scans through sequential ROI classification. Finally, to help focus surgical evaluation in real time, we computed attention heat maps from the model predictions to highlight suspicious cancer regions. We conclude that US-PAM DenseNet could improve the clinical care of rectal cancer patients by identifying complete responders with higher accuracy than current imaging techniques.
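The abstract describes extending an ROI-level classifier to whole B-scans by classifying ROIs sequentially and aggregating the per-ROI scores into an attention heat map. The sketch below illustrates that general strategy only; the ROI size, stride, averaging scheme, and the `classify_roi` stub are illustrative assumptions, not the authors' implementation (the actual model is a trained DenseNet).

```python
def classify_roi(roi):
    # Stand-in for the trained US-PAM DenseNet: returns a cancer
    # probability for one ROI. Here, a dummy score (mean pixel
    # intensity) is used so the sketch is self-contained.
    flat = [p for row in roi for p in row]
    return sum(flat) / len(flat)

def scan_heatmap(bscan, roi_h, roi_w, stride):
    """Slide a fixed-size ROI over a 2D B-scan (list of lists),
    score each ROI, and average overlapping scores into a
    per-pixel attention heat map."""
    h, w = len(bscan), len(bscan[0])
    heat = [[0.0] * w for _ in range(h)]
    counts = [[0] * w for _ in range(h)]
    for top in range(0, h - roi_h + 1, stride):
        for left in range(0, w - roi_w + 1, stride):
            roi = [row[left:left + roi_w] for row in bscan[top:top + roi_h]]
            score = classify_roi(roi)
            # Spread the ROI score over every pixel the ROI covers.
            for i in range(top, top + roi_h):
                for j in range(left, left + roi_w):
                    heat[i][j] += score
                    counts[i][j] += 1
    # Average where ROIs overlapped; uncovered pixels stay 0.
    return [[heat[i][j] / counts[i][j] if counts[i][j] else 0.0
             for j in range(w)] for i in range(h)]
```

In practice the dummy scorer would be replaced by a forward pass of the trained network, and high heat-map values would flag regions warranting closer surgical evaluation.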
