Generate Structured Radiology Report from CT Images Using Image Annotation Techniques: Preliminary Results with Liver CT

Abstract

A medical annotation system for radiology images extracts clinically useful information from the images, allowing machines to infer abstract semantics, perform automatic reasoning, and support diagnostic decision-making. It also supplies human-interpretable explanations for the images. We have implemented a computerized framework that, given a liver CT image, predicts radiological annotations with high accuracy in order to generate a structured report, including very specific high-level semantic content. Each report of a liver CT image covers several inhomogeneous parts, such as the liver, lesions, and vessels. We argue that pooling all available features is not suitable for filling every part of the report; rather, for each group of annotations one should identify and extract the features that yield the best answers for that specific annotation. The main challenge is therefore discovering the relationships between these specific semantic concepts and their association with low-level image features. Our framework combines a set of state-of-the-art low-level imaging features. In addition, we propose a novel feature, DLBP (deep local binary pattern), based on LBP, which incorporates multi-slice analysis of CT images and further improves performance. To model the annotation system, two methods were used: a multi-class support vector machine (SVM) and random subspace (RS), an ensemble learning method. This representation achieves a high prediction accuracy of 93.1% despite its relatively low dimension compared with existing work.
