Dual-dictionary learning-based iterative image reconstruction for spectral computed tomography application


Abstract

In this study, we investigated the effectiveness of a novel iterative reconstruction (IR) method coupled with dual-dictionary learning (DDL) for image reconstruction in a dedicated breast computed tomography (CT) system based on a cadmium-zinc-telluride (CZT) photon-counting detector, and compared it to the filtered back-projection (FBP) method with the ultimate goal of reducing the number of projections necessary for reconstruction without sacrificing image quality. Postmortem breast samples were scanned in a fan-beam CT system and were reconstructed from 100 to 600 projections with both the IR and FBP methods. The contrast-to-noise ratio (CNR) between the glandular and adipose tissues of the postmortem breast samples was calculated to compare the quality of images reconstructed with IR and FBP. The spatial resolution of the two reconstruction techniques was evaluated using aluminum (Al) wires with diameters of 643, 813, 1020, 1290 and 1630 µm in a plastic epoxy resin phantom with a diameter of 13 cm. Both the spatial resolution and the CNR were improved with IR compared to FBP for images reconstructed from the same number of projections. In comparison with FBP reconstruction, the CNR was improved from 3.4 to 7.5 by using the IR method with six-fold fewer projections while maintaining the same spatial resolution. The study demonstrated that the IR method coupled with DDL could significantly reduce the required number of projections for a CT reconstruction compared to the FBP method while achieving a much better CNR and maintaining the same spatial resolution. From this, the radiation dose and scanning time can potentially be reduced by a factor of approximately 6 by using this IR method for image reconstruction in a CZT-based breast CT system.
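The abstract reports CNR values between glandular and adipose tissue but does not state the exact formula used. A minimal sketch of one common CNR definition (absolute mean difference between two regions of interest divided by their pooled standard deviation) is shown below; the function name and ROI inputs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cnr(glandular_roi, adipose_roi):
    """Contrast-to-noise ratio between two regions of interest (ROIs).

    Assumed definition (the paper's exact formula is not given in the
    abstract): |mean(ROI1) - mean(ROI2)| / pooled standard deviation.
    """
    mu_g = np.mean(glandular_roi)
    mu_a = np.mean(adipose_roi)
    # Pooled noise estimate from the sample variances of the two ROIs.
    noise = np.sqrt((np.var(glandular_roi, ddof=1)
                     + np.var(adipose_roi, ddof=1)) / 2.0)
    return abs(mu_g - mu_a) / noise
```

Under this definition, a higher CNR at the same projection count indicates that tissue contrast is better separated from image noise, which is how the reported improvement from 3.4 (FBP) to 7.5 (IR) would be interpreted.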
