Scatter correction for self-collimating SPECT using a 3D U-Net framework


Abstract

Conventional single-photon emission computed tomography (SPECT) relies on mechanical collimators, which impose an inherent trade-off between spatial resolution and sensitivity. A novel cardiac SPECT system employing a self-collimating design with interleaved mosaic scintillators has been proposed, which markedly enhances sensitivity without compromising resolution. However, the unique self-collimating and closely arranged detector geometry also introduces a more complex scatter distribution and increased scatter fractions, making accurate scatter correction essential yet technically challenging. We employed a 3D U-Net framework to directly predict scatter-corrected images from uncorrected images. The network was trained on 36 distinct XCAT phantoms using GATE simulations, with the true scatter-corrected images (true-SC), obtained precisely from the simulations, serving as labels. Quantitative evaluation was performed on two additional XCAT phantoms with different contrast levels: a high-contrast phantom (H-Phantom, 10 realizations) and a low-contrast phantom (L-Phantom, 5 realizations). The proposed U-Net approach was compared with two triple energy window (TEW) methods (trapezoidal and triangular). At both contrast levels, the U-Net-based approach achieved higher contrast recovery coefficients, myocardium-to-blood-pool ratios closer to the true-SC, higher contrast-to-noise ratios, and lower relative noise than the TEW methods. In addition, the U-Net-based method produced images with higher structural similarity and lower normalized mean square error relative to the true-SC reference than the TEW-corrected images. In conclusion, the proposed 3D U-Net-based scatter correction method provides more accurate scatter estimation and superior quantitative performance for self-collimating SPECT systems than conventional TEW approaches.
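The two TEW baselines mentioned above estimate the scatter counts in the photopeak window from one or two narrow flanking windows. A minimal sketch of both estimators is given below; the window widths and counts are illustrative assumptions, not values from the paper.

```python
# Sketch of the two triple-energy-window (TEW) scatter estimators used as
# baselines. All numeric values below are illustrative, not from the paper.

def tew_trapezoidal(c_lower, c_upper, w_lower, w_upper, w_main):
    """Trapezoidal TEW: scatter in the main window is approximated by the
    area of a trapezoid whose heights are the count rates (counts per keV)
    in the lower and upper flanking windows."""
    return 0.5 * w_main * (c_lower / w_lower + c_upper / w_upper)

def tew_triangular(c_lower, w_lower, w_main):
    """Triangular TEW: assumes the scatter spectrum falls linearly to zero
    at the upper edge of the main window, so only the lower flanking
    window is used."""
    return 0.5 * w_main * (c_lower / w_lower)

if __name__ == "__main__":
    # Illustrative numbers: a 28 keV photopeak window with 4 keV side
    # windows recording 120 (lower) and 30 (upper) counts in one pixel.
    trap = tew_trapezoidal(120.0, 30.0, 4.0, 4.0, 28.0)
    tri = tew_triangular(120.0, 4.0, 28.0)
    print(trap, tri)  # 525.0 420.0
```

The estimated scatter is then subtracted from the photopeak counts pixel by pixel; the U-Net approach instead learns this correction directly from simulated uncorrected/true-SC image pairs.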
