Evaluation of frame-based and event-by-event motion-correction methods for awake monkey brain PET imaging



Abstract

PET imaging of nonhuman primates (NHPs) requires correction of head motion if the subjects are scanned awake with their heads unrestrained, because NHPs move their heads faster and more frequently than human subjects. This work focuses on designing and validating two motion-correction algorithms for awake NHP brain PET imaging.

METHODS: Two motion-correction methods were implemented for awake NHP brain PET imaging: multiacquisition frame (MAF) and event-by-event (EBE). Motion data were acquired with an external motion-tracking device. The MAF method divides the scan data into short subframes, reconstructs each subframe individually, and registers the reconstructions to a reference orientation. Because a minimum frame duration is usually required, this method suffers from residual intraframe motion and from data loss when motion is large. The EBE method, previously implemented for a human brain scanner and adapted here for a small-animal PET scanner, eliminates intraframe motion and should therefore achieve the best accuracy. We first evaluated the accuracy of both motion-correction methods with moving-phantom scans. Both methods were then applied to awake NHP brain PET studies with a gamma-aminobutyric acid A-benzodiazepine receptor ligand, ¹¹C-flumazenil, and the reconstructed images were compared with those from a motion-free anesthetized study.

RESULTS: The phantom studies showed that EBE motion correction recovers contrast to within 3% of the static study, whereas MAF motion correction with the standard algorithm settings showed a 25% reduction in contrast relative to the static case. In awake NHP brain PET imaging, EBE motion correction recovers fine structures better than the MAF method, as judged against the anesthetized studies.
CONCLUSION: The large magnitude and high frequency of NHP head motion suggest that EBE motion correction with accurate externally measured motion data can noticeably alleviate the image blurring caused by intraframe motion in the MAF motion-correction method.
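The core idea of EBE correction described above is that each coincidence event is repositioned individually: the two detector endpoints of its line of response (LOR) are mapped back to the reference head pose using the rigid-body transform measured at that event's time stamp, so no intraframe motion remains. The sketch below illustrates this in Python; the function names, the event tuple layout, and the `motion` callback are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rigid_transform(point, R, t):
    """Apply a rigid-body transform (3x3 rotation R, translation t) to a 3-D point."""
    return R @ np.asarray(point, dtype=float) + t

def ebe_correct(events, motion):
    """Event-by-event motion correction (illustrative sketch).

    events : iterable of (timestamp, p1, p2), where p1 and p2 are the two
             LOR endpoints of one coincidence event.
    motion : callable mapping a timestamp to (R, t), the externally measured
             rigid transform that moves the head pose at that instant back
             to the reference orientation.
    Returns the list of corrected (p1, p2) endpoint pairs.
    """
    corrected = []
    for t_ev, p1, p2 in events:
        R, t = motion(t_ev)  # pose correction valid at this event's time
        corrected.append((rigid_transform(p1, R, t),
                          rigid_transform(p2, R, t)))
    return corrected
```

In contrast, an MAF scheme would apply a single transform to an entire subframe's reconstruction, so any motion within the subframe is left uncorrected, which is the source of the residual blurring discussed above.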
