Local perturbation responses and checkerboard tests: Characterization tools for nonlinear MRI methods



Abstract

PURPOSE: Modern methods for MR image reconstruction, denoising, and parameter mapping are becoming increasingly nonlinear, black-box, and at risk of "hallucination." These trends mean that traditional tools for judging confidence in an image (visual quality assessment, point-spread functions (PSFs), g-factor maps, etc.) are less helpful than before. This paper describes and evaluates an approach that can help with assessing confidence in images produced by arbitrary nonlinear methods.

THEORY AND METHODS: We propose to characterize nonlinear methods by examining the images they produce before and after applying controlled perturbations to the measured data. This results in functions known as local perturbation responses (LPRs) that can provide useful insight into sensitivity, spatial resolution, and aliasing characteristics. LPRs can be viewed as generalizations of classical PSFs, and are very flexible: they can be applied to arbitrary nonlinear methods and arbitrary datasets across a range of different reconstruction, denoising, and parameter mapping applications. Importantly, LPRs do not require a ground truth image.

RESULTS: Impulse-based and checkerboard-pattern LPRs are demonstrated in image reconstruction and denoising scenarios. We observe that these LPRs provide insights into spatial resolution, signal leakage, and aliasing that are not available with other methods. We also observe that popular reference-based image quality metrics (e.g., mean-squared error and structural similarity) do not always correlate with good LPR characteristics.

CONCLUSIONS: LPRs are a useful tool that can be used to characterize and assess confidence in nonlinear MR methods, and provide insights that are distinct from and complementary to existing quality assessments.
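The core idea of an impulse-based LPR — apply a small controlled perturbation to the measured data, rerun the nonlinear method, and take the (scaled) difference of the outputs — can be sketched in a few lines. The snippet below is an illustrative reduction, not code from the paper: `soft_threshold` is a toy stand-in for an arbitrary nonlinear denoiser, and the perturbation size `eps` is an assumed hyperparameter. For a linear method this construction recovers a column of the classical PSF; for a nonlinear method the response depends on the data being perturbed, which is exactly what the LPR exposes.

```python
import numpy as np

def soft_threshold(y, t=0.1):
    """Toy nonlinear 'denoiser' standing in for an arbitrary method."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def impulse_lpr(method, y, j, eps=1e-3):
    """Impulse-based local perturbation response at data index j:
    the scaled difference between the method's output with and
    without a small impulse added to the measured data. No ground
    truth image is needed, only the data y and the method itself."""
    delta = np.zeros_like(y)
    delta[j] = eps
    return (method(y + delta) - method(y)) / eps

y = np.array([0.5, 0.05, -0.3, 0.0])  # toy "measured data"

# Response to an impulse at a sample well above the threshold:
# the perturbation passes through (response ~1 at index 0).
lpr_strong = impulse_lpr(soft_threshold, y, j=0)

# Response at a sample below the threshold: the perturbation is
# suppressed entirely, illustrating data-dependent sensitivity.
lpr_weak = impulse_lpr(soft_threshold, y, j=1)
```

Because the response differs depending on where the impulse lands, plotting such LPRs over many locations reveals spatial variations in effective resolution and signal leakage that a single global PSF cannot capture.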
