Synthesis of MR fingerprinting information from magnitude-only MR imaging data using a parallelized, multi network U-Net convolutional neural network


Abstract

BACKGROUND: MR fingerprinting (MRF) is a novel method for quantitative assessment of in vivo MR relaxometry that has shown high precision and accuracy. However, the method requires data acquisition using customized, complex acquisition strategies and dedicated post-processing methods, thereby limiting its widespread application.

OBJECTIVE: To develop a deep learning (DL) network for synthesizing MRF signals from conventional magnitude-only MR imaging data and to compare the results to the actual acquired MRF signal.

METHODS: A U-Net DL network was developed to synthesize MRF signals from magnitude-only 3D T1-weighted brain MRI data acquired from 37 volunteers aged 21 to 62 years. Network performance was evaluated by comparing the relaxometry data (T1, T2) generated from dictionary matching of the DL-synthesized and actual MRF data across 47 segmented anatomic regions. Clustered bootstrapping with 10,000 bootstrap samples followed by calculation of the concordance correlation coefficient was performed for both the T1 and T2 MRF data pairs. The 95% confidence limits and the mean difference between true and DL relaxometry values were also calculated.

RESULTS: The concordance correlation coefficients (and 95% confidence limits) for the T1 and T2 MRF data pairs over the 47 anatomic segments were 0.8793 (0.8136-0.9383) and 0.9078 (0.8981-0.9145), respectively. The mean differences (and 95% confidence limits) were 48.23 (23.0-77.3) ms and 2.02 (-1.4 to 4.8) ms.

CONCLUSION: It is possible to synthesize MRF signals from MRI data using a DL network, creating the potential for quantitative relaxometry assessment without the need for a dedicated MRF pulse sequence.
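The dictionary matching step mentioned in the methods pairs each (synthesized or measured) signal evolution with the closest precomputed dictionary entry, typically via a normalized inner product, and reads off the T1/T2 of the best match. A minimal sketch of that matching step, with hypothetical array shapes and parameter grids (none of these names come from the paper):

```python
import numpy as np

def dictionary_match(signals, dictionary, t1_grid, t2_grid):
    """Match each signal evolution to its closest dictionary atom.

    signals:    (N, T) array, one time course per voxel
    dictionary: (D, T) array, one simulated time course per (T1, T2) pair
    t1_grid, t2_grid: (D,) arrays giving the T1/T2 of each dictionary atom
    Returns per-voxel T1 and T2 estimates.
    """
    # Normalize so the match depends on signal shape, not amplitude
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signals / np.linalg.norm(signals, axis=1, keepdims=True)
    # Best match = largest absolute normalized inner product
    idx = np.argmax(np.abs(s @ d.T), axis=1)
    return t1_grid[idx], t2_grid[idx]
```

In practice the dictionary is simulated from the MRF sequence parameters (e.g. via Bloch or EPG simulation); the sketch above only shows the matching, not the simulation.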
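The agreement statistic reported in the results is Lin's concordance correlation coefficient, which penalizes both poor correlation and systematic offset between the two measurement series. A minimal sketch of how it could be computed for a set of paired T1 (or T2) values (the function name is illustrative, not from the paper):

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between paired measurements.

    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2),
    using population (biased) variances and covariance as in Lin (1989).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    sxy = ((x - mx) * (y - my)).mean()
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)
```

The clustered bootstrap in the paper would repeatedly resample anatomic segments with replacement and recompute this statistic to obtain the reported 95% confidence limits.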
