RISNet: A variable multi-modal image feature fusion adversarial neural network for generating specific dMRI images


Abstract

The b-value in diffusion magnetic resonance imaging (dMRI) reflects the degree to which water molecules in tissue are affected by the magnetic field gradient pulse; different b-values affect not only image contrast but also the accuracy of subsequent computation. The imbalance between the lower and higher b-value image categories in macaque dMRI brain imaging datasets substantially affects the accuracy of computational neuroscience analyses. Medical image translation methods based on generative adversarial networks can generate images at different b-values. However, the macaque brain dataset suffers from multi-center and small-sample problems, which limit the training of general-purpose models. To augment lower b-value dMRI data for macaques, we propose a variable multi-modal image feature fusion adversarial neural network called RISNet. The network uses the proposed rapid insertion structure (RIS) to feed features from different modalities into a shared residual decoding structure, enhancing the model's generalization ability. The RIS exploits the complementary strengths of multi-modal data: it allows the network to be quickly reconfigured, and it extracts and fuses the features of multi-modal inputs. We use a T1 image and a higher b-value image of the brain as model inputs to generate high-quality, lower b-value images. Experimental results show that our method improves the PSNR by 1.8211 on average and the SSIM by 0.0111 compared with other methods. In addition, in terms of qualitative observation and DTI estimation, our method also produces visually sound results and shows strong generalization ability. These advantages make our method an effective means of addressing the dMRI brain image translation problem in macaques and provide strong support for future neuroscience research.
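The abstract describes fusing features from two modalities (a T1 image and a higher b-value dMRI image) before a shared residual decoder. The paper does not give the exact RIS fusion operation, so the sketch below is only a minimal, hypothetical illustration of the general idea: per-modality feature maps are concatenated channel-wise and passed through a toy residual block. All function names and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_features(t1_feat, dwi_feat):
    """Hypothetical fusion step: channel-wise concatenation of two
    modality feature maps, each of shape (C, H, W)."""
    return np.concatenate([t1_feat, dwi_feat], axis=0)

def residual_block(x, weight):
    """Toy residual block: a 1x1 'convolution' (per-pixel channel mixing
    via einsum) followed by a skip connection and ReLU.
    weight has shape (C, C) where C is the fused channel count."""
    out = np.einsum('oc,chw->ohw', weight, x)  # channel mixing at each pixel
    return np.maximum(out + x, 0.0)            # skip connection + ReLU

# Example: fuse 4-channel T1 features with 4-channel dMRI features.
t1_feat = np.ones((4, 8, 8))
dwi_feat = np.ones((4, 8, 8))
fused = fuse_features(t1_feat, dwi_feat)       # shape (8, 8, 8)
decoded = residual_block(fused, np.eye(8))     # identity mixing for the demo
```

In an actual network the concatenation and residual blocks would be learned convolutional layers; this sketch only shows the data flow of concatenation-based multi-modal fusion into a residual stage.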
