DRIPS: Domain Randomisation for Image-based Perivascular spaces Segmentation


Abstract

Perivascular spaces (PVS) are emerging as sensitive imaging markers of brain health. Yet, accurate out-of-sample PVS segmentation remains challenging since existing methods are modality-specific, require dataset-specific tuning, or rely on manual labels for (re-)training. We propose DRIPS (Domain Randomisation for Image-based PVS Segmentation), a physics-inspired framework that integrates anatomical and shape priors with a physics-based image generation process to produce synthetic brain images and labels for on-the-fly deep learning model training. By introducing variability through resampling, geometric and intensity transformations, and simulated artefacts, it generalises well to real-world data. We evaluated DRIPS on MRI data from five cohorts spanning diverse health conditions (N = 165; T1w and T2w, isotropic and anisotropic imaging) and on a 3D ex vivo brain model reconstructed from histology. We evaluated its performance using the area under the precision-recall curve (AUPRC) and Dice similarity coefficient (DSC) against manual segmentations and compared it with classical and deep learning methods, including Frangi, RORPO, SHIVA-PVS, and nnU-Net. Only DRIPS and Frangi achieved AUPRC values above chance across all cohorts and the ex vivo model. On isotropic data, DRIPS and nnU-Net performed comparably, outperforming the next-best method by a median of +0.17-0.39 AUPRC and +0.09-0.26 DSC. On anisotropic data, DRIPS outperformed all competitors by a median of +0.13-0.22 AUPRC and +0.07-0.14 DSC. Importantly, its performance was not associated with white matter hyperintensity burden. DRIPS delivers accurate, fully automated PVS segmentation across heterogeneous imaging settings, reducing the need for manual labels, modality-specific models, or cohort-dependent tuning.
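The two evaluation metrics named above can be computed voxel-wise from a predicted probability map and a manual binary mask. The sketch below is a minimal illustration, not the paper's evaluation code: the AUPRC here is estimated via the standard average-precision sum, and the toy arrays stand in for flattened 3D volumes.

```python
import numpy as np

def dice(pred, gt):
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred, gt = np.asarray(pred, bool), np.asarray(gt, bool)
    inter = np.logical_and(pred, gt).sum()
    denom = pred.sum() + gt.sum()
    return 2.0 * inter / denom if denom else 1.0

def auprc(prob, gt):
    """Area under the precision-recall curve, estimated as average
    precision: sum over ranked voxels of (R_i - R_{i-1}) * P_i."""
    y = np.asarray(gt, bool).ravel()
    order = np.argsort(-np.asarray(prob, float).ravel())
    y = y[order]
    tp = np.cumsum(y)
    fp = np.cumsum(~y)
    precision = tp / (tp + fp)
    recall = tp / y.sum()
    return np.sum(np.diff(recall, prepend=0.0) * precision)

# Toy example: a probability map that exactly matches the manual labels
# yields perfect scores under both metrics.
gt = np.array([1, 1, 0, 0, 1, 0])
prob = gt.astype(float)
print(dice(prob > 0.5, gt), auprc(prob, gt))  # → 1.0 1.0
```

A chance-level AUPRC baseline (referred to in the abstract as "above chance") equals the fraction of positive voxels, which for sparse structures such as PVS is typically very small.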
