Continuous spatial-temporal deformable image registration and 4D frame interpolation


Abstract

BACKGROUND: Deformable image registration (DIR) is a crucial tool in radiotherapy for analyzing anatomical changes and motion patterns. Current DIR implementations rely on discrete volumetric motion representations, which often compromise accuracy and introduce uncertainty when handling significant anatomical changes and sliding boundaries. This limitation affects the reliability of subsequent contour propagation and dose accumulation procedures, particularly in regions with complex anatomical interfaces such as the lung-chest wall boundary. PURPOSE: Given that organ motion is inherently a continuous process in both space and time, we aimed to develop a model that preserves these fundamental properties. Drawing inspiration from fluid mechanics, we propose a novel approach using implicit neural representation (INR) for continuous modeling of patient anatomical motion. This approach ensures spatial and temporal continuity while effectively unifying Eulerian and Lagrangian specifications to enable natural continuous motion modeling and frame interpolation. The integration of these specifications provides a more comprehensive understanding of anatomical deformation patterns. METHODS: We propose an INR-based approach that models motion continuously in both space and time, named continuous sPatial-temporal deformable image registration (CPT-DIR). This method fits a multilayer perceptron network to map a 3D coordinate (x, y, z) to its corresponding velocity vector (vx, vy, vz). Displacement vectors (Δx, Δy, Δz) are then calculated by integrating the velocity vectors over time using a forward Euler numerical scheme. This spatially and temporally continuous motion design also enables continuous frame interpolation (CPT-Interp). Registration and interpolation performance were evaluated on the DIR-Lab dataset and the Abdominal-DIR-QA dataset, using metrics of landmark accuracy (target registration error, TRE), contour conformity (Dice), and image similarity (mean absolute error, MAE).
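The core mechanism described in METHODS, i.e. a network mapping spatial coordinates to velocities whose forward Euler integration yields displacements, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the analytic `velocity` function is a hypothetical stand-in for the learned MLP, and the step count and integration interval are assumed values.

```python
import numpy as np

def velocity(points, t):
    """Hypothetical stand-in for the coordinate -> velocity MLP.

    Takes an (N, 3) array of (x, y, z) coordinates and a pseudo-time t,
    and returns an (N, 3) array of (vx, vy, vz) velocity vectors.
    Here a simple rotation-like field replaces the learned network.
    """
    vx = -0.1 * points[:, 1]
    vy = 0.1 * points[:, 0]
    vz = np.zeros(len(points))
    return np.stack([vx, vy, vz], axis=1)

def euler_displacement(points, n_steps=50, t_end=1.0):
    """Integrate velocities with the forward Euler scheme to obtain
    total displacement vectors (dx, dy, dz) for each input point."""
    dt = t_end / n_steps
    current = points.astype(float).copy()
    t = 0.0
    for _ in range(n_steps):
        # Euler step: advance each point along its current velocity.
        current = current + dt * velocity(current, t)
        t += dt
    return current - points  # displacement = final position - start

pts = np.array([[10.0, 0.0, 5.0], [0.0, 10.0, 5.0]])
disp = euler_displacement(pts)
print(disp.shape)  # (2, 3)
```

Because the integrator can stop at any intermediate t, the same machinery supports interpolating frames at arbitrary time points, which is the idea behind CPT-Interp.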
RESULTS: CPT-DIR clearly reduced landmark TRE from 2.79 ± 1.88 to 0.99 ± 1.07 mm on DIR-Lab and from 8.61 ± 7.92 to 4.79 ± 6.28 mm on the challenging Abdominal-DIR-QA dataset, surpassing B-spline results across all cases. The whole-body region MAE improved from 35.46 ± 46.99 to 28.99 ± 32.70 HU for DIR-Lab, and from 37.32 ± 18.69 to 20.65 ± 16.39 HU for Abdominal-DIR-QA. In the challenging sliding boundary region, CPT-DIR demonstrated superior performance compared to B-spline, reducing ribcage MAE from 75.40 ± 86.70 HU (unregistered) to 42.04 ± 45.60 HU and improving Dice coefficients from 89.30% to 90.56%. The training-free CPT-Interp method enhanced previous deep learning-based approaches, improving upon UVI-Net with reduced MAE (17.88 ± 3.79 vs. 18.93 ± 3.90) and increased peak signal-to-noise ratio (PSNR) (40.26 ± 1.58 vs. 39.76 ± 1.48), while eliminating training dataset dependencies. Both CPT-DIR and CPT-Interp achieved substantial computational efficiency, completing operations in under 3 s compared to the several minutes required by conventional B-spline methods. CONCLUSION: By leveraging continuous representations, the CPT-DIR method enhances registration and interpolation accuracy, automation, and speed. The method achieves high accuracy on intra-fractional thoracic datasets and demonstrates improved performance over conventional methods in more challenging inter-fractional abdominal registration scenarios, highlighting its potential for robust applications in radiotherapy. The improved efficiency and accuracy of CPT-DIR make it particularly suitable for real-time adaptive radiotherapy applications.
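The evaluation metrics reported above (TRE, Dice, MAE) have standard definitions that can be summarized in a short sketch. This is an illustrative version under simple assumptions: landmarks are (N, 3) arrays in voxel indices with an assumed isotropic-or-anisotropic `spacing` in mm, and masks are binary arrays; it does not reproduce the paper's exact evaluation pipeline.

```python
import numpy as np

def tre(moved_landmarks, target_landmarks, spacing=(1.0, 1.0, 1.0)):
    """Target registration error: per-landmark Euclidean distance (mm)
    between propagated and reference landmark positions."""
    diff = (moved_landmarks - target_landmarks) * np.asarray(spacing)
    return np.linalg.norm(diff, axis=1)

def mae(img_a, img_b):
    """Mean absolute error (in HU for CT) between two volumes."""
    return float(np.mean(np.abs(img_a - img_b)))

def dice(mask_a, mask_b):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|) for binary masks."""
    inter = np.logical_and(mask_a, mask_b).sum()
    return 2.0 * inter / (mask_a.sum() + mask_b.sum())

a = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
b = np.array([[1.0, 0.0, 0.0], [1.0, 1.0, 3.0]])
print(tre(a, b))  # [1. 2.]
```

Lower TRE and MAE and higher Dice indicate better registration, which is the direction of every improvement quoted in the RESULTS paragraph.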
