Abstract
BACKGROUND: Limited-angle cone-beam CT (LA-CBCT) reduces imaging time and dose but suffers from severe under-sampling artifacts and distortions. 2D-3D deformable registration mitigates this issue by estimating LA-CBCTs through the deformation of a prior, fully-sampled CT/CBCT, using deformation vector fields (DVFs) optimized against limited-angle cone-beam projections. Population-trained 2D-3D registration networks enable fast inference but face accuracy challenges, particularly under varying limited-angle scan directions. On the other hand, patient-specific models are more adaptable but typically require considerable runtimes to optimize model parameters from scratch for each case. PURPOSE: To improve the accuracy and efficiency of 2D-3D registration-driven LA-CBCT estimation, a hybrid 2D-3D deformable registration framework was proposed. METHODS: The hybrid population-based and patient-specific 2D-3D deformable registration framework (HB-2D3DReg) combined the strengths of population-based and patient-specific approaches while mitigating their respective limitations. It integrated the fast inference of population-trained models with the test-time adaptability of patient-specific models through a two-stage approach. First, a population-based 2D-3D registration network, 2D3D-RegNet, was trained on a cohort dataset in an unsupervised manner, with a similarity loss defined between digitally reconstructed radiographs (DRRs) of the estimated LA-CBCTs and the limited-angle 2D projections. Then, a 2D-3D registration network based on implicit neural representation (INR), 2D3D-INR, refined the DVFs solved by the population-based model during test time for each independent testing case. The population-based 2D3D-RegNet accelerated the optimization of the patient-specific 2D3D-INR and reduced the latter's chance of being trapped in a local optimum, while the patient-specific network, in turn, enhanced the accuracy of the population-based model.
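The two-stage idea can be sketched in toy form. This is a minimal, hypothetical illustration only: the sum projection, nearest-neighbour warp, and random perturbation search below are simplifications standing in for ray-traced DRRs, trilinear warping, and the paper's gradient-based INR optimization; all function names are invented for this sketch.

```python
import numpy as np

def drr(vol, axis):
    # Sum projection: a crude stand-in for a digitally reconstructed radiograph.
    return vol.sum(axis=axis)

def warp(vol, dvf):
    # Nearest-neighbour warp of a 3D volume by a DVF of shape (3, *vol.shape).
    idx = np.indices(vol.shape) + np.rint(dvf).astype(int)
    for d, n in enumerate(vol.shape):
        idx[d] = np.clip(idx[d], 0, n - 1)
    return vol[tuple(idx)]

def projection_loss(est, projections, axes):
    # Similarity loss between DRRs of the estimated volume and the
    # measured limited-angle projections (MSE over each view).
    return sum(np.mean((drr(est, a) - p) ** 2) for a, p in zip(axes, projections))

def refine_dvf(prior, projections, axes, dvf_init, iters=200, step=0.5, seed=0):
    # Stage 2: test-time refinement of the DVF produced by the population
    # model.  A random perturbation search stands in for INR optimization;
    # it only ever accepts updates that lower the projection loss.
    rng = np.random.default_rng(seed)
    dvf = dvf_init.copy()
    best = projection_loss(warp(prior, dvf), projections, axes)
    for _ in range(iters):
        trial = dvf + step * rng.standard_normal(dvf.shape)
        loss = projection_loss(warp(prior, trial), projections, axes)
        if loss < best:
            dvf, best = trial, loss
    return dvf, best

# Toy run: a zero DVF plays the role of the population model's initial guess.
prior = np.zeros((8, 8, 8))
prior[2:5, 2:5, 2:5] = 1.0
target = np.roll(prior, 2, axis=0)          # "true" deformed anatomy
axes = (1, 2)                                # two limited-angle views
projections = [drr(target, a) for a in axes]
dvf0 = np.zeros((3, 8, 8, 8))
dvf, final_loss = refine_dvf(prior, projections, axes, dvf0)
```

Because refinement only accepts loss-decreasing updates, the final projection loss is never worse than that of the initial (stage-one) DVF, mirroring how 2D3D-INR can only improve on the 2D3D-RegNet initialization.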
HB-2D3DReg was evaluated using a dataset of 48 4D-CTs, 26 of which were used to train the population-based model and 22 for testing. Different limited-angle scan scenarios, featuring varying scan directions and angles, were assessed. RESULTS: HB-2D3DReg attained superior LA-CBCT estimation and registration accuracy. Under an orthogonal-view 90° scan (45° each) with varying scan directions, HB-2D3DReg achieved a mean (± S.D.) image relative error of 7.99 ± 2.16% and target registration error of 3.70 ± 1.94 mm, compared to 15.40 ± 2.41% and 8.52 ± 3.31 mm (no registration), 9.82 ± 2.12% and 6.38 ± 2.46 mm (2D3D-RegNet only), and 9.71 ± 2.33% and 5.01 ± 2.77 mm (2D3D-INR only) on the DIR-lab dataset. HB-2D3DReg took ∼3 min to optimize at test time, compared to 13 min for the 2D3D-INR method. CONCLUSION: HB-2D3DReg achieved accurate and robust 2D-3D deformable registration for LA-CBCT estimation, enabling efficient anatomy monitoring to guide radiotherapy. The code will be released at: https://github.com/sanny1226/HB-2D3DReg.