HYPER-Net: Physics-Conditioned Self-Supervised Reconstruction for Fourier Light-Field Microscopy


Abstract

The rapid convergence of optical innovation and machine intelligence is reshaping biological imaging by enabling platforms that jointly advance image formation and computational reconstruction for high-speed, high-resolution volumetric microscopy. However, broadly accessible three-dimensional imaging at high spatiotemporal resolution remains limited by the reliance of existing supervised methods on large modality-matched training datasets, the computational burden of conventional iterative reconstruction, and sensitivity to optical mismatch arising from small deviations in the spatial-angular point-spread functions. Here, we introduce HYPER-Net, a physics-conditioned self-supervised framework for Fourier light-field microscopy that integrates scan-free volumetric acquisition with fast, robust three-dimensional reconstruction. HYPER-Net incorporates experiment-specific point-spread functions into the learning process in two complementary roles: as the forward operator that enforces measurement consistency and as a conditioning signal that adaptively modulates intermediate feature representations. This design reduces reliance on paired experimental ground-truth volumes, improves robustness to system variation, and enables generalizable reconstruction across diverse biological contexts. Using human colon organoids, embryonic Xenopus laevis hearts, hiPSC-derived cardiac spheroids, and freely moving Caenorhabditis elegans, we demonstrate high-fidelity volumetric imaging of tissue morphology, cardiac function, calcium-contraction coupling, and locomotion-associated neural and muscular dynamics. These results position HYPER-Net as a versatile framework for rapid volumetric imaging and quantitative analysis of dynamic biological systems across basic research and biomedical applications.
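The two complementary roles of the point-spread function described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; all function names are hypothetical, and it assumes a simple linear imaging model in which each depth slice of the volume is convolved with a depth-specific PSF and summed onto the sensor, with FiLM-style feature modulation standing in for the conditioning pathway.

```python
import numpy as np

def forward_operator(volume, psfs):
    """PSF role 1 (forward operator): project a 3-D volume estimate to a
    2-D light-field measurement by convolving each depth slice with its
    depth-specific PSF (circular convolution via FFT) and summing."""
    Z, H, W = volume.shape
    image = np.zeros((H, W))
    for z in range(Z):
        image += np.real(np.fft.ifft2(np.fft.fft2(volume[z]) * np.fft.fft2(psfs[z])))
    return image

def measurement_consistency_loss(volume_est, psfs, measurement):
    """Self-supervised objective: re-project the network's volume estimate
    through the known PSFs and penalize disagreement with the raw camera
    measurement -- no paired ground-truth volume is required."""
    residual = forward_operator(volume_est, psfs) - measurement
    return np.mean(residual ** 2)

def film_condition(features, psf_embedding, W_gamma, W_beta):
    """PSF role 2 (conditioning signal): FiLM-style modulation in which a
    PSF embedding predicts per-channel scale (gamma) and shift (beta)
    applied to intermediate feature maps. W_gamma/W_beta are hypothetical
    learned projection matrices."""
    gamma = W_gamma @ psf_embedding
    beta = W_beta @ psf_embedding
    return features * gamma[:, None, None] + beta[:, None, None]
```

As a sanity check, re-projecting the true volume through the same PSFs yields a zero measurement-consistency loss, so the objective is minimized exactly when the estimate explains the measurement under the assumed physics.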
