Abstract
Studying biological processes across multi-millimeter scales requires imaging systems that combine high spatial resolution with a large field of view (FOV). However, optical aberrations degrade image quality, particularly in large-FOV systems where distortions worsen progressively toward the periphery. Existing methods for correcting field-dependent aberrations are limited and often impractical. Hardware-based solutions such as adaptive optics demand additional components and sample manipulation, increasing system complexity and experimental burden. Computational approaches, while promising, typically rely on exhaustive point spread function (PSF) calibration or large training datasets, restricting their applicability. We introduce a self-supervised algorithm that simultaneously estimates spatially varying aberrations and reconstructs 3D sample structure from a single blurred image, without PSF calibration or training data. We demonstrate our method on optically cleared mouse hippocampal neurons and cortical vasculature imaged with oblique plane microscopy (OPM), as well as on in vivo mouse retinal vasculature captured with adaptive optics scanning laser ophthalmoscopy (AOSLO), where PSF calibration is infeasible. The algorithm predicts and corrects aberrations across multi-millimeter FOVs, enhances contrast in low signal-to-noise regions, and reveals critical structural details in neuronal and vascular systems that are obscured in raw images. This approach enables high-resolution, large-scale imaging without hardware modifications, expanding the accessibility of advanced microscopy techniques.