Abstract
Spatial phase prediction from an intensity image without auxiliary sensors is challenging because the phase information is compressed and the complexity grows with varying optical systems. Many recent works focus primarily on adaptability to extended light sources but have yet to construct a feature map robust and expressive enough to represent the characteristics of the aberration, leading to problems such as a minimum light-intensity limit, a small input size, and failure to remain optimal across different phase ranges. To address these problems, we apply both the Fourier transform and a multi-scale wavelet transform in the feature-mapping process, constructing a high-frequency-concentrated aberration feature sequence (AFS). To better match the learning complexity, we formulate an effective and lightweight model, the aberration sensing convolutional neural network (ASCNN), within a supervised framework that allows efficient online computation. The results demonstrate that the proposed AFS improves performance by around 30% over the baseline feature. Moreover, ASCNN achieves high accuracy in aberration retrieval, with an average RMSE below 0.0143λ under an initial condition of ±0.25λ and an SSIM exceeding 96% between the reconstructed and ground-truth phases. Finally, the proposed method is evaluated on both a scalar diffraction system and a polarized SLM system, demonstrating its robustness and broad potential for applications.
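To make the feature-mapping idea concrete, the following is a minimal sketch of how a Fourier spectrum and multi-scale wavelet detail bands could be combined into a single 1-D feature sequence. The helper names, the choice of a Haar wavelet, the decomposition depth, and the log-magnitude normalisation are all assumptions for illustration; the paper's actual AFS construction may differ.

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar wavelet transform (illustrative helper,
    not the paper's exact wavelet basis). Returns the approximation
    band and the three high-frequency detail bands."""
    a = (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4
    h = (img[0::2, 0::2] - img[1::2, 0::2] + img[0::2, 1::2] - img[1::2, 1::2]) / 4
    v = (img[0::2, 0::2] + img[1::2, 0::2] - img[0::2, 1::2] - img[1::2, 1::2]) / 4
    d = (img[0::2, 0::2] - img[1::2, 0::2] - img[0::2, 1::2] + img[1::2, 1::2]) / 4
    return a, (h, v, d)

def aberration_feature_sequence(intensity, levels=2):
    """Sketch of a high-frequency-concentrated feature sequence:
    log-magnitude Fourier spectrum plus multi-scale wavelet detail
    bands, flattened into one 1-D vector (assumed layout)."""
    spectrum = np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(intensity))))
    feats = [spectrum.ravel()]
    approx = intensity
    for _ in range(levels):
        approx, (h, v, d) = haar_dwt2(approx)
        # Keep only the high-frequency detail bands at each scale.
        feats += [h.ravel(), v.ravel(), d.ravel()]
    return np.concatenate(feats)

img = np.random.default_rng(0).random((64, 64))
afs = aberration_feature_sequence(img, levels=2)
print(afs.shape)  # 64*64 spectrum + 3*32*32 + 3*16*16 detail coefficients
```

Such a sequence could then serve as the input to a compact convolutional model, with the wavelet detail bands emphasising the high-frequency structure that intensity-only phase retrieval depends on.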