Abstract
Computational models of the MLG1 neurons of the crab Neohelice granulata have been developed to detect and spatially localize looming stimuli. However, existing models suffer significant performance degradation in dim scenarios, primarily because the visual signal is corrupted by stochastic noise such as photon shot noise. To address this challenge, we propose a computational framework that embeds the Daubechies wavelet transform directly into the ON/OFF visual pathways. The ON/OFF mechanism separates the input signal into parallel channels according to luminance changes, capturing the dynamic differences between target and background. Embedding the Daubechies wavelet enables multi-scale frequency decomposition, allowing the model to suppress high-frequency noise while preserving the low-frequency looming trend. The decomposition yields low-frequency approximation components and high-frequency detail components, providing the MLG1 neuron with more discriminative feature inputs. Experimental results demonstrate that the model achieves reliable spatial localization of looming stimuli under extremely low-contrast conditions, offering a robust methodology for bionic vision in extremely dim environments.
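The key property the abstract relies on is that a Daubechies decomposition concentrates a slow looming trend in the low-frequency (approximation) channel while high-frequency noise lands in the detail channel. The following is a minimal illustrative sketch of that separation, not the paper's implementation: it builds the standard db2 analysis filters by hand in NumPy and applies one decomposition level to a hypothetical 1-D luminance trace (a slow ramp plus shot-like noise, both invented here for illustration).

```python
import numpy as np

# Daubechies db2 analysis filters (4 taps). db2 has two vanishing
# moments, so its high-pass (detail) channel annihilates constant
# and linear trends exactly.
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))  # low-pass
g = h[::-1] * np.array([1, -1, 1, -1])  # high-pass (quadrature mirror)

def dwt_level(x):
    """One db2 DWT level: filter, then downsample by 2.
    'valid' mode discards boundary samples for simplicity."""
    approx = np.convolve(x, h[::-1], mode="valid")[::2]
    detail = np.convolve(x, g[::-1], mode="valid")[::2]
    return approx, detail

# Toy luminance trace: a slow ramp standing in for a looming trend,
# plus additive noise standing in for photon shot noise (both are
# illustrative assumptions, not data from the paper).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 256)
clean = 2.0 * t
noisy = clean + 0.05 * rng.standard_normal(t.size)

a_clean, d_clean = dwt_level(clean)
a_noisy, d_noisy = dwt_level(noisy)

# The looming trend survives in the approximation coefficients,
# while the detail coefficients of the noisy trace carry almost
# pure noise and could be thresholded away.
print(np.max(np.abs(d_clean)))  # essentially zero: vanishing moments
print(np.std(d_noisy))          # on the order of the noise level
```

Because the db2 filters are orthonormal, the detail coefficients of the noisy trace have roughly the same standard deviation as the injected noise, which is what makes simple detail-channel thresholding an effective denoising step before the signal reaches the model neuron.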