Abstract
Spiking Neural Networks (SNNs), designed to model the brain's neurobiological processes more closely, have been proposed as energy-efficient alternatives to conventional Artificial Neural Networks (ANNs), which typically incur high computational and energy costs. However, the energy efficiency and computational savings afforded by SNNs often come at the expense of reduced classification performance. Recent studies have investigated incorporating attention mechanisms into SNNs to improve their classification performance, but these approaches typically repurpose attention mechanisms originally developed for conventional ANNs and therefore fail to fully exploit the spike-based encoding intrinsic to spiking neuron dynamics. To address this challenge, we propose the Biologically Inspired Attention Spiking Neural Network (BIASNN), a novel SNN architecture for image classification. BIASNN introduces a biologically inspired attention mechanism that integrates adaptive leaky integrate-and-fire neurons with components from established attention models. This mechanism is embedded in an existing SNN architecture built on leaky integrate-and-fire neurons, enhancing biological fidelity by combining multiple spiking neuron models within a single network. Experiments on benchmark image classification datasets demonstrate that BIASNN achieves high classification accuracy using only four timesteps. By enabling more biologically plausible attention mechanisms, BIASNN advances deep spiking neural networks toward more brain-like processing.