Abstract
Automated recognition of handwritten text on bank cheques is crucial for streamlining financial transactions and reducing manual errors. However, traditional systems often encounter two significant challenges: differentiating overlapping handwritten and printed text, and restoring faded or partially damaged handwriting. This paper presents a multi-stage deep learning framework that addresses these issues to accurately extract essential cheque fields. The process begins with preprocessing to remove noise and enhance the cheque image. Overlapping regions of handwritten and printed text are then identified and segmented using a hybrid approach that combines pseudo-letters with height-based segmentation and intelligent character recognition. Key handwritten elements, such as the date, signature, name, and amount, are extracted using a fine-tuned Nanonet model. For severely faded handwriting, a restoration stage employing edge detection, contour construction, and texture inpainting is applied. Finally, a novel sigmoidal growing cosine intermap pooling-based convolutional neural network classifies cheques as genuine or forged, achieving a classification accuracy of 98.79%. Experimental results demonstrate that the proposed framework outperforms existing methods in segmentation accuracy, recognition robustness, and classification performance.