Abstract
Breast cancer remains the most common cancer type among women, with invasive ductal carcinoma (IDC) accounting for almost 80% of cases. Accurate histopathological segmentation of IDC is a prerequisite for diagnosis, but manual examination of hematoxylin and eosin (H&E) stained slides is time-consuming and prone to interobserver variability. This work presents an automated IDC segmentation method based on a lightweight hybrid deep learning framework that integrates a U-Net with a MobileNetV2 encoder and a label propagation refinement module. The hybrid model leverages MobileNetV2's efficient depthwise-separable convolutions for feature extraction and U-Net's encoder-decoder architecture for precise boundary localization, while the label propagation step enhances spatial smoothness and anatomical consistency. Experiments are conducted on the BACH 2018 and BreakHis datasets at multiple magnification levels (40×, 100×, and 200×). The model achieves a precision of 94.85%, Dice coefficient of 94.63%, F1-score of 94.56%, and AUC of 94.65% on the BACH dataset, and a precision of 93.87%, Dice of 94.24%, F1-score of 94.18%, and AUC of 93.93% on the BreakHis dataset. It surpasses several state-of-the-art CNN- and transformer-based models, including DeepLabV3, Mask R-CNN, Swin-UNet, and ViT-Histo. Cross-dataset validation yields a Dice of 92.10% and AUC of 93.70% in the BACH → BreakHis direction, confirming robustness under domain shift. Explainable AI analyses using Grad-CAM and SHAP confirm accurate localization of diagnostically relevant regions. The proposed MobileNetV2 + U-Net hybrid with label propagation offers a computationally efficient and clinically reliable solution toward real-time, AI-assisted breast cancer histopathology.
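The label propagation refinement step described above can be illustrated with a minimal NumPy sketch: each pixel's foreground probability is iteratively blended with the mean of its 4-connected neighbours before thresholding, which suppresses isolated misclassified pixels. The blending weight `alpha`, iteration count, and 4-neighbour scheme here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def label_propagation_refine(prob, alpha=0.3, iters=10):
    """Refine a foreground-probability map by neighbourhood propagation.

    prob  : 2-D array of per-pixel foreground probabilities in [0, 1]
    alpha : weight anchoring each pixel to its original probability
            (hypothetical parameterisation for illustration)
    iters : number of propagation iterations
    Returns a binary mask (uint8) after thresholding at 0.5.
    """
    p = prob.astype(float).copy()
    for _ in range(iters):
        # Mean of the 4-connected neighbours, computed via padded shifts;
        # edge padding keeps border pixels consistent with themselves.
        padded = np.pad(p, 1, mode="edge")
        neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        # Blend the original evidence with the propagated neighbourhood term.
        p = alpha * prob + (1.0 - alpha) * neigh
    return (p > 0.5).astype(np.uint8)

# Usage: a confident foreground region with one noisy low-probability pixel
# is smoothed into a spatially consistent mask.
prob = np.full((8, 8), 0.9)
prob[4, 4] = 0.1          # isolated noisy pixel inside the region
mask = label_propagation_refine(prob)
```

In a full pipeline, `prob` would be the sigmoid/softmax output of the MobileNetV2 + U-Net model rather than a synthetic array.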