Abstract
Breast cancer (BC) remains a leading cause of cancer-related mortality among women worldwide, necessitating advances in diagnostic methods to improve early detection and treatment outcomes. This study proposes a novel twin-stream approach for histopathological image classification that combines histopathology-derived and vision-based features to enhance diagnostic precision. The first stream uses Virchow2, a deep learning model designed to extract high-level histopathological features, while the second stream employs Nomic, a vision-based transformer model, to capture spatial and contextual information. Fusing the two streams yields a comprehensive feature representation, enabling the model to achieve state-of-the-art performance on the BACH dataset. Experimental results demonstrate the superiority of the twin-stream approach, with a mean accuracy of 98.60% and specificity of 99.07%, significantly outperforming single-stream methods and related studies. Statistical analyses, including paired t-tests, ANOVA, and correlation studies, confirm the robustness and reliability of the model. The proposed approach not only improves diagnostic accuracy but also offers a scalable, efficient solution for clinical applications, addressing the challenges of resource constraints and growing diagnostic demand.
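The twin-stream fusion described above can be sketched as follows. This is a minimal illustration only: the embedding dimensions, the stand-in feature extractors, and the random linear classifier head are all placeholder assumptions, not the actual Virchow2 or Nomic models or the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical embedding sizes (assumptions for illustration only).
# BACH has four classes: normal, benign, in situ carcinoma, invasive carcinoma.
D_HISTO, D_VISION, N_CLASSES = 2560, 768, 4

def extract_histo_features(image):
    """Stand-in for the histopathology stream (Virchow2-like encoder)."""
    return rng.standard_normal(D_HISTO)

def extract_vision_features(image):
    """Stand-in for the vision stream (Nomic-like transformer encoder)."""
    return rng.standard_normal(D_VISION)

def fuse(histo_vec, vision_vec):
    """Late fusion by concatenation into one joint feature representation."""
    return np.concatenate([histo_vec, vision_vec])

# Toy classifier head: a random linear layer followed by softmax.
W = rng.standard_normal((N_CLASSES, D_HISTO + D_VISION)) * 0.01

def classify(fused_vec):
    logits = W @ fused_vec
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()               # class probabilities

image = None  # placeholder input; a real pipeline would pass an image tensor
probs = classify(fuse(extract_histo_features(image),
                      extract_vision_features(image)))
```

Concatenation is one common fusion choice; the paper's fusion mechanism may differ, and a trained classifier head would replace the random weights used here.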