Abstract
INTRODUCTION: Breast tumors, predominantly benign, are a global health concern affecting women. Ultrasound-guided vacuum-assisted breast biopsy (VABB) systems are widely used for minimally invasive resection, but their reliance on surgeon experience, together with positioning challenges, hinders adoption in primary healthcare settings. Existing AI solutions often focus on static ultrasound image analysis and fail to meet real-time surgical demands.

METHODS: This study presents a real-time positioning system for breast tumor rotational resection based on an optimized YOLOv11n architecture to enhance surgical navigation accuracy. Ultrasound video data from 167 patients (116 for training, 33 for validation, and 18 for testing) were collected to train the model. The model's architecture was optimized across three major components: the backbone, the neck, and the detection head. Key innovations include integrating the MobileNetV4 Inverted Residual block and the MobileNetV4 Universal Inverted Bottleneck block to reduce model parameters and computational load while improving inference efficiency.

RESULTS: Compared with the baseline YOLOv11n, the optimized YOLOv11n+ model reduces parameters by 17.1% and FLOPs by 27.0% while increasing mAP50 for cutter-slot and tumor detection by 2.1%. Two clinical positioning algorithms (Surgical Method 1 and Surgical Method 2) were developed to accommodate diverse surgical workflows. The system comprises a deep neural network for target recognition and a real-time visualization module, enabling millisecond-level tracking, precise annotation, and intelligent prompts for optimal resection timing.

CONCLUSION: These findings provide technical support for minimally invasive breast tumor resection and hold promise for reducing reliance on surgical experience, thereby facilitating the application of this technique in primary healthcare institutions.
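The "intelligent prompts for optimal resection timing" mentioned above could, for instance, be driven by a bounding-box overlap test between the detected cutter slot and the detected tumor. The abstract does not specify the actual rule; the sketch below is illustrative only, and the names `iou`, `resection_prompt`, and the `threshold` value are assumptions:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def resection_prompt(cutter_slot_box, tumor_box, threshold=0.5):
    """Hypothetical rule: prompt the surgeon when the detected cutter slot
    overlaps the tumor by at least `threshold` IoU."""
    return iou(cutter_slot_box, tumor_box) >= threshold
```

In a real-time loop, a rule of this kind would be evaluated on each frame's detections, with the visualization module raising the prompt once the condition holds.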