Abstract
OBJECTIVE: To develop and validate a predictive model for cancer therapy-related cardiac dysfunction (CTRCD) in patients with breast cancer undergoing chemotherapy, targeted therapy, or immunotherapy.

METHODS: A retrospective analysis was conducted on 506 patients treated at Hunan Provincial People's Hospital between 2018 and 2023. Clinical and imaging biomarkers were evaluated, and Lasso-Cox regression was used to select predictors for a nomogram.

RESULTS: NT-proBNP (P < 0.001), left ventricular ejection fraction (LVEF; P = 0.003), and left atrial diameter (LA; P = 0.012) were significantly associated with CTRCD. Lasso-Cox regression identified eight significant predictors (all P < 0.05), which were incorporated into the nomogram. The model showed good discrimination in both the training cohort (AUC 0.82, 95% CI 0.78-0.86) and the validation cohort (AUC 0.79, 95% CI 0.74-0.83). Time-dependent ROC curves demonstrated consistent predictive accuracy at 4 weeks (AUC 0.80, P < 0.001), 8 weeks (AUC 0.81, P < 0.001), and 12 weeks (AUC 0.79, P = 0.002). Calibration curves indicated good agreement between predicted and observed risk (Hosmer-Lemeshow test P = 0.34), and decision curve analysis confirmed the model's clinical utility (net benefit > 15% across threshold probabilities).

CONCLUSION: This validated nomogram enables early CTRCD risk stratification (C-index 0.80, P < 0.001), supporting personalized monitoring for cardiotoxicity.
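The abstract reports model discrimination as a C-index of 0.80. As an illustrative sketch only (not code from the study, and using hypothetical toy data), Harrell's concordance index can be computed by counting, over all comparable patient pairs, how often the patient with the earlier event also has the higher predicted risk; this simplified version ignores tied event times:

```python
import numpy as np

def concordance_index(time, event, risk):
    """Harrell's C-index: fraction of comparable pairs where the
    earlier-event patient has the higher predicted risk score.
    Ties in risk score count as half-concordant."""
    time, event, risk = map(np.asarray, (time, event, risk))
    num, den = 0.0, 0.0
    for i in range(len(time)):
        if not event[i]:
            continue  # only observed events can anchor a comparable pair
        for j in range(len(time)):
            if time[j] > time[i]:  # patient j outlived patient i's event
                den += 1
                if risk[i] > risk[j]:
                    num += 1.0
                elif risk[i] == risk[j]:
                    num += 0.5
    return num / den

# hypothetical toy data: higher risk scores paired with earlier events
t = [5.0, 8.0, 12.0, 20.0]   # follow-up times (e.g. weeks)
e = [1, 1, 0, 1]             # 1 = CTRCD event observed, 0 = censored
r = [2.0, 1.5, 0.5, 0.2]     # model risk scores (linear predictor)
print(concordance_index(t, e, r))  # perfectly concordant -> 1.0
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the reported 0.80 indicates good separation of high- and low-risk patients.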