Abstract
OBJECTIVES: This study aimed to evaluate the relationship between pre- and post-treatment changes in thyroid stimulating hormone receptor antibody (TRAb) levels and the efficacy of radioactive iodine (RAI) therapy for Graves' disease (GD). A decision tree model was also developed to predict treatment outcomes from variations in serum TRAb levels.

METHODS: A total of 728 patients were evaluated to investigate the association between TRAb level fluctuations and RAI treatment efficacy. A decision tree model was constructed using TRAb level changes at 3 and 6 months post-RAI to predict clinical outcomes.

RESULTS: Among the 728 patients, 326 (44.8%) achieved clinical remission. Patients with lower TRAb levels at 18 months post-RAI exhibited higher remission rates, particularly those whose TRAb levels returned to baseline. A greater decline in TRAb levels between 18 and 36 months post-RAI was also associated with improved treatment outcomes. Decision tree models based on TRAb level changes at 3 and 6 months post-RAI demonstrated predictive accuracies of 74.36% and 72.46%, respectively. Further analysis showed that patients with minimal early TRAb elevation after RAI were more likely to achieve remission.

CONCLUSIONS: Decision tree modeling identified early TRAb elevation patterns as strong predictors of RAI efficacy. Incorporating TRAb fluctuations into clinical assessment may support personalized (131)I dosing strategies, ultimately improving treatment outcomes for GD patients.
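The decision tree approach described in the abstract can be sketched as follows. This is a minimal illustration only, using synthetic data and assumed feature definitions (relative TRAb change at 3 and 6 months); it is not the study's actual model, data, or reported accuracy.

```python
# Illustrative sketch: decision tree on hypothetical TRAb-change features.
# Feature names, thresholds, and data are assumptions for demonstration,
# not the study's cohort or model.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300

# Synthetic features: relative TRAb change at 3 and 6 months post-RAI
delta_3m = rng.normal(loc=0.2, scale=0.5, size=n)
delta_6m = rng.normal(loc=-0.1, scale=0.5, size=n)
X = np.column_stack([delta_3m, delta_6m])

# Synthetic label: remission assumed more likely when early TRAb
# elevation is small (mirroring the abstract's qualitative finding)
y = ((delta_3m < 0.3) & (delta_6m < 0.0)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Shallow tree keeps the decision rules interpretable for clinicians
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

A shallow tree (small `max_depth`) is a common choice in clinical prediction because its split rules can be read directly as cutoffs on laboratory values.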