Abstract
The hemoglobin-to-red blood cell distribution width ratio (HRR) has emerged as a potential predictor of various health outcomes. This study aimed to investigate the association between HRR and all-cause, cancer, and cardiovascular mortality. This cohort study used data from 28,825 participants in the 1999-2018 U.S. National Health and Nutrition Examination Survey (NHANES). Weighted Cox regression was used to assess the associations between HRR and mortality, and restricted cubic spline (RCS) models were used to evaluate potential non-linear associations between HRR and mortality risk. Subgroup and sensitivity analyses were conducted to assess the robustness of the results, and trend tests assessed temporal trends in mean HRR. Lower HRR was significantly associated with increased risks of all-cause, cancer, and cardiovascular mortality. In the fully adjusted model, the highest HRR quintile (Q5) showed lower mortality risks than the lowest quintile (Q1): all-cause mortality (HR 0.47, 95% CI 0.40-0.55), cancer mortality (HR 0.51, 95% CI 0.37-0.71), and cardiovascular mortality (HR 0.43, 95% CI 0.32-0.56). A significant trend was observed across HRR quintiles (P for trend < 0.0001). Non-linear association analyses suggested a linear relationship between HRR and cardiovascular mortality, whereas L-shaped associations were observed for all-cause and cancer mortality. Notably, mean HRR decreased from 1.18 (95% CI 1.16-1.19) in 1999-2000 to 1.07 (95% CI 1.05-1.08) in 2017-2018. In summary, an inverse association between HRR and mortality risk was found, with lower HRR indicating higher mortality risk, and HRR levels among U.S. adults declined significantly over the past two decades (1999-2018). HRR may serve as a valuable and easily obtainable predictor for mortality risk assessment in clinical practice.
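The abstract does not spell out how HRR is computed. A minimal sketch, assuming the definition commonly used in the literature (hemoglobin in g/dL divided by RDW in %; the function and variable names are illustrative, not from this study):

```python
def hrr(hemoglobin_g_dl: float, rdw_percent: float) -> float:
    """Hemoglobin-to-red blood cell distribution width ratio (HRR).

    Assumed definition: hemoglobin (g/dL) / RDW (%), per common usage;
    confirm against the study's methods section before reuse.
    """
    if rdw_percent <= 0:
        raise ValueError("RDW must be positive")
    return hemoglobin_g_dl / rdw_percent

# Illustrative values only: hemoglobin 14.0 g/dL with RDW 13.0%
# gives HRR of about 1.08, in the range of the means reported above.
print(round(hrr(14.0, 13.0), 2))  # → 1.08
```

Under this definition, a lower HRR can reflect lower hemoglobin, higher RDW, or both, which is consistent with the abstract's framing of low HRR as an adverse marker.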