Abstract
While self-rated health (SRH) has long been known to predict mortality in adult populations, respondent age plays a complex role in both explaining and modifying the association. The objective of this study is to test for differences by age in the association of SRH with all-cause mortality. Because much of the prior research has been conducted with older samples, examining a wider age range of adults may reveal that SRH is more predictive for some age groups than for others. Using data from the Panel Study of Income Dynamics, we estimated Cox proportional hazards models to determine whether SRH in 1999 predicted survival to 2021 differently by age. The sample consisted of 5843 respondents aged 25 to 97 who were interviewed in 1999 and followed for survival until 2021. We included demographic and socioeconomic factors, physical and mental health indicators, and health risk behaviors as covariates to assess their potential mediating role in the predictive ability of SRH. The results showed a significant interaction between SRH and age, with larger and more significant hazard ratios for those aged 40-54 and 55-74. There were no significant effects for the youngest group and virtually none for the oldest. For example, among individuals aged 40-54, hazard ratios in the fully adjusted models were significant for poor health (HR 2.49, 95% CI 1.05, 5.89) and fair health (HR 1.95, 95% CI 1.11, 3.42) relative to excellent health. Our findings suggest that age group differences in the predictiveness of SRH may reflect a lack of health knowledge and experience among younger respondents, and a survivor bias in the oldest age group due to the lifetime elimination of those in poor health.
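As a quick arithmetic check on estimates reported in this form, the log-hazard coefficient and its standard error behind a published hazard ratio can be recovered from the 95% CI, assuming a symmetric Wald interval on the log scale (the usual output of a Cox model). A minimal sketch, using the poor-vs.-excellent estimate for ages 40-54 above:

```python
import math

def hr_ci_to_beta_se(hr, lo, hi, z=1.96):
    """Recover the log-hazard coefficient (beta) and its standard
    error from a reported hazard ratio and 95% CI, assuming a
    symmetric Wald interval on the log scale."""
    beta = math.log(hr)
    # CI width on the log scale is 2 * z * SE
    se = (math.log(hi) - math.log(lo)) / (2 * z)
    return beta, se

# Reported estimate for poor vs. excellent SRH, ages 40-54:
# HR 2.49, 95% CI (1.05, 5.89)
beta, se = hr_ci_to_beta_se(2.49, 1.05, 5.89)
# beta ~ 0.91, SE ~ 0.44; Wald z = beta/se ~ 2.07, consistent
# with the CI excluding 1 (p < 0.05)
```

This also provides an internal-consistency check: exp(beta) should reproduce the point estimate, and the log-scale midpoint of the CI should fall near beta.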