The “terminal decline hypothesis” states that a decline in cognitive performance precedes death in most elderly people. A new study from Sweden investigates terminal decline and tries to identify cognitive precursors of death in two representative samples: one cohort born in 1901-07 and one born in 1930.
For both groups, test performance gradually declined as individuals aged (see image below). Also, in both groups, people with better test performance lived longer. The higher death rate among less intelligent people is consistent with past research (and, in other studies, is not limited to the elderly).
What’s interesting are the differences between the two groups. The older group had a higher risk of death at every age, as shown in the graph below. Also, in the older group, lower overall performance was a good predictor of death. But in the younger group, the rate of decline was a better predictor of death than overall performance level.
These results tell us a lot about cognitive aging and death. First, it’s another example of higher IQ being better than lower IQ. Second, it shows that it is possible to alter the relationship between cognitive test performance and death. The younger group had better health care and more education, and this may be why their rate of decline mattered more than their overall IQ in predicting death (though the analyses controlled for education level and sex). Finally, the data from this study can be used to better predict which elderly people are most at risk of dying within the next few years. It’s nice to have both theoretical and practical implications from a study!
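For readers who want to see the shape of this kind of analysis, here is a minimal sketch of a survival model that pits overall level against rate of decline as mortality predictors while adjusting for education and sex. The data, variable names, and effect sizes below are invented for illustration; this is not the study’s actual model or dataset, just the standard Cox proportional-hazards approach (here via the Python lifelines library):

```python
# Sketch only: synthetic data standing in for cognitive level, decline slope,
# education, sex, and survival time. All numbers are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
level = rng.normal(100, 15, n)        # baseline cognitive test score
slope = rng.normal(-0.5, 0.3, n)      # assumed annual change in score
edu = rng.integers(6, 16, n)          # years of education
female = rng.integers(0, 2, n)

# Toy hazard: lower level and steeper (more negative) slope shorten survival.
years = rng.exponential(np.exp(0.01 * (level - 100) + 2.0 * slope + 2.5))
died = years < 15                     # administrative censoring at 15 years

df = pd.DataFrame({
    "years": np.minimum(years, 15),
    "died": died.astype(int),
    "level": level,
    "slope": slope,
    "edu": edu,
    "female": female,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died")
cph.print_summary()  # compare the hazard ratios for "level" vs. "slope"
```

Comparing the hazard ratios for level and slope in the summary is the rough analogue of the level-versus-decline comparison described above.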
This is fascinating—the fact that cognitive decline rate became more predictive than absolute IQ level in the later cohort suggests environmental improvements (healthcare, education, nutrition) might be raising the floor for everyone. So now it’s not just “are you smart?” but “how fast are you declining?” that matters for mortality risk. Makes sense that if you eliminate early deaths from preventable causes, cognitive aging patterns become more visible.
The survival curves are striking: people born in 1901-07 had much steeper mortality starting around age 80 compared to the 1930 cohort. What gets me is that lower IQ still predicts earlier death even when controlling for education and sex. The mechanism is probably complex. Smarter people might make better health decisions, have less risky jobs, access healthcare more effectively, or just have better overall brain health. Either way, cognitive ability clearly matters for longevity beyond just what job you can get.
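(If you want to play with curves like these yourself, the standard tool is a Kaplan-Meier fit. The cohort data below is simulated to loosely mimic the figure, not taken from the paper:)

```python
# Simulated ages at death for two cohorts; the Weibull shapes are made up
# to loosely mimic the figure, not the paper's data.
import numpy as np
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
old = 70 + 18 * rng.weibull(1.8, 300)    # 1901-07-like: steeper late mortality
young = 70 + 24 * rng.weibull(2.2, 300)  # 1930-like: shifted toward older ages

ax = plt.gca()
for ages, label in [(old, "1901-07 cohort"), (young, "1930 cohort")]:
    kmf = KaplanMeierFitter()
    kmf.fit(ages, event_observed=np.ones(len(ages)), label=label)
    kmf.plot_survival_function(ax=ax)
ax.set_xlabel("age (years)")
ax.set_ylabel("proportion surviving")
plt.show()
```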
It’s tempting to say this shows “higher IQ being better,” but since this is an association study, the overall cognitive level and the rate of decline might just be two different expressions of the same underlying biological integrity (or its breakdown), which causes both the cognitive pattern and the mortality. Does the study propose that actively slowing cognitive decline in the younger group would reduce their mortality risk, or is the decline simply a good, early symptom of a biological clock running out? The causal direction remains the biggest question.
@CloverL Your point about the steep mortality curve in the older cohort (1901-07) is key, and it links back to their lower average cognitive performance being the strongest predictor of death. However, we have to remember the study’s other finding: for the 1930 cohort, the rate of cognitive decline, not the overall level, was the better predictor. This implies that once a baseline standard of health and cognitive ability is reached (as in the 1930 cohort), it’s the pathological change (the decline) that signals impending death, not the initial ability. The protective effect of high IQ only goes so far once terminal decline begins.
The study shows that cognitive decline predicts mortality, but does it explain why? Is cognitive decline actually causing death, or are both the decline and the mortality just different symptoms of the same underlying physiological deterioration? If decline is just a biomarker rather than a cause, what exactly is it a biomarker of?
Who actually participated in cognitive testing, and did that differ by cohort? Cognitive testing in old age requires being physically well enough to attend, cognitively aware enough to consent, and not being institutionalized or hospitalized. If the younger cohort’s better healthcare meant more frail or declining individuals were still living independently and could participate in testing, while the older cohort’s frail members had already died or been institutionalized and thus excluded, we’d see exactly this pattern, because we’re capturing different slices of each population.
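A quick toy simulation (every number here is invented) makes the worry concrete: give everyone a latent frailty that drives both terminal decline and death, and let the two samples differ only in whether frail people make it to the testing session. The variable names and participation probabilities are mine, not the study’s:

```python
# Toy selection-bias simulation. Every number is invented; the point is only
# the qualitative pattern, not the magnitudes.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
ability = rng.normal(0, 1, n)    # stable lifelong cognitive ability
frailty = rng.normal(0, 1, n)    # latent physiological deterioration

# Test score reflects ability plus some frailty-related loss.
level = 10 * ability - 4 * frailty + rng.normal(0, 5, n)
# Terminal decline: the slope steepens only once frailty is high.
slope = -0.2 - 1.2 * np.maximum(frailty - 0.5, 0) + rng.normal(0, 0.15, n)
# Death within follow-up is driven by frailty and (weakly) by ability.
dies = (1.5 * frailty - 0.4 * ability + rng.normal(0, 1, n)) > 1.5

for name, p_frail in [("older-style sample", 0.1), ("younger-style sample", 0.9)]:
    # Frail people show up for testing with probability p_frail; others 0.9.
    tested = rng.random(n) < np.where(frailty > 0.5, p_frail, 0.9)
    d = dies[tested].astype(float)
    r_level = np.corrcoef(level[tested], d)[0, 1]
    r_slope = np.corrcoef(slope[tested], d)[0, 1]
    print(f"{name}: r(level, death) = {r_level:+.2f}, "
          f"r(slope, death) = {r_slope:+.2f}")
```

Under these made-up settings, the sample that screens out frail people shows overall level as the stronger correlate of death, while the more inclusive sample shows the slope as stronger. That is, the cohort pattern can appear with zero real difference between the underlying populations.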