There is no definitive test to diagnose a person with Alzheimer's, but research published this week in the journal Neurology is giving scientists a tool for early recognition of the disease.
Healthy subjects with an average age of 73 were studied over a period of 10 years at the University of Kentucky. They were asked each year to simply report any memory changes they felt.
It turns out self-reporting is key to early detection. Subjects who first reported small signs of forgetfulness were about three times more likely to later develop dementia than those who reported no such changes. Among the affected individuals, cognitive impairment began about nine years into the study, and a dementia diagnosis typically followed around year 12.
This development gives scientists a chance to detect memory issues earlier, even if larger issues aren't showing on cognitive tests. The study's lead author, Richard Kryscio, says, "Right now we are catching this in the mid-stage or when people already have Alzheimer's, and we don't have a lot of tools in our arsenal yet to help you."
It's important to remember that there is a fine line between dementia and normal aging. According to geriatrician Dr. Thomas Loepfe, "Age is the biggest risk factor for forgetfulness."
That means forgetting where you placed your phone, or that you had scheduled a last-minute meeting, could be completely normal. Half of the study's subjects reported memory changes, but only about 17% were eventually diagnosed with dementia.
Still, researchers say the new indicator shouldn't be dismissed. Rebecca Amariglio, a neurology professor at Harvard, says, "We’re never going to cure or delay Alzheimer’s if we don’t start to identify what goes on early in the disease. Early intervention is where things are headed."
According to the Alzheimer's Association, Alzheimer's disease, the most common form of dementia, is the sixth-leading cause of death for people in the United States.