Daniel Gibbs practiced as a neurologist for 25 years in Portland, Oregon. After years of giving patients the devastating news that they had Alzheimer's disease, he began to suspect he might have it himself.
He had trouble remembering neighbors' names and kept forgetting his new clinic's address. He quietly asked a colleague to run some cognitive tests, then retired in 2013 because he didn't want any of his lapses to harm his patients.
Two years later, he was diagnosed with early-stage Alzheimer's disease. "It was actually kind of a relief," he said.
Gibbs, 68, enrolled in a study for a drug called aducanumab, developed by Biogen. The pharmaceutical company had just revealed stunning results from an initial test in people with memory problems. The medicine scrubbed the brain of a sticky plaque long thought to be the cause of Alzheimer's disease.
It seemed to slow cognitive decline in some patients, and as news stories hyped its promise, Biogen stock soared.
Gibbs was hopeful. Every month for a year and a half, he flew to San Francisco for an infusion of either the drug or a placebo. "I'm very high about it," Gibbs said in late 2018, while the study was still gathering data. "I think it has a good chance of being successful."
At the time, Gibbs was one of tens of thousands of people who had agreed to take experimental drugs for Alzheimer's, hoping to stave off their slide into full-blown dementia. Except for a few drugs that temporarily curtailed symptoms, no medicine had worked.
Drug studies for Alzheimer's disease were long shots because the causes of neurodegeneration were so murky. Studies had among the highest failure rates of any condition.
Even today — after 40 years and billions of dollars — researchers still can't agree on what causes it. "I don't think anybody thought it would take this long and be this hard," said Eric Siemers, who retired from Eli Lilly in 2017 after 20 years trying to create a drug for Alzheimer's.
Researchers have tried to slow the erosion of memory with everything from estrogen replacement to anti-inflammatory pills and ginkgo biloba. They've tried new drugs to boost neurotransmitters and slash cortisol, a hormone released in response to stress.
Most drugs, though, have targeted the "amyloid plaques" that develop in the brains of many people as they age. Now, evidence is mounting that these plaques are not the cause of Alzheimer's disease, a worrisome possibility after decades of research.
A handful of neurologists and leaders at the newly formed National Institute on Aging (NIA) sent researchers down this narrow path in the 1970s. They argued that old-age mental decline was the same as a rare neurodegenerative disease of middle age — Alzheimer's disease.
They told Congress and the public that with enough money, they would soon find a cure. Genetic clues from these middle-age Alzheimer's patients seemed to point to a single molecule: the protein in plaques called "amyloid beta."
Research became dominated by the theory that amyloid beta causes Alzheimer's. In fact, through the '90s and early 2000s, grant money overwhelmingly flowed to studying it, effectively stifling alternative theories.
Pharmaceutical companies poured billions of dollars into detecting amyloid beta in spinal fluid and brain scans and creating drugs to stop it from building up in brains. But brain scans revealed an inconvenient truth — dementia doesn't track closely with amyloid beta. And the drugs have failed to slow cognitive decline in clinical trials.
"Every major pharmaceutical company put money into the amyloid idea, and they all failed because the idea was flawed," said Zaven Khachaturian, a former director of Alzheimer's research at the NIA.
"It became gradually an infallible belief system. So, everybody felt obligated to pay homage to the idea without questioning. And that's not very healthy for science when scientists ... accept an idea as infallible. That's when you run into problems," he said.
The disappointment is strong because, for years, the promises were so big.
Senility rebranded as Alzheimer's disease
The definition of Alzheimer's disease — as we understand it today — goes back to a fledgling agency, created in the 1970s, called the National Institute on Aging in the National Institutes of Health. Khachaturian, a neurologist, was one of its first employees and was struggling to recruit scientists to study the aging brain.
"The idea of doing aging research was considered a bit of a joke," recalled Khachaturian. "It didn't have the legitimacy of doing research in, say, cancer or heart disease."
This was something Khachaturian's boss, Robert Butler, wanted to change.
Butler had been raised by his grandparents on a chicken farm in New Jersey, which Khachaturian said gave him "a love for older individuals" that shaped his career as a psychiatrist and gerontologist. He coined the term "age-ism." His book, "Why Survive? Being Old in America," won the Pulitzer Prize in 1976 for drawing attention to what he called "the tragedy" of old age.
That same year, Butler was named founding director of the National Institute on Aging. He claimed one of those tragedies was confusion and memory loss in older people. Senility at the time was seen as a normal part of aging for some people, almost a phase of life. Doctors attributed it to "hardening of the arteries" in the brain and accepted it.
Butler, though, was intrigued by research that started to challenge that assumption. Scientists claimed many older people with senility had an obscure disease — Alzheimer's disease.
The rare condition was named for Alois Alzheimer, a German psychiatrist who in 1906 described the peculiar case of a 51-year-old woman with dementia. After she died, Alzheimer peered at slices of her brain under a microscope and saw destroyed neurons, blobs of protein plaque and tangles of tough, thready material. These "plaques and tangles" became the hallmarks of the odd middle-age disease named after him.
For the next 70 years, the disease was diagnosed only in people under age 65.
In the 1970s a few researchers began to question that age limit. When they autopsied older people with senility, they often — but not always — found the same "plaques and tangles" that Dr. Alzheimer described. Based on these autopsies, they argued that much of senility was really Alzheimer's disease.
"That was a mind-blowing conceptual change," said epidemiologist Lon White, who later led a major study of mental decline in older men in Hawaii.
The expanded definition of Alzheimer's disease reframed cognitive problems in old age: Suddenly millions of older people weren't suffering from inevitable aging. Instead, they were suffering from a specific disease, with the expectation that it could be studied and possibly cured.
Butler picked up this argument. He called Alzheimer's "an epidemic" and sold the public on his vision: Medical research would cure Alzheimer's, just as research had led to eradicating infectious diseases.
"When I appeared before Congress, I would argue that Alzheimer's disease is the polio of geriatrics," Butler told an interviewer in 2008, two years before he died. "And just as we no longer hear the thump-thump of the iron lung ... because we no longer have polio, so, too, I think the day will come when we will no longer have Alzheimer's disease."
There were practical marketing reasons for positioning Alzheimer's disease as a priority. It allowed Butler to attract credibility, scientists, and, most importantly, federal research funding.
Reflecting on his strategy, Butler wrote in 1999 that "the public does not see itself as 'suffering' from the basic biology of aging, nor does it generally believe that aging per se can be reversed."
He concluded that the public only mobilizes around a specific disease.
Recalled Khachaturian: "In order to bring the funding to the NIA, the claim — the headline — was Alzheimer's, and we defined it very broadly. It was just a linguistic kind of thing rather than a clear-cut medical diagnostic, sorting out."
Butler also was inspired by the success of citizen lobbying groups for heart disease and cancer. He helped create what became the Alzheimer's Association to use what he called the "health politics of anguish" to play a similar role raising money for Alzheimer's research. The public began clamoring for funding and some scientists began promising a cure.
George Glenner, a pioneering Alzheimer's researcher at the University of California-San Diego, wrote to the Senate Special Committee on Aging in 1988 that, in part due to the discovery of the protein in Alzheimer's plaques, scientists likely could come up with a drug treatment "by the turn of the century."
In testimony typical for its optimism, Leonard Berg, chairman of the medical advisory board of the Alzheimer's Association, told Congress in 1992 that "a treatment to delay Alzheimer's" was "clearly within our reach" and that there was "a reasonable expectation in the next five to 10 years of some major impact."
As Alzheimer's disease became a household word, its boundaries grew fuzzier. Scientists initially were careful to say that not all seniors with memory loss and thinking problems had Alzheimer's disease.
But to the public, Alzheimer's became interchangeable with senility.
In just over a decade — from the mid-1970s to the late 1980s — Butler, Khachaturian and a handful of neurologists took what had been an obscure diagnosis of middle age and presented it to the public as a major killer and a crisis that would overwhelm the country as the baby boomers aged.
Politics motivated this expanded definition of Alzheimer's as much as medical research.
Calling senility "Alzheimer's disease" created a rationale for funding the study of cognitive decline in old age. It also created tunnel vision that focused science on the similarities between middle-age Alzheimer's and old-age dementia, specifically those sticky plaques.
Over time, the broad study of mental decline in old age would be constrained by the narrow definition of a disease defined by Alois Alzheimer.
This meant researchers would spend less time seeking clues to dementia in older people who didn't have plaque. And this initial framing of Alzheimer's downplayed the possible role of heart disease and inflammation. In general, it underestimated the maddening complexity of dementia in old age.
"Dr. Alzheimer looked in his microscope and he saw amyloid and so that's been the definition because that's what he saw!" said Adam Brickman, an associate professor of neuropsychology at Columbia University.
"What blew my mind ... is that the field didn't say, 'Oh, maybe we were wrong. Maybe (the doctor) was wrong. Maybe it's not these plaques and tangles or maybe that's not the whole story.' That hasn't been questioned enough and that just blows my mind."
Gene defects point to a molecule
By 1990, brain aging research was no longer a backwater. The National Institute on Aging funded 15 Alzheimer's research centers at major universities. Scientists developed theories for what destroys the neurons and synapses in Alzheimer's disease: missing neurotransmitters, inflammation, aluminum, glucose deficiency, a slow-moving virus.
The most visible abnormalities — plaques and tangles — became prime suspects.
One camp argued for tangles. Another for plaques. But in the brains of older people ravaged by Alzheimer's, it was impossible to tell precisely what might be directly causing damage and what was merely a byproduct. One researcher compared the task to showing up at a football stadium after the game was over, and then trying to piece together what had happened from the trash on the field and in the bleachers.
The expanding field of genetics seemed to promise a map out of the chaos.
Scientists began looking at families around the world that inherit a rare form of Alzheimer's disease that strikes in middle age. They hoped that finding the gene defect that caused early Alzheimer's would pinpoint the origin of neurodegeneration. Armed with that knowledge, they thought they might be able to create a drug to help millions of people evade Alzheimer's in old age.
Marty Reiswig's extended family was at the center of the Alzheimer's gene hunt in the 1980s. Ralph, his grandfather, was from a big farm family in Oklahoma. He developed Alzheimer's symptoms at around age 50, along with nine of his siblings. They all died young.
When Reiswig was a child, medical staff showed up at a family reunion to draw blood from aunts and uncles. He didn't think much about what it meant until years later. When he was in college, he attended another family reunion and saw relatives in his father's generation starting to show symptoms. They gathered at a pizza parlor and he remembers an uncle struggling to pull his chair out from the table, and nearly falling as he tried to sit down.
"I sort of thought that was odd," said Reiswig, 40. "But as I looked around the table, I just saw fear and anger and sadness. And that's when it really dawned on me. 'Oh my gosh, this Alzheimer's thing that they say runs in our family is really real.'"
By then, researchers had finally found the genetic mutations that cause early-onset Alzheimer's in these unusual families. It was a huge breakthrough. The paper about the first mutation was one of the most cited in 1991. But knowing where in the DNA something goes wrong wasn't the same as being able to fix it.
Reiswig's father developed dementia around age 50. The family lived in Colorado and Reiswig took his father skiing throughout the early stages of his decline. "One time, we were on the chairlift — the first lift of the day — and I said, 'Dad, what's it like to be you right now with Alzheimer's?'" recalled Reiswig. "He didn't think very long, and he just said, 'It's prison.'"
His father died in early 2019. For now, Reiswig has decided not to find out if he carries the gene mutation. There's a 50 percent chance he does, and if he does, there's a 50 percent chance for each of his children, ages 11 and 13.
These families' heartbreak, though, provided a vital clue for science.
The challenge for researchers was just how to make sense of it. Different families had different mutations. All the mutations appeared in one of three genes affecting three brain proteins: a big protein and two enzyme proteins that, like scissors, snip the big protein into smaller chunks.
And one of the smaller chunks was a protein fragment called amyloid beta. It turned out that amyloid beta is the very same protein that piles up into the plaques that Dr. Alzheimer saw back in 1906.
The defects strengthened the theory that plaques somehow cause Alzheimer's — what became known as the amyloid hypothesis. This theory came to dominate the direction of drug development from the '90s onward. Suddenly pharmaceutical companies had a target they could attack with a drug.
"The mutations shifted focus onto amyloid plaque," said David Holtzman, a researcher at Washington University in St. Louis, who was involved in creating one of the first drugs to attack amyloid beta. "If you have a genetic cause, that tells you amyloid is central in causing the disease."
Researchers like Lon Schneider at University of Southern California said the initial hope was that by stopping amyloid beta "we could very possibly cure or stop the progression of the illness right in its tracks."
And the discovery was good for securing more research funding.
Khachaturian was elated. "I could go tell Congress saying, 'Look at all the wonderful things we're doing," he recalled. "We discovered the gene. We discovered the molecule and if you remove the molecule, we will solve the disease."
It didn't turn out to be that easy.
Chasing amyloid beta ...
Whoever succeeded in making a drug for Alzheimer's stood to make a fortune.
Pharmaceutical companies were willing to gamble on this unproven idea and raced ahead, betting that removing the "toxic" amyloid beta protein from the brain would slow symptoms of memory loss.
"It was an exciting time," said Siemers of Eli Lilly. The company spent billions on the approach. Others aimed at it, too.
Over two-thirds of Alzheimer's drug studies from 2002 to 2012 tested amyloid-bashing drugs. Between 2015 and the end of 2018, more than half of the two dozen drugs tested annually in major studies were focused on amyloid beta.
It took years just to develop drugs to test in clinical trials. Companies tried different approaches and hit dead ends. It was difficult to get drugs small enough to penetrate the brain. In 2008, Eli Lilly became the first big pharmaceutical company to test a pill that attacks one of the enzymes that creates amyloid beta. The theory was simple: disable the enzyme that snips amyloid beta out of the big protein and levels of amyloid beta would drop. But the study was stopped early because volunteers taking the pill were twice as likely to get skin cancer and declined faster on cognitive tests compared to people taking a placebo.
"One of the things about this field is that it makes you humble in a hurry," said Siemers. "It didn't work out the way a lot of us thought it might."
Companies including Eli Lilly, Merck, and Johnson & Johnson developed pills, called BACE inhibitors, to block a second enzyme. Two decades after work started on them, the last remaining ones have failed in clinical trials.
In July 2019, Novartis and Amgen abruptly halted a BACE inhibitor study when the drug resulted in faster decline on cognitive tests and more brain atrophy and weight loss. In September 2019, Eisai and Biogen halted their drug study on the recommendation of a safety committee.
At the same time, pharmaceutical companies tried to wipe out amyloid beta a different way — using amyloid beta antibodies. These were designed to go directly after the amyloid beta molecule and flag it, so the brain's own immune system broke it down and carried it off, which is the way some cancer drugs work.
Initially, Siemers said, Eli Lilly got encouraging data on its amyloid beta antibody, called solanezumab.
... to abrupt endings
Meanwhile, by the mid-2000s, new brain scanning technology made it possible to peer into the brains of living people. As more people were scanned, the images revealed something autopsies had shown earlier but researchers had ignored.
Amyloid plaque doesn't correlate with dementia.
Roughly a third of cognitively normal older people have plaque in their brains. Plaque raises the risk of developing dementia later, but most people with plaque never develop dementia. To some researchers this increased doubt that amyloid beta is the cause of Alzheimer's.
The scans also revealed that a quarter to a third of people with dementia don't have plaque. That meant whatever is causing their dementia is unrelated to amyloid.
Eli Lilly's first big study of solanezumab had failed to slow mental decline. But Siemers saw a faint indication that the drug might have helped people with mild symptoms. He wanted to press ahead with another big amyloid study.
This time, in 2013, Eli Lilly paid for expensive brain scans to make sure all the volunteers had amyloid beta in their brains along with mild symptoms, a characteristic of the only group that seemed to benefit in a previous study. Siemers hoped that with a more carefully screened group solanezumab would work.
"These studies are ridiculously expensive, but I can tell you from my simple-minded scientist standpoint it wasn't really a hard decision," said Siemers. "It was like you have to do another experiment to prove that what you think is there is really there."
Siemers waited three more years and got his answer in 2016. The drug hadn't made a difference. "There were lots of tears," said Siemers, who still finds it difficult to talk about the failure years later.
After Eli Lilly's solanezumab crashed, hope shifted to amyloid beta antibodies at other companies, particularly Biogen's antibody aducanumab. In 2018, Dennis Selkoe, an Alzheimer's researcher at Harvard University who helped develop the amyloid hypothesis, called it "the best shot on goal."
Skeptics warned that his optimism — and the world's — was misplaced.
David Grainger, a venture capital investor in life sciences who has been critical of the amyloid approach, wrote in Forbes that the hype about aducanumab was "entirely excessive." Furthermore, he wrote that "there is a very real risk that some of the coverage unreasonably raises hopes of helping current patients."
Gibbs, the retired neurologist, had finished his initial 18 months in the study by then and chose to receive the drug in an extension study. He kept up his monthly flights to San Francisco until a common side effect — brain swelling — forced him to stop. He recovered, and like many researchers he took the swelling as a possible sign that the drug was removing plaque.
Then in March 2019 Biogen said it was stopping the trial early after a data-monitoring committee said it wasn't doing any good. The drug removed amyloid plaque but didn't slow the progression of dementia. Just three months earlier, Roche had pulled the plug on a big study of its amyloid antibody.
Gibbs learned from online news reports that aducanumab had failed.
"I was disappointed," he said. "I was hopeful that it really would be the first effective drug. I really thought that it had a good chance of working."
Gibbs would like to enter another drug study, perhaps one focused on people like himself who carry the APOE4 genetic variation that raises the risk for Alzheimer's as people age.
In the meantime, he's interviewing people from his life and writing his memoir. He's also exercising — up to 15,000 steps a day — following studies that suggest aerobic exercise may slow brain decline.
"The worst thing to do with Alzheimer's disease is to become a recluse and just not do anything," Gibbs said.
What was missed?
Where did things go wrong?
All drug studies are simply experiments that test a hypothesis. As trial failures mounted, the theory that amyloid causes Alzheimer's took a beating. The field is reevaluating the effect of its long grip on academic and drug research.
Khachaturian has mixed feelings.
The amyloid hypothesis was "a tool" for recruiting molecular biologists and led to important discoveries about brain proteins, he reasons. But because most scientists embraced it, amyloid boosters became the majority on committees reviewing research grant proposals at the National Institutes of Health and papers submitted for publication in top medical journals. This led to groupthink.
"The downside comes in when it becomes the only idea that everybody feels obligated to follow," Khachaturian said.
He compared it to a religion that rewarded adherents.
"The high priests get to be the leaders in academia," he said. "They get to be the advisers to the pharmaceutical companies and before you know it, you have a cabal of people who believe the same thing."
Critics say the focus on one molecule starved alternative approaches of research money and study volunteers, killing ideas that some say may have led to a treatment.
"We're using resources for unproven drugs in the same class when we could be spreading this out among different targets doing many smaller studies and perhaps finding success," said Schneider of USC. "We're making 'me too' drugs when none of them are proven."
Nikolaos Robakis, a neuroscience professor and Alzheimer's researcher at the Icahn School of Medicine at Mount Sinai in New York City, said evidence that amyloid beta may not cause Alzheimer's "was brushed away, was not really given the attention it deserved."
For example, he said it was known by the mid-2000s that some genetic mutations causing early-onset Alzheimer's do not increase amyloid beta.
Even so, he said younger colleagues felt pressured to incorporate amyloid beta research into their projects. "They were complaining that if the grant did not contain amyloid, they had less chance to be funded," he said.
Robakis added that it's impossible to know why particular grants are turned down, but he's convinced that "a number of good grants" were not funded because money instead flowed to projects that supported the amyloid theory.
Former Alzheimer's researcher Rachael Neve said the amyloid focus "suffocated" other research.
"I don't think it's an exaggeration to say that the refusal of the amyloidophiles to even consider other hypotheses has set back the field by 32 years," Neve wrote in an email response to a reporter's question. In 1987, Neve's lab at Harvard Medical School, along with three others, independently identified the big brain protein that enzymes snip into a small amyloid beta chunk.
She thinks the mutations were misinterpreted as a smoking gun pointing at amyloid beta when instead they likely cause damage by messing up important normal signals. Research into the loss of these normal functions wasn't a priority, she said.
"It's one of the great tragedies of biomedical research, that research into anything but amyloid was virtually impossible to get funded," wrote Neve, who said she abandoned Alzheimer's research in disgust in 2008.
Other researchers, however, stop short of saying the field shouldn't have pursued amyloid beta.
David Jones, an Alzheimer's researcher at Mayo Clinic in Rochester, Minnesota, said the theory was a necessary step in the evolution of Alzheimer's science. Now it's time to move on — to stop seeing amyloid beta as the trigger for brain failure and instead see it as the symptom. He likens plaque to trash that piles up at overly busy hubs in neuron networks. It's a sign of the brain working overtime to compensate for injury or overuse elsewhere that could be caused by many things.
Jones said the field fixated on amyloid beta partly because medical research — especially drug development — still treats complex diseases as if they were infectious diseases, where "you have one cause and if you cure that one cause you cure the disease."
Alzheimer's was improperly reduced to a single molecule: "You see a protein. The protein must have caused the system to fail. You need to get rid of the protein. It's seductive and easy to understand, and it's wrong."
The focus on Alzheimer's plaque also distracted from studying other forms of dementia and whether dementia could be reduced by improving overall health.
"We focused the spotlight way too narrowly," said Adam Brickman, a Columbia University researcher who studies how blood vessel health affects brain tissue and cognition. "We need to ask, what else is going on? What other avenues can we pursue?"
For example, he said vascular disease raises dementia risk, but research hasn't been a priority. "If we had a real public health campaign around blood pressure, the incidence of what we always called Alzheimer's would go down," he added.
In fact, several studies found dementia rates have fallen in Europe and the United States in the recent past. Researchers think it may be the result of broader education and better heart disease management.
"These are huge effects," said USC's Schneider. "If a drug could achieve this, we'd be saying we have a near cure."
Kristine Yaffe, a neurologist and epidemiologist at University of California-San Francisco, has published studies suggesting a third of dementia cases worldwide might be due to low education and behaviors, including sedentary habits — behaviors dismissed until recently. "And now because of the failure of drug development and because ... some of the biology behind some of these risk factors is becoming clearer, people are getting more excited about them," she said.
The NIH and the Alzheimer's Association recently started funding large clinical trials of interventions like blood pressure control, diet improvement and aerobic exercise in older adults.
"If there was a way for some company to make money on exercise as a therapeutic, it would have been tested a long time ago," said Laura Baker, a neuropsychologist at Wake Forest School of Medicine in Winston-Salem, North Carolina, who heads a federally funded exercise study.
Randall Bateman, an Alzheimer's researcher at Washington University in St. Louis, said it's unfair to say the field didn't test other targets. "The field has done that continuously since the beginning," he said.
"In hindsight you might say, 'Well that didn't work so we should've tried something else,'" he said. "But the problem is that at the time the vast majority of the evidence did — and continues to — support amyloid as a central causative mechanism."
Bateman said the failure of drugs in people causes great concern but it's not time to give up. He and amyloid champions like Siemers say abandoning amyloid drug development now would be a tragedy for patients.
"Apollo 1 didn't land on the moon," Siemers said. "We don't know which Apollo it's going to be. You have to essentially learn from your mistakes. Every time you go just a little bit closer."
Scientists have ideas about why amyloid drug studies failed. Perhaps the drugs were designed to go after the wrong form of the molecule. Amyloid beta is a shapeshifter, appearing in different lengths and folds, clumping in different ways. "These drugs are highly specific," said Bateman.
Still, even if the moonshot lands, it most likely would only help people in the earliest stages of the disease — so early that people wouldn't even know they had it. Scientists have learned that changes in the brain that lead to Alzheimer's likely start years before dementia appears. By the time someone has memory loss, brain cells are dead, and scientists suspect nothing will bring those neurons back.
Given brain decline's insidious start, some scientists think amyloid drugs failed because they were started too late. Perhaps amyloid beta is like a match that starts a forest fire. Once the fire is roaring, it doesn't do any good to blow out the match. But if you blow out the match early enough, you might prevent the fire.
As a result, the same drugs are being retested for an earlier strike. The National Institute on Aging is running studies with pharmaceutical companies to test the amyloid-busting drugs in thousands of older people who have amyloid but who don't have dementia symptoms.
Washington University in St. Louis is also studying people who may carry mutations. DNA from families like the Reiswigs convinced researchers to focus on amyloid beta. Now these same families are a final testing ground of the amyloid hypothesis.
Five years ago, when he was 35, Reiswig enrolled in the study. Every month, a visiting nurse gives him a drug infusion in his family room. On a spring Sunday, his daughter melted chocolate on the stove to dip strawberries while the nurse monitored the drip of the IV. Either the drug solanezumab or a placebo was slowly fed into his arm for an hour. "I think regardless of the outcome, I will have known that I did everything I could," he said.
Meanwhile, studies continue to test amyloid drugs in mutation carriers as young as 18. "This really is the ultimate test, I think, of the amyloid hypothesis," said Bateman. "If you prevent people from getting plaques in the first place ... can we prevent these people ultimately from getting the disease?"
The failures of the amyloid drugs have loosened the grip of the amyloid hypothesis on the field, but there hasn't been an alternative theory to replace it.
Between 2015 and 2019, annual federal government spending on Alzheimer's research quadrupled to $2.3 billion.
Ideas like inflammation, infection, and insulin are competing for money and acceptance. And drugs to reduce tau, the protein in Alzheimer's tangles, have moved through the pipeline and are now being tested in people with mild symptoms.
It's far from clear, however, if any of these other approaches will work. It's even less clear that a single drug will address the broad symptoms the public associates with Alzheimer's disease.
"I believe that is highly unlikely," said Peter Whitehouse, a neurology professor at Case Western Reserve University School of Medicine in Cleveland. "This is not a single condition caused by a single problem with one molecule. It's many conditions and it's related to aging.
"What I'm really concerned about is that we'll end up with a very expensive drug given to a large number of people that does next to nothing," he said.
The biology of the brain is more complicated than researchers thought. Rallying cries to end Alzheimer's by 2020 or 2025 have fizzled. What unifies the field is the agreement that whatever is done needs to start decades before memories start slipping.
Four decades after Robert Butler made Alzheimer's an urgent national research priority, he got much of what he wanted — public engagement, billions in funding, prestigious scientists.
But the world is still waiting for the drug cure he was so confident would come.
Betsy Towner Levine