Chapter 8

Chance Untamed and the Return of Fate

Life is an adventure in a world where nothing is static; where unpredictable and ill-understood events constitute dangers that must be overcome, often blindly and at great cost; where man himself, like the sorcerer’s apprentice, has set in motion forces that are potentially destructive and may someday escape his control … complete and lasting freedom from disease is but a dream.

—René Dubos, Mirage of Health1

A two-day meeting titled “Alzheimer’s Disease Research Summit 2012: Path to Treatment and Prevention” was held at the National Institutes of Health in Washington in May of that year. The stated goal of the summit was to find effective prevention and treatment approaches for Alzheimer’s disease by 2025, and leading Alzheimer’s researchers from around the world were present to give presentations and discuss how this could effectively be carried out.2 On the first day of the gathering Kathleen Sebelius, the secretary of Health and Human Services, emphasized that 16 million people (in the United States) will be affected by Alzheimer’s by 2050. And at the major AD conference of the year, held in Vancouver in July, the audience was told that by 2050 there will be 36 million people in the world with AD. Such oft-repeated mantras and others like them, ultimately targeted at government, create an urgency in connection with all discussion about Alzheimer’s these days, but they are repeated so often that many listeners have become deaf to them, although it seems that, at last, governments are increasingly paying attention.3 Sebelius stated that the government had launched a website as a resource to enable people to find out more about the disease and caregiving, but the hope is also to attract people who will be willing to sign up as research subjects. Similarly, the Alzheimer’s Association has a website, one stated purpose of which is to attract individuals as research subjects.

Media reporting on the summit ran under the heading of “Obama Administration’s War on Alzheimer’s,”4 but it was also remarked that funding for Alzheimer’s research in the United States continues to be markedly less than for other major diseases: “Last year, the NIH spent $3 billion on research into AIDS, $4.3 billion on heart disease, and $5.8 billion for cancer,” according to the Alzheimer’s Association, and they note that what goes to AD is only a fraction of these sums. President Obama signed the National Alzheimer’s Project Act into law in January 2011. In February 2012, the administration said it would push for a $156 million increase in funding for Alzheimer’s research over the next two years. At the time of the summit meeting, it was made clear that the proposed 2013 budget would allow for a further $100 million to support AD research and caregiving, but this funding still falls far short of that allotted to other diseases.

Writing for Scientific American in early 2012, Gary Stix comments that “[g]overnment declarations of war on drugs or disease often end in losing battles.”5 Stix cites Samuel Gandy, an Alzheimer researcher at Mount Sinai School of Medicine, who told Reuters, “[I]n my mind … [a war] provides the unfortunate sense that we will have ‘failed’ if we don’t have a cure by 2025.” However, in July 2012, the New York Times reported on a finding published in Nature that created a positive stir in the research world, one that suggested that AD research might indeed be on the right track with its dogged persistence in tackling the deposition of amyloid.6 As part of whole-genome sequencing of nearly 1,800 Icelanders, a team at deCode Genetics headed by its executive director, Kari Stefansson, found that about 1 in 100 Icelanders 85 years of age and older carry the coding mutation A673T in the APP gene. This mutation apparently protects these individuals against Alzheimer’s disease and cognitive decline and, furthermore, it appears to be effective even when people are homozygous for APOEε4. Those who carry this mutation proved to have an approximately 40% reduction in the in vitro production of amyloid-producing peptides.

The article in Nature claims, “The strong protective effect of the A673T substitution against Alzheimer’s disease provides proof of principle for the hypothesis that reducing the β-cleavage of APP may protect against the disease. Furthermore, as the A673T allele also protects against cognitive decline in the elderly without Alzheimer’s disease, the two may be mediated through the same or similar mechanisms.”7 The researchers conclude that their finding provides indirect support for the theory that the processes of “normal cognitive aging” may “be shared, at least in part” with that of “the pathogenesis of Alzheimer’s disease.” And they propose that Alzheimer’s disease may represent “the extreme of age-related decline in cognitive function”8—thus strongly supporting an argument that acknowledges an entanglement of aging and dementia.

This same article in Nature states that over 30 coding mutations of the APP gene have been found, 25 of them pathogenic, almost all of which are associated with early-onset AD. It has also been shown that the protective mutation appears to be very rare indeed among North Americans, suggesting that it arose only relatively recently in Iceland. Gina Kolata notes in the Times, “[T]his find bolsters hopes of drug companies that have zealously developed drugs to reduce amyloid levels with the expectation that they might alter the course of the disease or even prevent it.”9 Samuel Gandy is quoted as claiming that this find is “extraordinarily important”—the most significant in the field since the first mutations associated with dominantly inherited Alzheimer’s were discovered 22 years ago. And John Hardy, who formulated the amyloid cascade hypothesis, stated that this research “is obviously right,” but added, “as provocative as the discovery of the protective gene mutation is, the strategy of reducing amyloid levels—the ultimate test of the amyloid hypothesis—still must be evaluated in typical [late-onset] Alzheimer’s disease. For example, perhaps people need to have lower levels of beta amyloid from birth to really be protected from Alzheimer’s disease.” This statement suggests that Hardy may be thinking in terms of a developmental approach to AD in which modulating factors present from birth may play a part (a point to which we will return).

At the Vancouver meeting, shortly after the article about the protective mutation had appeared, Hardy stood up at a packed scientific session on genetics to insist that the Iceland finding is not new, and had been made years ago (but presumably was not well publicized). He added that he was a reviewer for the Nature article, and not all his critiques of the draft paper had been dealt with satisfactorily. Hardy increasingly voices skepticism about the utility of the amyloid hypothesis, at least as he originally formulated it, and virtually everyone would agree today that amyloid functioning continues to be relatively poorly understood, and that AD should be recognized as heterogeneous—insights that have significance for drug development.

In a 2010 article published in Alzheimer’s & Dementia, Iqbal and Grundke-Iqbal argued categorically that AD is multifactorial, and that at least five subtypes can be recognized. They pointed out that these subtypes may well respond differently to disease-modifying drugs and, for the purposes of clinical trials, should be separated out from one another. Another article, which appeared in Nature in 2011, insisted, “[R]esearchers don’t know enough about the biology of Alzheimer disease to identify the right targets [for treatment]. The disease is the result of a long chain of events, but some of the links in that chain are still a mystery—nobody is certain which link to cut to stop disease progression.”10 Even so, the Icelandic finding of a protective mutation was referred to repeatedly at the Vancouver meeting, as were trial findings about a new drug designed to clear β-amyloid from the brain. This exploratory trial had involved intravenous injections of immunoglobulin in 24 people, resulting in stabilization of AD symptom development in the early stages of the disease over the course of three years. In contrast to earlier research involving immunotherapy use, this study did not result in devastating side effects, and hence further research will be carried out with larger samples. This finding, very small though the sample was, created the most media interest among all the data presented at the Vancouver meeting. A cure is what experts and the public want to hear about.

The six-day meeting with over 4,000 participants was a lively affair; at last a breakthrough might be in sight in connection with Alzheimer disease, although most paper presenters were, at best, cautiously optimistic, if that. One plenary session and several other panels focused on ongoing GWAS research. Gerard Schellenberg presented details of the recently formed International Genomics of Alzheimer’s Project (IGAP) that he now heads up—a consortium designed to replicate earlier GWAS findings. This consortium will also continue the hunt for other susceptibility genes and attempt to document on what pathways these genes are located. IGAP is located in 12 countries, and 20,000 SNPs are being tested. Other GWAS-related papers given at the meeting confirmed that both relatively common and rare variants appear to be implicated in AD risk, none of which has anything like the explanatory value of APOEε4. In one paper, emphasis was given to the way in which such variants may be protective, and hence influence age of onset of AD, and the Icelandic findings cited above are likely to inspire many similar papers. But these data were presented divorced from any contextualizing information that might suggest the social and environmental circumstances under which such genes are protective. If challenged, the involved researchers would no doubt counter that they are not yet at that stage.

A very large number of sessions at the Vancouver meeting were devoted to biomarkers, often described as endophenotypes—a reflection of the move to the molecular prevention of AD. One paper provided evidence that plasma biomarkers could be of use as a standardized diagnostic tool, one that would improve on simple clinical assessments of prodromal AD. A whole session was devoted to longitudinal tracking of biomarkers and their age-associated appearance. Claims were made for strong similarities in biomarker changes in dominantly inherited AD and late-onset AD, even though they take place following a different time trajectory. Such findings furnish support for the entrenched belief that early- and late-onset AD are essentially the same condition, even though causation is different, but, as we will see shortly, a few months later second thoughts about this assumption moved to center stage.

A densely packed session was devoted to CAP—the Consortium for Alzheimer’s Prevention—composed of three groups of researchers in the United States who are loosely working together to try to make their respective clinical trials sufficiently compatible that generalizations can be made from their findings. One trial is known as “anti-amyloid treatment in asymptomatic Alzheimer’s disease,” or A4 for short. This is a primary prevention trial involving Reisa Sperling, at Harvard University, who has been very active in initiating the move to AD prevention. The objective of this project is to establish how best to treat older individuals at risk for developing Alzheimer’s disease. Risk estimation is based on biomarker evidence of amyloid deposition by means of imaging. The hypothesis being tested is that by decreasing amyloid burden during the preclinical stages of AD, “downstream neurodegeneration” and cognitive decline will be delayed or avoided altogether.

A second trial is associated with DIAN (Dominantly Inherited Alzheimer Network, discussed briefly in chapter 5) and is described as the largest and most extensive worldwide network investigating mutational Alzheimer’s. This network is headed up by John Morris and Randall Bateman at Washington University and is now preparing to launch a prevention trial funded by the Alzheimer’s Association and a consortium of 10 pharmaceutical companies (the DIAN Pharma Consortium). The conclusions drawn thus far by this group, stated in the Vancouver meeting abstract, are as follows:

Because the clinical and pathological phenotypes of dominantly inherited AD appear similar to those for the far more common late-onset “sporadic” AD, the nature and sequence of brain changes in early-onset AD are also likely relevant for late-onset AD. Clinical studies in AD caused by gene mutations are likely to pioneer the way to prevention trials for all forms of AD. The scientific knowledge gained from secondary prevention trials is likely to inform about the cause of AD, validate biomarkers to accelerate treatment development, and determine the effects of treating AD early.11

The third trial discussed in this panel was the one that will make use of subjects enrolled from the Basque families living in Colombia (see chapter 5). The conclusion to the abstract of this paper stated boldly, “We are excited about our progress, plans, current timelines and the chance to work with other researchers, programs, and stakeholders. Together, we have a chance to set the stage of a new era in AD prevention research, develop the resources, biomarker endpoints, and accelerated regulatory approval pathway.”12 In his presentation Eric Reiman concluded by stating that this trial is “giving people at the highest risk an opportunity to fight for themselves and the field.” A palpable excitement pervaded the presentations at this panel, but, equally, so did a sense of urgency about the need to prevent AD, and the singular role that will be played by families plagued by this condition who, the audience was told, were finally getting the attention they had longed for. At the conference Reiman reported on a second study in addition to the one in Medellín involving a national registry in the United States of between 20,000 and 50,000 people who have been identified as carrying the APOEε4 gene associated with increased risk for late-onset AD. The aim of this trial, using 400 of these individuals, will be to test yet another new anti-amyloid drug.

One of the plenary sessions at the Vancouver meeting was devoted to amyloid, aging, and neural activity, and emphasis was given to the way in which tracking of cognitively normal people for amyloid deposition could well provide insights into the AD disease mechanism. The abstract for this presentation stressed, “[I]t is unclear why β-amyloid is deposited in some individuals and not others, why some people develop cognitive symptoms while others do not, and how Aβ affects brain structure and function.”13 And a recent article in Alzheimer’s & Dementia reminds readers that although Aβ neurotoxicity has been clearly demonstrated in vitro, the role of Aβ in neurodegeneration has not been “unequivocally” demonstrated in vivo.14 In short, stubborn deficits persist in expert understanding of amyloid.

A featured research session had taken place prior to the plenary session, in which Robert Green, the PI of the “evidence-based” REVEAL trial, presented a paper dealing with preliminary findings in connection with the disclosure of APOE status to individuals diagnosed with MCI. In addition, these subjects were given an estimated risk of “conversion” to AD in the following three years.15 Another paper in this session focused on plans for “ethical” disclosure of their “amyloid status” to clinically normal research subjects. These individuals will be subjects in biomarker studies as part of the ε4 trial noted above. Amyloid-positive individuals will be in both treatment and placebo arms, and amyloid-negative individuals will be placed in the “natural history” arm of the trial. During the presentation it was noted that uncertainties are associated with the “likelihood and timing of clinical progression to AD” even when amyloid can be detected in the brain, and it was stressed that safeguards about disclosure will be embedded in the trial—these safeguards will be based on findings obtained from the REVEAL study. At the same time, the impact on individuals of learning about their amyloid status will be systematically evaluated as part of the trial.16 Another paper stressed that methods developed in the field of genetic testing “provide a template for how to disclose amyloid imaging results to asymptomatic older adults.”17

Over 1,400 posters were on display at the meeting. The majority dealt with biomarkers; others dealt with GWAS, PIB scanning, hippocampal structure and volume, and cognitive testing. Several dealt with sleep patterns—a current hot topic; others stressed the importance of physical activity in staving off AD, one emphasized the positive effects of ballroom dancing as protective against AD, and another claimed that the value of meditation had been underestimated. Yet other posters dealt with olfactory changes as diagnostic of AD, and several emphasized the importance of recognizing changes in gait as a signifier of incipient or early stages of AD—changes that can easily be detected in the office of a GP.

At the meeting I found many skeptics about the move to molecular prevention of AD, and a few outraged individuals among the expert audience with whom I mingled. However, judging by the questions at the end of the sessions I attended, many researchers were captivated by the enormous amount of time and energy being invested in order to move the world of Alzheimer’s research forward toward its current goal of prevention and/or treatment of AD by 2025—at least in the United States. It is undeniable that achievement of this goal in the manner in which it is at present conceptualized will be dependent upon better elucidation of the activities of amyloid—the capricious actor that thus far has refused to be tamed. One participant at the conference, who has attended this event many times in both Europe and North America, told me that many of the individuals with whom she had talked in Vancouver were commenting that the messages embedded in the talks were much less dogmatic than in previous years, a sure sign that presenters are not assuming that major breakthroughs are imminent, even though an aura of excitement was present in some quarters.

A Worldwide “Call to Arms”

Turn him [the King] to any cause of policy,

The Gordian Knot of it he will unloose,

Familiar as his garter.

—Shakespeare, Henry V, Act 1, Scene 1

In the issue of Alzheimer’s & Dementia that appeared in the same month as the July 2012 Vancouver conference, an exceptionally long article about the development of biomarkers concluded by asserting that what is needed now in connection with AD prevention is a “swift and serious global ‘call to arms’ AD initiative” that will unite and integrate global interdisciplinary translational research.18 This article, written by two researchers in the department of Psychiatry at the University of Frankfurt together with Zaven Khachaturian, now heading up the Campaign to Prevent Alzheimer’s Disease by 2020, is titled “Development of Biomarkers to Chart All Alzheimer’s Disease Stages: The Royal Road to Cutting the Therapeutic Gordian Knot.”

This title suggests that these authors continue to think in terms of clearly definable stages in the accumulation of neurological changes associated with AD—an assumption that present research strongly indicates is inappropriate, as do certain of the findings discussed and cited in the authors’ own presentation. However, the goal in setting out this call to arms is to circumvent the perceived limitations of research to date—research that has tossed up so many anomalies. Standardization is the order of the day. As readers well know, the legend of the Gordian Knot recounts how Alexander the Great used his sword to cut through a massive knot that no one could remember how to untie. This idiom is usually used today to convey the meaning of cutting through extraneous, obfuscating complexities to get to the point of some argument, debate, or activity. Hampel et al. argue in their article, “[T]o date, systems-based, integrated, comprehensive, validated, and qualified Alzheimer’s disease (AD) biomarkers seem to be the much desired golden touch to ‘cutting the Gordian therapeutic Alzheimer knot.’ ”19 The metaphors are mixed, but one gets the idea—well-honed biomarkers will, sword-like, solve the Alzheimer conundrum.

In their article, after first setting out the obligatory reminder about the “looming medicoeconomic” crisis, noting that certain governments are finally responding to this challenge, and claiming that “a modest delay of 5 years in the onset of disability will reduce the cost and prevalence of the disease by half,” Hampel et al. go on to stress the impatience shared by researchers, advocacy groups, and caregivers alike to resolve the AD problem. The challenge now, they insist, is to develop biomarkers as clinical tools that will advance the AD world into the “unknown territory” of reliably identifying asymptomatic people at risk for AD, and they call for a shift to a “systems biology” approach in order to accomplish this. As far as these authors are concerned, to make prevention a realistic goal, the time to develop appropriate therapies must be shortened to three to five years or thereabouts—a goal that will entail the validation of biomarker changes on a massive scale among “large, diverse populations.” This cannot be accomplished without the building of a massive infrastructure worldwide that will create a comprehensive longitudinal database including cohorts of “physically and/or mentally normal and/or ‘successful’ healthy aging people,” as well as individuals at elevated risk for AD, and also asymptomatic preclinical/prodromal MCI individuals. The key to this endeavor is the use of high-throughput technologies, designed to produce an omnibus database that will enable the global standardization of biomarkers. The limitation of biomarker research to date, it is noted, is that individuals used as research subjects thus far have been too “heterogeneous,” and larger cohorts must be used.20 Hence, the recruitment for research purposes of vast cohorts of healthy volunteers worldwide is now under way.

The scale of what is proposed is staggering, and, obviously, the project cannot be global because millions of people lack access to any form of formal medical assistance in so many parts of the world. Others have to walk for hours to receive attention for any kind of problem, however urgent. And in very many countries, even when some health care is in theory available, the resources and expertise needed to track volunteer research subjects are simply not present. Moreover, the expectation that younger people will volunteer as research subjects in droves for repeated invasive testing, wherever they live, is quite remarkable. Who is going to be willing to submit to a form of corporeal citizenship that demands years of regular spinal taps and neuroimaging, aside, perhaps, from the spouses of affected individuals? The bonds of altruism will be sorely tested. Considerable monetary rewards might change the situation, especially among the poor, malnourished, and destitute, but this raises the specter of exploitative commodification similar to that already seen in connection with organ procurement and with drug trials carried out in economically deprived countries.

In addition, the underlying assumption is that by systematically making use of huge samples, anomalies will no longer be significant, leaving in place “pure” refined biomarkers strung along an orderly temporal scale commencing from age 40 or 50 that leads inexorably forward to Alzheimer’s disease. All other implicated variables extraneous to this model are set to one side to expose the newly revised natural history of Alzheimer’s, a condition anchored firmly in localization theory that now has a detectable presymptomatic phase involving amyloid deposition—the key to AD prevention. But as John Hardy has noted, it may be that some differences with respect to amyloid among individuals are present from birth—perhaps we do not all have the same levels of amyloid from the outset. Furthermore, it is still not known exactly what role amyloid plays in this conundrum.

A Nail in the Coffin of the Amyloid Hypothesis

At a press conference at the Vancouver meeting William Thies, at that time the Alzheimer’s Association chief medical and scientific officer, made a statement designed to up the ante in connection with AD research: “The good news at the Alzheimer’s Association International Conference is that we are making advances toward earlier detection of Alzheimer’s, greater knowledge of dementia risk factors, and better treatments and prevention.” Thies continued, “These advances are critical in order to create a future where Alzheimer’s disease is no longer a death sentence but a manageable, treatable, curable, or preventable disease.” He then reminded the press, “The soaring global costs of Alzheimer’s and dementia care, the escalating number of people living with the disease, and the challenges encountered by affected families all demand a meaningful, aggressive and ambitious effort to solve this problem. … The urgency is clear. By midcentury, in the U.S. alone, care for people with Alzheimer’s will cost more than $1 trillion. This will be an enormous and unsustainable strain on the health care system, families, and federal and state budgets. The first-ever U.S. National Plan to Address Alzheimer’s Disease was unveiled in May, and must be speedily and effectively implemented.”21

But, on July 23, less than a week later, an article appeared in the New York Times under the heading “Alzheimer’s Drug Fails Its First Big Clinical Trial.” Reporter Andrew Pollack notes that this deals “a blow to the field, to a theory about the cause of the disease, and to the three companies behind the drug.” One of the companies is Pfizer, and they reported that the phase 3 trial of their drug, bapineuzumab—produced together with Johnson & Johnson, with further financial input from Elan—improved neither cognition nor daily functioning of the patients to whom it was given, as compared with a placebo. Readers were informed that “most doctors and Wall Street analysts had been expecting the drug not to succeed,” because the results of the phase 2 trial had not been statistically significant. At the Vancouver meeting there had been no sign that this failure could be imminent, and the leaders in the field must have been decidedly worried as they gave their upbeat messages. It is significant that in this trial the subjects, 1,100 Americans with mild to moderate AD, were all APOEε4 carriers, but the phase 2 part of the trial had indicated that the drug had a better chance of working in individuals without that genotype. In early August, following further disappointing results, in which noncarriers of the APOEε4 gene were the trial subjects, Pfizer and Janssen announced that they were completely halting development of bapineuzumab for mild to moderate Alzheimer’s disease. It is still possible that this drug will be made use of in trials designed to assess its worth in the prevention of AD. In the proposed trial in which Colombian families will participate, it is planned to use a closely related drug—a monoclonal antibody, solanezumab, designed to bind β-amyloid.

Not surprisingly, involved scientists have responded quickly to these findings by arguing that no doubt these drugs are being given too late to individuals already badly affected by AD neuropathology, even though clinical symptoms are mild. Benefit will quite probably result if these drugs are used to prevent the development of unwanted plaque before it commences: “All these symptomatic trials are 25 years too late,” Samuel Gandy of Mount Sinai School of Medicine is reported to have said.22 This failure, and the findings of other similar trials if they too prove to be unsuccessful, or produce only equivocal results, will be made use of to boost the shift to the molecular prevention of AD. Moreover, Eric Reiman impressed on his audience in Vancouver several times that trials using families with dominantly inherited AD can be done much faster than those that make use of older late-onset subjects in whom the disease progresses much more slowly—thus saving not only time but also money.

In August, following the Vancouver meeting, it was reported that solanezumab had failed to slow memory decline in two late-stage studies of about 1,000 patients in each study. But officials at Eli Lilly were “encouraged” even so, because when they combined the results of both studies using a sample composed of patients who were at only a mild stage of AD, then the results could be interpreted as statistically significant. This very tentative finding encouraged those who believe that if one administers the drug very early, long before cognitive symptoms are apparent, then it may well work to deter disease progression.23 By October, the drugs that would be used in the DIAN study had been announced: gantenerumab and solanezumab, together with another possible drug. This trial involving 160 people is designed to try “to prevent Alzheimer’s symptoms from ever occurring,” said John Morris. “[T]his will be a new strategy.”24

Even though the move to prevention is boldly described as a paradigm shift involving a new strategy in the hope, no doubt, of sustaining the attention of both researchers and key funders of the AD enterprise, it is clearly something less than that. For the time being the amyloid hypothesis hangs on, battered and bruised though it is, and remains the key postulate even as prevention moves to the fore. But increasingly researchers are being forced to confront the idea that this hypothesis may well be past its due date, and is in rather urgent need of modification or retirement.

Two articles and associated commentary that appeared in The Lancet Neurology in December 2012, based on preliminary findings with the Colombian families, make clear the extent to which some serious reflection has to take place in the AD world. Before the publication of these articles, researchers had been convinced, on the basis of brain-imaging and CSF analysis, that amyloid plaque deposition can commence many years before any signs of clinical symptoms become manifest. This latest research confirmed this finding and, among the Colombian families who carry the early-onset mutation, such changes were detected up to 20 years before any signs of clinical symptoms, and in individuals as young as 18 to 26. These results give support to the idea that early detection of presymptomatic biomarkers is essential if a successful drug regimen is to be developed.25

Also of great significance was the finding written up in these articles that among early-onset mutation carriers a buildup of amyloid plaque probably takes place due to excessive production of the Aβ peptide, whereas in contrast, in late-onset AD, amyloid deposition, including cases where APOEε4 is involved, takes place due to an inability to clear Aβ peptide from the system. This finding, admittedly with a very small cross-sectional sample, suggests that significant differences exist between dominantly inherited and late-onset AD, even in the final molecular pathways, and not simply in terms of causation—thus throwing into jeopardy the assumption that early- and late-onset AD are for all intents and purposes the same condition. Equally of interest was the remarkable finding that neurodegeneration involving reduced gray matter and altered synaptic functioning, among yet other changes, may take place in advance of amyloid plaque deposition. Not surprisingly, these findings are regarded as exploratory by the researchers, and the research has limitations. However, if the findings hold up, and particularly if such changes are also well demonstrated in late-onset AD cases, then 100 years of virtually unchallenged neuropathologic criteria for an Alzheimer diagnosis will need modification. Amyloid plaques may not be, after all, the first sign of the AD phenomenon.

These preliminary findings will be followed shortly by many more findings derived from the consortia working on dominantly inherited AD. It is likely that heated debate about the amyloid cascade hypothesis will erupt, pushing the AD world deeper into a serious shake-up of normal science. In the meantime we are informed that people who suffer from disrupted sleep might be on the path to Alzheimer’s disease—“increased daytime sleepiness is the biggest predictor [of AD].” Work on mouse models suggests that the beginnings of plaque formation may cause sleep disturbances, but it is acknowledged that the situation may be more complex in humans.26 And we know that a very large number of people with plaques in their brain never become demented—no mention of this in the article.

In the remainder of this chapter, following a very brief discussion about the emergence of molecular genomics, I turn to a new approach in AD research oriented by postgenomic and epigenetic knowledge. This type of research receives considerably less coverage in professional settings and in the media than does the search for drugs, but if the future impact on the global economy of aging societies is to be confronted, it should be given greater salience than it has received thus far.

Beyond the Dogma of Genetic Determinism

Over the course of the past decade many remarkable changes have taken place in the world of molecular genetics. One outcome has been that genes have been demoted in the minds of many, perhaps the majority, of experts in the world of genomics, from “real,” substantial entities to the status of a concept. Not surprisingly, despite their changed status, genes continue to be heuristically very powerful, even though research has made it clear that scientists cannot determine exactly where genes begin or end,27 nor are genes stable, and they do not, on their own, determine either individual phenotypes or the biological makeup of future generations. Quite simply, genes are not “us,” and the gene can no longer pass as the fundamental animating force of human life; it has been dethroned, Evelyn Fox Keller informs us, from its place as “part physicist’s atom and part Plato’s soul.”28

It is paradoxical that this definitional disarray of the gene was brought to a head as a result of the Human Genome Project. As is now well known, when mapping the human genome, involved scientists labeled 98% of the DNA they had isolated as “junk” because it did not conform to their idea of how the blueprint for life was assumed to work. In recent years things have changed dramatically, and junk DNA, thrust summarily to one side in order to focus on the task of mapping only those genes that code directly for proteins, can no longer be ignored. This junk, although most of it appears to be nonfunctional and does not code for proteins, is nevertheless clearly implicated in gene expression and regulation at times, and hence is being sifted through systematically.29 Furthermore, the activities of noncoding RNA are believed to constitute the most comprehensive regulatory system in complex organisms.30 Noncoding RNA has been shown to profoundly affect the timing of processes that take place during development, including stem cell maintenance, cell proliferation, apoptosis (programmed cell death), the onset of cancer, and other complex ailments.31 Consequently, the research interests of many molecular biologists are no longer confined largely to mapping structure, but have expanded to the elucidation of the mechanisms of cell and organ function throughout the life span of individuals, and also through evolutionary time. Central to this endeavor is an understanding of gene regulation—above all, how, and under what circumstances, genes are switched “on” and “off”—in other words, what brings about their expression.

Using this approach, the effects of evolutionary, historical, environmental, and cultural variables on developmental processes, health, and disease are brought to the fore. Determinist arguments are in theory no longer appropriate, and both micro- and macro-environmental interactions in connection with cell functioning are key to this type of research. This emerging epigenetic knowledge (as it has come to be known), in theory if not in practice, is a serious challenge to the central dogma on which molecular genetics was founded. Metaphors associated with the mapping of the human genome—the book of life, the code of codes, the holy grail, and so on—are now outmoded. With the cell at center stage, genetic pleiotropy,32 gene/gene, gene/protein, and gene/environment interactions cannot conveniently be set to one side—and GWAS researchers have quickly discovered just how challenging this complexity makes their research endeavor. A space has opened up between genotype and phenotype, a zone of endophenotypes—unstable, interim states—partially recognized 100 years ago, but then set to one side until relatively recently.33 Today, many endophenotypes are recognized as key biomarkers, researched in order to better understand and perhaps prevent complex disease, as was evident at the Vancouver meeting.

The molecularized universe has turned out to be so very much more complicated, and exciting, than most people had imagined. This is a universe entirely in tune with postmodernity, a landscape littered with a pastiche of shape-shifters—smart genes, transcription factors, jumping genes, and so on—an environment of the unexpected, in which boundaries formerly thought of as stable are dissolved. It is evident that some genes code for more than one protein, while many others do not code for proteins at all. Increasingly, it has become clear, with the partial exception only of single gene disorders, that multiple factors, including events both internal and external to the body, serve to enhance or inhibit gene expression. This means that efforts to divine individual futures by means of genetic testing for anything but the rare single gene disorders are precarious indeed.

In her book The Century of the Gene, the historian and philosopher of science Evelyn Fox Keller summed up where she believes we stood at the beginning of the 21st century:

Genes have had a glorious run in the twentieth century, and they have inspired incomparable and astonishing advances in our understanding of living systems. Indeed, they have carried us to the edge of a new era in biology. … But these very advances will necessitate the introduction of other concepts, other terms, and other ways of thinking about biological organization, thereby loosening the grip that genes have had on the imagination of the life sciences these many decades.34

Fox Keller, although she agrees that the concept of the gene is “good enough” for many experimental purposes, concludes that only the adoption of new concepts will bring about timely insights into the workings of living systems. But the challenge is enormous. At the Vancouver conference Gerard Schellenberg was asked an apparently naïve question by someone in the audience following his presentation on GWAS findings in connection with AD: “Isn’t it well known that gene/environment interactions are at work at all times and that genes cannot simply be researched in isolation?” Schellenberg took a deep breath and smiled wryly. In response he said, “That’s true, but we have to do the easy stuff first, and then we can move on to the environment.” He gave no hint that such a reductionistic approach produces, in effect, decontextualized findings that may well, ultimately, lead investigators astray.

Epigenetics: An Expansion of Horizons

In the beginning was a cell. Well, not in the very beginning—only after a mere 1 or 2 billion years of life’s story, during which time most of our basic principles were already well established.

—Kenneth M. Weiss and Anne V. Buchanan, The Mermaid’s Tale35

The philosopher of biology Paul Griffiths noted a decade ago, “[I]t is a truism that all traits are produced by the interaction of genetic and environmental factors [but] the almost universal acceptance of this view has done little to reduce the prevalence of genetic determinism—the tendency to ignore contextual effects on gene expression and the role of non-genetic factors in development.”36 Space does not permit a detailed summary of current epigenetic theories; suffice it to say that the very word “epigenetics” has more than one meaning. Many argue that the discipline is not new, and existed by the 1940s or even earlier, although others disagree with this position.37 Most current research into epigenetics focuses on the expression and regulation of genes. For example, questions posed about the phenotype ask why monozygotic twins do not always manifest the same diseases and, when they do, why the age of onset can differ by up to two decades, as is the case for dominantly inherited Alzheimer disease. This narrowly conceptualized epigenetic approach immediately makes the limitations of genetic determinism patently evident.

A broader approach to epigenetics, known by its adherents as “developmental systems theory” (DST), is now well established. The starting point of the DST approach is an ontological reversal of genetic determinism, and it gives priority to dynamic interactions among very many variables, allowing for numerous possible outcomes. Barnes and Dupré, sociologist and philosopher of science respectively, argue that “instead of being spoken about as independent atoms of hereditary material, genes, conceptualized as DNA, are now referred to as parts of the chemical/molecular systems within the cell.”38 In their book Nature After the Genome, Parry and Dupré argue that DNA is not simply involved with heredity; we have to ask what DNA does throughout the life cycle. It is not appropriate to “conceptualize nature as passive, something upon which humans act, usually with the assistance of technology, but rather as active and lively, as responding to human actors, and as able to resist them.”39

In a 2010 article, Dupré insists that the dogged idea of individual genetic homogeneity over the life cycle, despite ever increasing evidence to the contrary, is highly misleading. He argues that an assumption of homogeneity persists because too much weight has been given to comparisons of genomic sequences, but “to know what influence a genome will actually have in a particular cellular context one requires a much more detailed and nuanced description of the genome than can be given merely by a sequence.”40 Such epigenetic changes to genomes are brought about by chemical interactions among the molecules, usually RNA and proteins, that surround the genome within the cell. Similarly, the molecular biologist Strohman has argued that “there are regulatory networks of proteins that sense or measure changes in the cellular environment and interpret those signals so that the cell makes an appropriate response.”41 This regulatory system, a “dynamic-epigenetic network,” has a life of its own that is not specified by DNA. Understanding about the significance of DNA has been radically altered using this new approach, and contingency now takes center stage. For commentators concerned with ontology, the idea, implicit in so many arguments, of DNA as having “agency” is thoroughly anthropomorphic and inappropriate.42

Increasingly complex research at the molecular level proceeds apace, resulting in many remarkable findings. For example, a recent study is grounded in the idea of competing and integrated biological pathways within cells that control the “biogenesis, folding, trafficking and turnover of proteins present within and outside the cell.”43 This type of research may well have relevance for an increasingly fine-grained understanding of Alzheimer-related neuropathology, and possibly be of significance for drug development; but at this level the question of AD causation is severely truncated and limited to late-stage molecular changes.

Buzzing Confusion

The evolutionary biologist Richard Lewontin penned a radical critique of current approaches to biology several years ago, in which he questions the common assumption that biology, like Newtonian physics, is determined by laws of nature. Writing about levels of organization with respect to organisms as a whole, he states,

Unlike planets, which are extremely large, or electrons, which are extremely small and internally homogenous, living organisms are intermediate in size and internally heterogeneous. They are composed of a number of parts with different properties that are in dynamic interaction with one another and the parts are, in turn, composed of yet smaller parts with their own interactions and properties. Moreover, they change their shapes and properties during their lifetimes, developing from a fertilized egg to a mature adult, ending finally sans teeth, sans hair, sans everything. In short: organisms are a changing nexus of a large number of weakly determining interacting forces.44

Lewontin wonders if biology is inevitably a story of “different strokes for different folks,” a collection of exquisitely detailed descriptions of diverse forms and functions down to the molecular level—or, from “this booming, buzzing confusion,” can a biologist perhaps derive some general claims that are freed from the “dirty particulars” of each case? “Not laws,” he quickly adds, but at least some widely shared characteristics? He agrees with Fox Keller that both history and epistemology seem to speak against this, and as far as making sense of life—of biology—is concerned, all our models, metaphors, and machines, while they have contributed much to our understanding, provide neither unity nor completeness. On the contrary—facing up to complexity is the order of the day.

Lewontin is well aware, of course, that not all biologists are comfortable with the emphasis he gives to complexity and the “confusion” he associates with the functioning of living organisms. But it is becoming increasingly hard to ignore the reality that organisms of all kinds can and do adapt to new environments, toxic insults, and manipulations of various kinds with surprising rapidity. Whether it be the beak size of the finches studied by Charles Darwin on the Galapagos Islands,45 the dramatic change in the reproductive life of cod in response to contemporary overfishing,46 or the development of resistance on the part of micro-organisms in response to antibiotics, it is no longer appropriate to think of biological change as a very slow unfolding of events, and the same is now beginning to be recognized in connection with human biology.

The biological anthropologists Kenneth Weiss and Anne V. Buchanan, in their article titled “Is Life Law-Like?” spell out some of the specific implications associated with an assumption that nature is “governed by universal, unexceptional laws.” They argue, “A law of nature is a process or mechanism of cause and effect,” and “we pursue the laws of nature through what has become known as ‘the scientific method.’ ” This involves, of course, systematic, controlled observation—an empirical approach—in which reductionism, replication, prediction, and the ability to deduce new facts are central. And notably in the 20th century, the idea of predictability has been increasingly extended to include “the law-like distributions of probabilistic processes.” This approach has had the effect of conceptualizing “nature as, in effect, deterministic.” Weiss and Buchanan agree that a very small fraction of DNA-related causality is highly predictable, but they insist that recent technological developments are “enabling us to see that, in important ways, life might not be law-like in the Enlightenment sense, or even that we may not know when we have found such laws.”47 In Weiss and Buchanan’s opinion, such findings do not challenge empiricism, but they do question current empirical approaches to causation and inference: “Much of life seems to be characterized by ad hoc, ephemeral, contextual probabilism without proper underlying distributions.”48 In other words, John Stuart Mill’s idea of “Western fatalism,” a belief that we can know how things will turn out because the scientific order follows regular patterns, should, it seems, be rescinded, and fate, rather than fatalism, can reclaim some recognition.49

The micro-mechanisms of basic biological processes are not in contention when biologists make such arguments—rather it is assumptions about the regularity of causal processes that are at issue, with enormous implications for the creation of generalizations from population-based data that are then applied to individuals wherever they reside.

Intimations of the Future?

It is perhaps the case that certain members of the younger generation are already disaffected with what they think of as scientific reductionism. In 2007, informal exploratory conversations about their exposure to, understanding of, and interest in genetics were held with 30 university-educated individuals aged between 25 and 39 who reside in Montréal.50 Only one or two of these people had experience of serious disease in their families, and none reported single gene disorders.

All 30 of these individuals acknowledged the significance of genes in disease causation, yet virtually every one of them also cited social, environmental, and behavioral variables, including upbringing, education, economic status, environmental pollutants, diet, and personality, as contributors. As Keith, age 31, put it, “[G]enes don’t give us the whole picture.” He went on,

A condition might come from genetics, but it’s also your mind, and your education, and family, and all these things … that is what makes it hard to understand how genes affect each and every individual. It’s so complex, how your body works, how your mind works and how other factors affect you.

In many cases these interviewees appeared to think that external factors can in effect assail genes. Candice, age 28, argued,

If your genes aren’t “strong enough” to fight a lot of the chemicals and external things … then you could get cancer because your body wasn’t able to withstand the chemical intrusion.

And Joyce, 31, noted,

I could be at high risk of some crazy, rare disease. Maybe I’ll just never know because by coincidence I have avoided anything that would turn that gene on or off. It just sits there silently and nothing ever happens to it.

To mitigate external threats, these individuals engage in everyday prevention strategies, such as eating organic food, taking vitamins, exercising, and controlling their weight. They believe such activities are essential in maintaining healthy gene/environment bodily interactions, and many produced vivid accounts of mediating “toggles” and “triggers” that influence gene expression by turning “switches” “on” or “off”:

I imagine in a disease process it’s just like tripping the switch. So if you smoke, you’re just flipping a lot of switches which otherwise would never have been switched. … [D]oing things like eating lots of vegetables and consuming antioxidants and things keep the switches on or off, whatever they need to be, that’s protective. (Joyce, 31)

Something in the environment, whether it’s too much food, or working too much, could cause a short-circuit, an overload in the capacity. Something will set the gene off, whether cigarette smoke, or pollutants, a toxin in food, whether it’s food coloring, whether it’s a certain chemical. Something will set it off. (Larry, 34)

Some individuals suggested that because they have “greater access to resources and knowledge” than their parents, they are less at risk than the previous generation. Larry comments,

[Y]ou can have a predisposition to having diabetes, but if the offspring’s diet, and activity level are high, and they’re conscious of what they’re eating, of their habits, and their stress level, this is something that possibly can be avoided.

Joyce reflects on the history of breast cancer in her family:

My great-grandmother, grandmother, and mother had it—but also let’s look at their lifestyle. None of them breast-fed. My grandmother was a smoker. They all have really high fat diets, low exercise, a lot of alcohol. Maybe I inherited the predisposition, but my lifestyle’s so different.

Individuals such as Joyce actively try to distance themselves from family medical histories by emphasizing that it is possible to have a degree of control over genes. Another person surmised that he might avoid serious illness, unlike his grandfather, who was raised on a farm, performed heavy physical labor, and “lived a life that’s so different from me and in a different environment from me.” The majority shun the idea of a determined genetic history, focusing instead on individual responsibility and decision making as the key to positive health outcomes. Some insist that although they share genetic substance with their parents, they themselves have “transformed” their genes due to generational differences in lifestyle.

These conversations suggest that these young people think of genes as, in effect, unstable entities, subject to modification by environment and human behavior, and the impression is given that a good number believe that through their own exertions they may well be able to contain or subvert “risky” aspects of their genetic heritage that may have been passed along to them. It seems that neither fatalism nor fate applies here, but that dominant values of middle-class youth of today—self-control and discipline—are at work. Some of these individuals are destined to have their beliefs shattered, of course, and one wonders how they will manage the chronic uncertainty associated with increased genetic and biomarker testing, if and when that takes place. But, in the case of Alzheimer’s, dietary care and exercise have been shown repeatedly to stand individuals in good stead by sustaining the cardiovascular system and thus protecting against stroke and mixed dementias. Without doubt it is all for the best to cultivate such everyday habits, whatever the driving values behind such behaviors. What must be kept in mind is that by far the majority of people living in the world today are in no position to cultivate their health and well-being, a situation becoming rapidly worse with unrelenting efforts at exploitative economic growth and development, the gains of which are only very rarely indeed redistributed fairly among local populations.

Epigenomics and the Life Experiences of Individuals

People sometimes say that the human brain is the most complex item in the universe. But the whole person of whom the brain is part is necessarily a much more complex item than the brain alone. And whole people can’t be understood without knowing a good deal both about their inner lives and about the other people around them. Indeed they can’t be understood without a fair grasp of the whole society that they belong to, which is presumably more complex still.

—Mary Midgley, Science and Poetry51

Systematic research into epigenetics and the larger field of epigenomics (the global analyses of epigenetic changes across the entire genome) has recently exploded. Many researchers recognize what might be termed a “distributed molecular agency,” with the cell at center stage, although the focus of the majority of basic scientists, with some notable exceptions, remains concentrated on microenvironments within the body. It is now well established that, at the molecular level, environmental variables can bring about effects on cellular processes at specific DNA sites, mediated by several processes, the best known of which is methylation. It has also been shown that, under certain circumstances, methylation and demethylation and related processes can be reversed.52 And some researchers claim that these epigenetic changes can be inherited independently of DNA.53 This “epigenomic” research has opened the door to what has been described by some as neo-Lamarckianism, although these findings are only associational and are by no means fully accepted as yet.54

Research into the epigenetics of Alzheimer disease is under way, although to date it is relatively sparse. One study showed links between memory formation and epigenetics.55 In another project an analysis of DNA methylation across 12 potential Alzheimer’s susceptibility loci was carried out. It was found that “age-specific epigenetic drift” from a previously established norm was apparent in brain tissue taken from individuals who had been diagnosed with AD, as compared to normal controls. The authors of this study argue, “The epigenome is particularly susceptible to deregulation during early embryonal and neonatal development, puberty, and especially old age.”56 They also found that certain genes that contribute to β-amyloid processing showed “significant inter-individual epigenetic variability,” a finding that they argued may be associated with susceptibility for AD.57 Another study found that the brain tissue obtained from identical twins had markedly different levels of DNA methylation. One twin was diagnosed with AD at age 60 and died 16 years after the diagnosis, and the other died aged 79, with no signs of dementia. The twin who was demented had significantly lower DNA methylation in his brain tissue than did his brother. It was established that the twin who had become demented had been exposed extensively to high pesticide levels as a result of his earlier employment, but it is not yet known if such exposure can be considered as a contributor to the onset of AD.58 It is argued that these differences can be explained by “epigenetic drift” caused by environmental exposure, lifestyle, diet, drug abuse, or merely stochastic fluctuations. Wang and coauthors point out that

epigenetic modifications may exert only subtle effects on the regulation of specific genes. Thus, abnormal DNA methylation may only cause a disease phenotype when several loci are affected at the same time.59

They add that people predisposed to AD as a result of other variables, including the APOE genotype, may be pushed over a critical threshold after which the brain starts to malfunction. They conclude that late-onset AD

may represent merely an extreme form of normal aging, which would imply that every human being has a certain predisposition to developing Alzheimer’s. In our model, the epigenetic effects can accumulate throughout life, especially from the time-point when the epigenetic machinery suffers from old age, but also from early embryonal stages or even trans-generational [effects], influenced by epigenetic events in the parents.

The part of the epigenetic system in which methylation and related activities take place is known as the chromatin-marking system. Chromatin is the “stuff of chromosomes”—the DNA, plus proteins, and other molecules associated with it.60 The structure of chromatin is crucial in the activation of genes, and enables states of gene activity or inactivity to be perpetuated in cell lineages. Alternative heritable differences in chromatin have come to be known as “chromatin marks,” among which DNA methylation is the best recognized—a feature found in all vertebrates, plants, and also many invertebrates, fungi, and even some bacteria. It has been known for decades that methylation processes are crucial to normal development, and several publications have appeared over the years demonstrating methylation effects on neurodevelopment in experimental animals. The findings of Wang and colleagues are insightful but not unprecedented, and earlier findings similarly suggest that links exist between methylation patterns early in life and AD incidence in later life.61 A recent article titled “The Aging Epigenome” provides an excellent technical summary of this topic.62 However, based on these findings, the AD story will continue to be one of embodied molecular interactions, with the addition only of a time dimension provided by molecularized developmental processes. What remain poorly investigated in a systematic manner are those variables external to the body that no doubt influence the aging process both directly and indirectly. Thus far, only association studies exist, and the time is ripe to design research projects in which macro and micro variables are examined in tandem, challenging though this may be.

Meanwhile well-publicized findings that appear at regular intervals about newly discovered genetic mutations tend to drive a reductionistic approach, one strongly fueled by drug company interests, that without fail captures the imagination of the public. One such finding first appeared online in the New England Journal of Medicine in November 2012 detailing research that had shown that heterozygous rare variants of the gene TREM2 are associated with a significant increase in risk for Alzheimer’s disease.63 Two groups of researchers, one part of an international consortium known as the Alzheimer Genetic Analysis Group, and a second associated with deCode Genetics, independently published the same finding in the same issue of the journal. Both studies concluded that individuals with this rare mutation have a threefold to fivefold lifetime increase in risk of developing AD as compared to the general population—a risk that compares with that of APOEε4. This mutation affects the action of white blood cells in the brain, thus interfering with the immune system in such a way that toxic amyloid cannot be effectively eliminated. However, the TREM2 mutations occur in less than 2% of Alzheimer patients, and hence screening patients is not realistic. Even so, a senior researcher at Washington University involved in the research, Alison Goate, noted for the New York Times, “The field is in desperate need of new therapeutic agents … this will give us an alternative approach,” that is, presumably, an approach directed at maintaining the functioning of the immune system.64

It is highly likely that as GWA studies proceed apace more rare variants will be found at regular intervals—Rudolph Tanzi’s predictions about the value of GWAS appear to be proving correct. Should more rare variants be found, this will provide further stunning evidence of the enormous complexity at work in the Alzheimer syndrome.

In the final chapter that follows I revisit the tensions set out in Orientations and reconsider them in light of the so-called paradigm shift to Alzheimer prevention that has permeated much of the discussion in previous chapters.
