17

Government Misinformation

The only good is knowledge, the only evil is ignorance.

—SOCRATES

Our federal government plays an important role in our health. It’s responsible for funding health research, approving drugs and treatments, determining nutritional recommendations for federal institutions and school lunch programs, and establishing rules for nutritional labeling, among many other things. In the United States, we are supposed to enjoy a government of the people, by the people, and for the people. This should translate to a government whose policies seek to maximize public health by finding, funding, and promoting the most effective means of prevention and treatment of disease. Unfortunately, that’s not the way things work.

I’m sad to say that in my experience around health policy and information, the people are getting the short end of the stick. We are being misled, with tragic consequences. The national debate on health-care reform wildly misses the mark, with Democrats and Republicans alike arguing about who’s going to pay rather than about what would actually make people healthy. National nutrition policy panders to wealthy corporate interests rather than objective science. Governmental health agencies all but ignore nutrition as a factor in public and individual health. If someone asked you to create public health policy for which the goal was to mislead the maximum number of people in ways that would compromise their health while profiting the pharmaceutical, medical, and junk food industries, you couldn’t do much better than what’s currently in place. As my friend Howard Lyman, a former rancher and agriculture industry lobbyist, has said, “We have the best government that money can buy.”

Are the people who create these policies so out of touch that they don’t realize the effects are the opposite of their stated goals? Hardly. With unrestricted access to government officials at all levels, industry applies a mix of carrots and sticks to produce our government’s pro-disease, pro-reductionist treatment policies that make them rich and the rest of us sick.

HOW INDUSTRY BOUGHT GOVERNMENT

Big Pharma, Big Insurance, and Big Medicine are among the biggest contributors to U.S. political candidates. According to the watchdog group OpenSecrets.org, health professionals (individual practitioners such as doctors, nurses, and nutritionists, plus large professional organizations such as the American Medical Association) ranked fourth in total giving to members of Congress in the 2011–2012 election cycle (almost $19 million), with the insurance industry sixth (almost $15 million) and pharmaceuticals/health products tenth (over $9 million).1 That money buys significant leverage when it comes to guiding health policy: these industries can coordinate millions of dollars in donations for candidates whose policies they support, and can deploy additional millions to defeat candidates who don’t play ball. It was at an AMA convention that President Obama unveiled the public insurance option of his health-care reform plan in 2009.2

None of these industries has anything to gain from a more efficient and effective health-care system. On the contrary: if every American adopted a WFPB diet tomorrow, these industries would be in big trouble. You could argue that improving health care through nutrition and other lifestyle factors would even be “anti-growth,” making it practically anti-American. After all, when someone avoids the operating room because they adopted a healthy diet, they aren’t contributing to GDP. A diet of cheeseburgers, large fries, and Cokes is good for the economy when it’s purchased, but it’s even better when it leads to heart disease and a big hospital bill.

These industries can afford the best lobbyists, many of whom are hired for their connections as well as their persuasiveness. The “revolving door” between industries and the government agencies tasked with regulating them is spinning faster than ever.

Regulatory agencies routinely offer employment to industry lobbyists and so-called scientists who trade on their degrees to enhance their incomes. The departure of officials from a government job to one in a related private-sector industry is common practice. In 2009, NIH director Dr. Elias Zerhouni resigned to take a position at Johns Hopkins University, according to a Johns Hopkins press release.3 He lasted only four months in that position before joining the French pharmaceutical company Sanofi as its new head of research and development4—a career move that was conveniently omitted from the NIH website, in contrast to those former directors whose subsequent careers involved a return to academia.

In 2010, Dr. Julie Gerberding, who headed the CDC from 2002 to 2009, found gainful employment at Merck Vaccines shortly after departing government service.5 It’s a relationship that benefits Merck greatly, allowing it to capitalize on Dr. Gerberding’s contacts and influence in the federal government and the World Health Organization to help it sell more vaccines in the United States and around the world. But the career move also raises questions about impropriety. Certainly, at the very least, Dr. Gerberding’s push to vaccinate all Americans against the flu each year of her tenure at the CDC (earning her the nickname “Chicken Little” for her annual predictions of a flu pandemic that never materialized) must have endeared her to her future employer.

To be clear, there isn’t any evidence that Dr. Gerberding intentionally promoted a vaccination policy that would enrich her future employer. But if you’re a government official whose interest is in using vaccines as a primary strategy for controlling diseases like the flu,6 it must be hard to ignore the fact that your tenure is short and that, if you play your cards right, a private-sector job could be awaiting you at the end of it. Coupled with health policies that look like they could have been written by pharmaceutical marketing departments, this built-in incentive to please industry should make us a little less trusting that government agencies are seeking our good above all else.

On the industry side, lobbyists do more than shake hands and buy drinks after golf. They also write and edit legislation and regulations for grateful, understaffed legislators and agency heads. Their job, for which industry richly rewards them, is to strike out any language that might jeopardize profits. And the politicians play ball to protect their own careers. This fact, while not publicized, is common knowledge in Congress and on K Street, where industry groups have their lobbying offices. I’ve met with many high-ranking government decision makers over the years. While they often acknowledge privately that my views on nutrition and health should be public policy, I have learned that the political system will punish any elected official who advocates serious diet and health reform. Corporate interests don’t just fund elections; they are willing and able to end political careers and derail progressive legislation as soon as they get a whiff of any move that might threaten their bottom line. And that means laws are enacted that further the interests of the wealthiest rather than the public good.

THE SO-CALLED HEALTH-CARE DEBATE

One of the hottest political debates of the past four years has been health-care reform. There’s no question that our health-care system is seriously broken. But when you look at the evidence offered in public discourse, you begin to realize that virtually everyone is missing the point: the primary reason our very costly health-care system is broken is that it doesn’t deliver health, and seems to have little interest in doing so. We’re paying way too much money for way too little health. Every other problem is a symptom arising from that core truth.

In recent years, a virtual army of writers, scholars, politicians, and business leaders has offered opinions and proposed programs to solve the “health-care problem.” Liberals point to the large numbers of uninsured people and insist that the burden be shared by those who can afford it. Conservatives seek to protect the “free market” in health care, not realizing that this market is far from free. Sometimes the two sides find agreement, but such agreement is usually limited to how to streamline the delivery of health care.

For the most part, the debate over health care is focused on the supply side rather than the demand side, with intense argument over who should pay the bill rather than why the bill is so high.

We talk endlessly about shifting payment responsibilities among different groups—private sector or public sector, employer or employee—as if these programs are going to help control our country’s back-breaking health costs: about two and a half trillion dollars in 2009.7 Limiting these discussions and programs to matters of financing is too narrow. These political machinations, which are often fanned with much publicity and media coverage (or should I say hot air?), may please politicians and special interest groups from time to time, but they do little to address the main question of why we are so sick and why we are so unable to fix our sickness.

These discussions are not completely without consequence, however. They do serve to divert attention away from the really important question of how health might be improved—a question that leads directly to nutrition, not drugs and hospitals. Through this misdirection, they allow the system to continue to serve the profit motive at the expense of our health.

One of the best-known schemes for controlling health-care costs is the HMO (health maintenance organization) legislation introduced in the 1990s. While health-care cost inflation slowed slightly for a couple of years after the introduction of HMOs, the trend proved short-lived. Health-care costs have resumed their steady upward climb, with no new plateau in sight.

The initial savings generated by tough negotiations with doctors and efficiencies of scale did nothing to address the real problem: too many of us get sick, and the medical and pharmaceutical industries do a terrible job of making us well. Controlling costs is not the same thing as controlling disease. The HMOs talked about so-called preventive medicine, but in such a superficial way that the message had virtually no impact. Their dietary recommendations, by and large, boil down to “eat more veggies, drink fewer sodas, and choose leaner cuts of meat.” That’s like telling smokers to cut back from four packs a day to three—definitely a step in the right direction, but woefully inadequate. And because it was so superficial and inadequate, the “eat slightly better” message was universally ignored.

HMOs aren’t the last word in cost-cutting. When money gets too tight, some private-sector employers eliminate health insurance programs, cut jobs, close shops, or send their businesses and jobs outside the country, where they are often legally able to ignore worker health and drop coverage altogether. The movement of much of the U.S. auto industry from Detroit to Mexico is a case in point. General Motors attributes at least $1,500 of the cost of every new car made in the United States to employee health-care premiums.8 Ultimately, if we keep feeding the health-care monster everything we’ve got, it may bring down our entire economy.

HEALTH MISINFORMATION, COURTESY OF THE FEDERAL GOVERNMENT

We talked a little about the ways our government forwards the cause of reductionist nutrition in chapter five, focusing on the government’s nutrient databases and RDIs. But their reductionist nature is only part of the story.9

RDI information printed on food packaging represents one of the most powerful, ubiquitous, and enduring ways the federal government tells people what to eat and what to avoid. As I noted in chapter five, RDIs are the ultimate in reductionist nutrition. Most packages list about a dozen nutrients, as if those were the only ones, or the only ones that count. The recommended amounts are listed in grams or milligrams and as percentages of a daily value. Last I checked, Americans weren’t experts on metric weights or percentages. As we’ve seen, nutrition is nearly impossible to measure so precisely. And manufacturers are good at adjusting serving sizes to reduce the scary numbers for fat, sugar, and sodium—sometimes to zero, even though the product may contain a fair amount. In short, RDIs do a wonderful job of confusing the American public by appearing to be scientific while diverting attention from the simple truths about which foods support our health and which degrade it.
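To see how the serving-size game can work in practice, here is a minimal back-of-the-envelope sketch. The product and its numbers are invented, and the half-gram rounding threshold reflects my understanding of how such “zero” claims are typically allowed, so treat this as an illustration rather than a statement of the regulations:

```python
# Hypothetical illustration of serving-size arithmetic on a nutrition label.
# The product and all numbers are invented; the 0.5 g threshold is an assumption
# about the rounding rule that lets tiny per-serving amounts appear as "0 g".

ROUNDS_TO_ZERO_BELOW_G = 0.5   # assumed per-serving threshold for a "0 g" claim

package_weight_g = 200.0       # total weight of the package
package_fat_g = 100.0          # total fat in the package
declared_serving_g = 0.25      # a very small declared serving size

fat_per_serving_g = package_fat_g * declared_serving_g / package_weight_g
label = "0 g" if fat_per_serving_g < ROUNDS_TO_ZERO_BELOW_G else f"{fat_per_serving_g:.1f} g"

print(f"Fat per serving: {fat_per_serving_g:.3f} g; label reads {label}")
# Fat per serving: 0.125 g; label reads 0 g -- even though the package holds 100 g of fat.
```

Shrink the declared serving far enough and almost any number on the panel can be made to look harmless.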

To make a bad system worse, for the vast majority of the population, most RDIs are much higher than they need to be. The establishment of the RDI for a nutrient generally begins with an assessment of the minimum amount of that nutrient needed to serve some particular function in the body for a sample group of individuals. This amount is sometimes referred to as the minimum daily requirement (MDR). For example, we might determine how much protein (measured as nitrogen) is needed to replenish the nitrogen lost by the sample group’s bodies each day. But because the resulting number reflects only the average need of a very small sample of the whole population, and because individual needs vary, the MDR is then adjusted upward to ensure that the vast majority of people (say, 98 percent) will meet their needs. This considerably higher number becomes the RDI.

So even if we accept that the MDR is an accurate representation of what we need to achieve total health (a very risky assumption on its own), when we consume the RDI amount for a nutrient, nearly 98 percent of us are theoretically exceeding our minimum nutrient requirements. In addition, most people, including most health professionals, incorrectly assume that these recommended allowances are minimum requirements. This assumption encourages us to consume more of these nutrients than we need, which benefits companies who sell nutrient-based products such as supplements, fortified foods, and nutraceuticals.
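Here is a minimal numerical sketch of the adjustment just described. Every figure is invented purely for illustration; actual committees work from nutrient-specific measurements:

```python
# Illustrative sketch: turning an average measured requirement into an RDI.
# All numbers are invented; real values depend on the nutrient and the study sample.

mean_requirement_mg = 50.0    # average daily need in a small test group (the "MDR")
std_dev_mg = 10.0             # person-to-person variation within that group

# Setting the recommendation about two standard deviations above the average is
# intended to cover roughly 98 percent of the population.
rdi_mg = mean_requirement_mg + 2 * std_dev_mg

print(f"Average measured need: {mean_requirement_mg:.0f} mg/day")
print(f"Published RDI:         {rdi_mg:.0f} mg/day")
# Average measured need: 50 mg/day; published RDI: 70 mg/day.
# By construction, nearly everyone who meets the RDI exceeds his or her own requirement.
```

The arithmetic is the whole point: a number designed to exceed almost everyone’s actual needs is then read by the public as a minimum to strive for.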

There’s more. These RDIs—as they are popularly interpreted—have in my experience long been biased on the high side for some nutrients to the point where they encourage the consumption of animal-based foods. Have you heard the myth that we need to consume lots of calcium to have strong bones and prevent osteoporosis? The calcium recommendation in the United States (1,200–1,300 mg/day) considerably exceeds the intake in countries that consume no dairy and less calcium (400–600 mg/day) but experience much lower rates of osteoporosis.10 Convincing evidence favors a recommendation for lower calcium intake, but, suffice it to say, the dairy industry has long had a strangling influence on the committee making these recommendations, urging these “unbiased experts” (their words) to accept a high-calcium RDI.11 The riboflavin (vitamin B2) recommendation has long been set high as well, with the additional but false understanding that dairy is a rich source of this vitamin—a myth that started in the 1950s.12 (In reality, dairy is not a rich source of riboflavin, at least as compared to certain plants.) In addition, the “daily value” for cholesterol is set at 300 mg/day. Cholesterol’s inclusion in this list implies that it is needed as a nutrient. It is not! Our bodies, on their own, produce all the cholesterol we need. Dietary cholesterol comes only from animal-based foods, and a far healthier recommendation would be zero!

Then there is the epic story of protein, a nutrient that has long been the government’s darling. The RDI for protein has for decades been 10–11 percent of calories, which is already more than enough (and, not coincidentally, about the average amount of protein consumed in a WFPB diet). Many people believe that a dietary average of 17–18 percent of calories from protein, which is also the current average level of protein consumption among Americans, is a good health practice. In 2002, the Food and Nutrition Board of the National Academy of Sciences (FNB) concluded, based on no credible evidence, that we can consume protein up to an astounding 35 percent of calories without health risk13—roughly three times the longstanding RDI! At the time of the report, the director of the FNB was a major dairy industry consultant, and the majority (six out of eleven) of the members of a companion policy committee (the USDA “Food Pyramid” Committee) also had well-hidden dairy industry ties. Dairy groups even helped to fund the report itself. At this rate, before long, the government may start recommending a milk faucet in your kitchen next to the one for water.
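For readers who want to see where these “percent of calories” figures come from, here is a rough conversion from the familiar gram-based allowance. It assumes a 70-kilogram adult eating about 2,200 calories a day, the commonly cited allowance of 0.8 grams of protein per kilogram of body weight, and the standard figure of 4 calories per gram of protein; the point is the order of magnitude, not the decimals:

```python
# Rough conversion of a gram-based protein allowance into percent of calories.
# Assumes a 70 kg adult, about 2,200 kcal/day, and 4 kcal per gram of protein.

body_weight_kg = 70.0
daily_kcal = 2200.0
allowance_g_per_kg = 0.8        # commonly cited gram-based allowance
KCAL_PER_G_PROTEIN = 4.0

protein_g = allowance_g_per_kg * body_weight_kg        # 56 g/day
protein_kcal = protein_g * KCAL_PER_G_PROTEIN          # 224 kcal/day
percent_of_calories = 100 * protein_kcal / daily_kcal  # about 10 percent

print(f"{protein_g:.0f} g of protein = {protein_kcal:.0f} kcal, "
      f"or about {percent_of_calories:.0f}% of a {daily_kcal:.0f}-kcal day")
# 56 g of protein = 224 kcal, or about 10% of a 2200-kcal day.
```

By the same arithmetic, the 35 percent ceiling blessed by the FNB corresponds to nearly 200 grams of protein a day for that same adult.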

The current system of developing and interpreting RDIs and guidelines according to industry interests is nothing less than shameful, not least because these industry-favoring standards and their supporting documents form the basis of so many government programs. These supposedly official items provide the scientific and political rationales for the way the national school lunch program, hospital meals, and Women, Infants, and Children programs are run.14

As a member of the expert panel that wrote the 1982 report on diet, nutrition, and cancer for the NAS, I recall that one of our central debates focused on what we should suggest as the appropriate goal for dietary fat to reduce cancer risk, based on existing evidence. Should we suggest reducing it to 30 percent of total calories (from the then 35–37 percent average), when the evidence clearly pointed to a much lower number? The debate was not about the evidence. Instead, we were worried about the political palatability of an honest dietary fat recommendation as low as 20 percent (still twice the level suggested by a WFPB diet). It was a statement that, thirty years ago, likely would have doomed our report to oblivion just on its own. Ultimately, we chose not to go lower than 30 percent, in deference to a prominent member of our panel from the USDA, who convinced us that doing so might result in a decrease in the consumption of protein and animal-based foods. That number, 30 percent, set the definition for a low-fat diet that remained part of the public narrative for many years thereafter. It gave the Atkins enthusiasts, among others, a false benchmark to use as a straw man in their argument that so-called low-fat diets don’t work. Our committee’s shading of the evidence in the policy statement in effect protected the animal foods industry and did nothing to promote human health.

While real nutrition is marginalized as a potential source of health, the federal government ignores and even covers up the truth about the deadly effects of the American medical system. As we saw in chapter one, the public CDC website conveniently omits the misfortunes of the medical system from the list of leading causes of death in the United States, despite the fact that “physician error, medication error and adverse events from drugs and surgery”15 is the third leading cause of death, trailing just heart disease and cancer. These are deaths caused by the medical system, almost half of which result from the adverse effects of prescription drugs.

You might argue that the reason drug- and surgery-related deaths aren’t included in the CDC list is that the government has judged those death-by-health-care numbers to be incorrect; perhaps the researchers got it wrong. But this stark reality was summarized and reported in the prestigious Journal of the American Medical Association.16 A federal entity, the Agency for Healthcare Research and Quality of the U.S. Department of Health and Human Services, was given responsibility in 1999 for monitoring medical errors nationwide in most U.S. hospitals. The agency has been diligent in getting all U.S. hospitals to systematically monitor such information, and has accumulated data for about five years as of this writing. The trend so far suggests not only that these statistics are correct, but also that the number of “medical errors” is increasing. Further, this may only be “the tip of the iceberg” with respect to the total number of avoidable deaths. An analysis of a subset of all hospitalized Medicare patients, for example, concluded that from 2000 to 2002, “over 575,000 preventable deaths occurred” nationwide.17

This more recent report confirms that these errors remain a “leading” cause of death; in fact, the report’s authors agree that this number of deaths is so high that it should be considered an “epidemic.” How is it possible that this cause of death might be an epidemic in one government report and not even be listed on a separate government website as a leading cause of death? Of course, such publicity would be bad for the disease business—and if the U.S. government cares about one thing here, it’s the economic interests of the medical establishment, one of the leading donors to political candidates, parties, and political action committees.

THE CORPORATE AGENDA OF THE NIH

As we’ve discussed, the NIH devotes a microscopic amount of money to nutrition research, and most of that money supports reductionist studies on the effects of individual supplements, not whole foods. The NIH doesn’t get a lot of public press, but its influence on the direction of medical research is huge. Its $28 billion annual budget accounts for somewhere between 68 and 82 percent of all biomedical research funding in the United States, plus a considerable amount around the world. Its two biggest institutes, based on funding, are the NCI and the National Heart, Lung, and Blood Institute, corresponding to the two leading causes of death. Of course, there’s no Institute of Medical Error and Adverse Drug Effect Prevention, corresponding to the third leading cause! And, as I’ve mentioned, there’s no Institute of Nutrition.

The NIH is thought to be an objective research organization, but of course there’s no such thing as objectivity where funding priorities are concerned. Let’s take a moment and look, in brief, at the way taxpayer money is allocated by the U.S. Congress. After receiving testimony and a proposed budget from NIH officials, Congress provides money to NIH in its general budget. NIH then apportions the budget among the directors of its institutes, each of whom divides the money into different program areas. Since institutes at various levels in the appropriation process essentially compete against one another for funding, they tend to be highly sensitive to the interests of powerful members of Congress. Regardless of how enlightened any individual institute director might be, she or he still must devote the lion’s share of the money received to reductionist, profit-focused research, or else risk censure by Congressional representatives feeling their own financial pressure from industry lobbyists. There’s not much money available for the type of systems analysis that could help us reprioritize our health spending in more efficient and compassionate ways. And almost nothing remains for studies of the social impact of health policies—trivial stuff, such as how real people’s health is affected by RDIs and school lunch programs.

The NIH gives out money in the form of grants. The way they do this is by inviting qualified people to sit on grant application review panels and pass judgment on the many submitted proposals that are competing for the money. By “qualified,” the NIH means something more specific and pernicious than “professionally qualified to evaluate study design and research potential.” The people deemed qualified to pass judgment on research grant priorities are those who have been successful in getting NIH grant money in the past, a cycle that helps keep innovative wholistic research off the menu.

I have served on grant review panels both within the NIH and at nongovernmental cancer-research funding agencies. Several years ago, I was invited by two successive NCI directors to present my views on the link between cancer and nutrition in a Director’s Seminar that included the director and about fifteen members of his staff. My second presentation followed my then-recent proposal for a new research-grant review panel called “Nutrition and Cancer” in hopes of giving some emphasis to this important topic. Although this new panel had been created, its name was changed to “Metabolic Pathology,” thus negating its purpose. In my presentation, I expressed concern that this new name would obscure the goal of studying nutrition and its ability to prevent and reverse cancer—a phenomenon that I was demonstrating in my lab at that point, and that had been corroborated in humans in the China Study. I asked then-director Sam Broder why the word nutrition could not be in the title. After some heated discussion, he snapped, “If you keep talking this way, you can just go back to Cornell where you came from.” Broder insisted that they were already funding nutrition research, but clearly our definitions of “nutrition research” were different. The NIH’s nutrition research at that point comprised, as it does now, only about 2 to 3 percent of the total NCI budget, most of which was devoted to clinical trials of supplements. Two hours of discussion (all right, argument) got me nowhere.18

You can see the NIH’s reductionist agenda clearly in what is and isn’t included in its public pronouncements about the causes and future treatment options for currently “incurable” diseases. To cite an especially pertinent example of an NIH-funded project laden with reductionist philosophy, I turn again to the supposed link between AF and liver cancer. The NIH website includes a page on this relationship, which I accessed in March 2012, almost four decades after Len Stoloff (then chief of the FDA branch studying mycotoxin) and I first published our doubts about AF being a human carcinogen. This NIH page begins:

       For almost four decades, [National Institute of Environmental Health Sciences]-funded scientists have conducted research on the role in promoting liver cancer of aflatoxin, a naturally occurring toxin produced by mold. Their discovery of the genetic changes that result from aflatoxin exposure have led to a better understanding of the link between aflatoxin and cancer risk in humans. These discoveries are also being used in developing cancer prevention strategies . . . .

           NIEHS-funded scientists at the Massachusetts Institute of Technology were among the first to show that exposure to aflatoxin can lead to liver cancer. Their research also demonstrated that aflatoxin’s cancer-causing potential is due to its ability to produce altered forms of DNA called adducts.19

See the reductionist assumption: AF causes cancer by altering DNA—as if the process were that linear and uncomplicated and unmediated by thousands of other reactions and interactions! But let’s allow the NIH to continue (while continuing to ignore the dominating nutritional effect on the course of this disease):

       The Johns Hopkins University researchers are [. . .] the first to test the effectiveness of chlorophyllin, a derivative of chlorophyll that is used as an over-the-counter dietary supplement and food colorant, in reducing the risk of liver cancer in aflatoxin-exposed individuals. Studies conducted in Qidong, People’s Republic of China, showed that consumption of chlorophyllin at each meal resulted in a 55% reduction in the urinary levels of aflatoxin-related DNA adducts. The researchers believe that chlorophyllin reduces aflatoxin levels by blocking the absorption of the compound into the gastrointestinal tract. The results suggest that taking chlorophyllin, or eating green vegetables that are rich in chlorophyllin, may be a practical and cost-effective way of reducing liver cancers in areas where aflatoxin exposures are high.20

Researchers have identified a biomarker—something they can measure that supposedly relates to cancer development. In this case, the biomarker is the level of AF-related DNA adducts in the urine. And they’ve identified a single nutrient—chlorophyllin—that can, in a straightforwardly reductionist fashion, block absorption of these compounds in the gastrointestinal tract.

Did you notice two fairly astounding things about this paragraph? First, green vegetables are mentioned, but in a throwaway tone. It’s chlorophyllin that is “practical and cost-effective,” not spinach and broccoli and kale. The closest the NIH comes to endorsing green vegetables for cancer prevention is worded so that it won’t actually undermine potential pill sales.

Second, this mechanism description relies on the completely unfounded assumption—not even acknowledged as such on the web page—that AF-related DNA adducts in urine correlate with cancer development. While it may be true, it’s by no means a sure thing; you can’t quantify cancer based on an adduct in urine any more than you can measure the amount of chocolate a child ate on Halloween by counting the candy wrappers in their bedroom trashcan.

The article concludes on a predictable note: the discovery of a gene that may explain why some people get liver cancer after AF exposure while others don’t:

       In an effort to identify the genetic underpinnings of liver cancer, the Johns Hopkins University team has discovered mutations in a critical cancer gene, known as p53, in the serum of individuals who later were diagnosed with the disease. This discovery may eventually lead to new strategies for the detection, prevention, and treatment of liver disease in susceptible individuals.21

To recap: Our medical research establishment, funded by our government, responds to the scourge of liver cancer by recommending we take a pill to reduce gastrointestinal absorption of a carcinogen that has been shown to have nothing to do with the disease, and by promising much more expensive research into gene therapy that may one day save us from our own faulty bodies. No mention of nutrition at all, unless you count it as a vehicle for a nutrient more easily obtained as a dietary supplement!

I worked for a time with the researcher who led the team at Johns Hopkins mentioned in the article’s conclusion. He is a chemist by training and, like most chemists, a reductionist in spirit. His journey into the question of what causes liver cancer began with a strong bias that the carcinogen AF is a major cause of human liver cancer (you’ll recall I also once thought this could be true, early in my career). Thus he was focused on monitoring possible AF contamination in food, which necessitated routine food analyses. He also was quite excited about a potentially lucrative company that he and his colleagues were launching to do just this. In addition, he and other Johns Hopkins colleagues were setting up an NIH-sponsored clinical trial in China to test the idea mentioned on this NIH webpage: that chlorophyllin and related drugs might prevent liver cancer.

It was at this point in his career that he collaborated with my research group as part of our project exploring AF’s connection to liver cancer. His laboratory had what I considered to be the best available method for analyzing urinary AF-to-DNA adducts as an estimator of AF exposure, and partnering with him enabled us to better assess its possible relationship with liver cancer mortality rates. Unfortunately for his interests (business and otherwise), there was no relationship—despite our documenting AF exposure in three different ways and despite this being a more comprehensive survey of AF and human liver cancer than all other studies combined.22 He refused to coauthor the paper reporting these findings. Also, his intervention project, in which chlorophyllin was administered to people in rural China, was abandoned after about eight years of NIH funding with no results, to my knowledge.

However, none of this appears on the NIH webpage, and this absence opens the door to and even encourages a variety of lucrative business practices, not least of which are chemical assays to analyze for insignificant amounts of AF (as offered by the company the Johns Hopkins researcher was starting).

This is reductionism—and your tax dollars—at work. Rather than preventing cancer, the NIH’s approach actually serves as a psychological inoculation against true health: “There’s no need to change your diet. You can if you want, but it’s much easier and cheaper to take a pill. And don’t worry, we’ve practically solved the problem by identifying the liver cancer gene. Just give us a few more years and we’ll have a cure.” Comforting words, with serious consequences.

This is the end result of all the political maneuvering and financial pressure we’ve looked at in this chapter: a version of reality shaped more by the profit agendas of Big Pharma, supplement makers, hospitals, surgeons, and suppliers of processed food and industrial meat and dairy than by the truth. If these forces can so strongly influence the pronouncements of a powerful government agency supposedly looking out for our best interests, how can we trust our government’s guidance on how to be healthy?
