Is White Rice a Yellow-Light or Red-Light Food?

Arsenic is not just considered to be a carcinogen; it’s also designated as a “nonthreshold carcinogen, meaning that any dose, no matter how small, carries some cancer risk”—so there really isn’t a “safe” level of exposure. Given that, it may be reasonable to “use the conservative ALARA” approach, reducing exposure As Low As Reasonably Achievable.

I have a low bar for recommending people avoid foods that aren’t particularly health-promoting in the first place. Remember when that acrylamide story broke, about the chemical found concentrated in french fries and potato chips? (See my video Acrylamide in French Fries for more.) My take was pretty simple: Look, we’re not sure how bad this acrylamide stuff is, but we’re talking about french fries and potato chips, which are not healthy anyway. So, I had no problem provisionally bumping them from my list of yellow-light foods into my red-light list, from “minimize consumption” to “ideally avoid on a day-to-day basis.”

One could apply the same logic here. Junk foods made out of brown rice syrup, rice milk, and white rice are not just processed foods, but also arsenic-contaminated processed foods, so they may belong in the red zone as red-light foods we should avoid. What about something like whole brown rice? That is more difficult, because there are pros to help outweigh the cons. I discuss this in my video Is White Rice a Yellow-Light or Red-Light Food?, where you can see a graphical depiction of my traffic light food system at 0:49.

The rice industry argues that the “many health benefits of rice consumption outweigh any potential risk,” which is the same sentiment you hear coming out of Japan about the arsenic-contaminated seaweed hijiki: Yes, “the cancer risk posed by hijiki consumption exceeds this acceptable [cancer risk] level by a factor of 10,” an order of magnitude, but the Japanese Ministry of Health stresses the “possible health benefits,” such as lots of fiber and minerals, as if hijiki was the only weed in the sea. Why not choose any of the other seaweeds and get all the benefits without the arsenic? So, when the rice industry says the “many health benefits of rice consumption outweigh any potential risk,” it’s as if brown rice was the only whole grain on the planet. Can’t you get the whole grain benefits without the risks by eating oatmeal, barley, or quinoa instead? Or, is there some unique benefit to rice, such that we really should try to keep brown rice in our diet?

Consumer Reports recommended moving rice to the yellow-light zone—in other words, don’t necessarily avoid it completely, but moderate your intake. The rice industry, in a fact sheet entitled “The Consumer Reports Article is Flawed,” criticized Consumer Reports for warning people about the arsenic levels in rice, saying “[t]here is a body of scientific evidence that establishes…the nutritional benefits of rice consumption; any assessment of the arsenic levels in rice that fails to take this information into account is inherently flawed and very misleading.” The rice industry cites two pieces of evidence. First, it asserts that rice-consuming cultures tend to be healthier, but is that because of, or despite, their white rice consumption? And what about the fact that rice-eating Americans tend to be healthier? Perhaps, but they also tend to eat significantly less saturated fat. So, once again, how do we know whether it’s because of—or despite—the white rice?

The rice industry could have cited the study I discuss at 3:12 in my video that showed that brown rice intake of two or more servings a week was associated with a lower risk of diabetes, but presumably, the reason it didn’t is because intake of white rice is associated with an increased risk of diabetes, and white rice represents 95 percent of the U.S. rice industry. Switching out a third of a serving of white rice a day for brown rice might lower diabetes risk by 16 percent, but switching out that same white rice for whole grains in general, like oats or barley, might work even better! So, other grains have about ten times less arsenic and are associated with even lower disease risk. No wonder the rice industry doesn’t cite this study.

It does cite the Adventist studies, though, and some in vitro data. For example, in a petri dish, as you can see at 4:05 in my video, there are rice phytonutrients that, at greater and greater doses, can inhibit the growth of colon cancer cells while apparently leaving normal colon cells alone, which is exciting. And, indeed, those who happened to eat those phytonutrients in the form of brown rice once or more a week between colonoscopies had a 40 percent lower risk of developing polyps. (The consumption of green leafy vegetables, dried fruit, and beans were also associated with lower polyp incidence.) But, the only reason we care about the development of polyps is that polyps can turn into cancer. But, there had never been studies on brown rice consumption and cancer…until now, which I discuss in my video Do the Pros of Brown Rice Outweigh the Cons of Arsenic?.


For those unfamiliar with my traffic light system, I talk about it in my book trailer. Check out How Not to Die: An Animated Summary.

Almost there! This is the corresponding article to the 12th in my 13-video series on arsenic in the food supply. If you missed any of the first 11 videos, see:

Ready for the finale? See Do the Pros of Brown Rice Outweigh the Cons of Arsenic?.

And you may be interested in Benefits of Turmeric for Arsenic Exposure.

In health,
Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

How Much Arsenic in Rice Is Too Much?

What are some strategies to reduce arsenic exposure from rice?

Those who are exposed to the most arsenic in rice are those who are exposed to the most rice, like people who are eating plant-based, gluten-free, or dairy-free. So, at-risk populations are not just infants and pregnant women, but also those who may tend to eat more rice. What “a terrible irony for the health conscious” who are trying to avoid dairy and eat lots of whole foods and brown rice—so much so they may not only suffer some theoretical increased lifetime cancer risk, but they may actually suffer arsenic poisoning. For example, a 39-year-old woman had celiac disease, so she had to avoid wheat, barley, and rye, but she turned to so much rice that she ended up with sky-high arsenic levels and some typical symptoms, including “diarrhea, headache, insomnia, loss of appetite, abnormal taste, and impaired short-term memory and concentration.” As I discuss in my video How Much Arsenic in Rice Is Too Much, we, as doctors, should keep an eye out for signs of arsenic exposure in those who eat lots of rice day in and day out.

As you can see at 1:08 in my video, in its 2012 arsenic-in-rice exposé, Consumer Reports recommended adults eat no more than an average of two servings of rice a week or three servings a week of rice cereal or rice pasta. In its later analysis, however, it looked like “rice cereal and rice pasta can have much more inorganic arsenic—a carcinogen—than [its] 2012 data showed,” so Consumer Reports dropped its recommendation down to from three weekly servings to a maximum of only two, and that’s only if you’re not getting arsenic from other rice sources. As you can see from 1:29 in my video, Consumer Reports came up with a point system so people could add up all their rice products for the week to make sure they’re staying under seven points a week on average. So, if your only source of rice is just rice, for example, then it recommends no more than one or two servings for the whole week. I recommend 21 servings of whole grains a week in my Daily Dozen, though, so what to do? Get to know sorghum, quinoa, buckwheat, millet, oatmeal, barley, or any of the other dozen or so common non-rice whole grains out there. They tend to have negligible levels of toxic arsenic.

Rice accumulates ten times more arsenic than other grains, which helps explain why the arsenic levels in urine samples of those who eat rice tend to consistently be higher than those who do not eat rice, as you can see at 2:18 in my video. The FDA recently tested a few dozen quinoa samples, and most had arsenic levels below the level of detection, or just trace amounts, including the red quinoas that are my family’s favorite, which I was happy about. There were, however, still a few that were up around half that of rice. But, overall, quinoa averaged ten times less toxic arsenic than rice. So, instead of two servings a week, following the Consumer Reports recommendation, you could have 20. You can see the chart detailing the quinoa samples and their arsenic levels at 2:20 in my video.

So, diversifying the diet is the number-one strategy to reduce exposure of arsenic in rice. We can also consider alternatives to rice, especially for infants, and minimize our exposure by cooking rice like pasta with plenty of extra water. We found that a 10:1 water-to-rice ratio seemed best, though the data suggest the rinsing doesn’t seem to do much. We can also avoid processed foods sweetened with brown rice syrup. Is there anything else we can do at the dining room table while waiting for federal agencies to establish some regulatory limits?

What if you eat a lot of fiber-containing foods with your rice? Might that help bind some of the arsenic? Apparently not. In one study, the presence of fat did seem to have an effect, but in the wrong direction: Fat increased estimates of arsenic absorption, likely due to the extra bile we release when we eat fatty foods.

We know that the tannic acid in coffee and especially in tea can reduce iron absorption, which is why I recommend not drinking tea with meals, but might it also decrease arsenic absorption? Yes, by perhaps 40 percent or more, so the researchers suggested tannic acid might help, but they used mega doses—17 cups of tea worth or that found in 34 cups of coffee—so it isn’t really practical.

What do the experts suggest? Well, arsenic levels are lower in rice from certain regions, like California and parts of India, so why not blend that with some of the higher arsenic rice to even things out for everybody?

What?!

Another wonky, thinking-outside-the-rice-box idea involves an algae discovered in the hot springs of Yellowstone National Park with an enzyme that can volatize arsenic into a gas. Aha! Researchers genetically engineered that gene into a rice plant and were able to get a little arsenic gas off of it, but the rice industry is hesitant. “Posed with a choice between [genetically engineered] rice and rice with arsenic in it, consumers may decide they just aren’t going to eat any rice” at all.


This is the corresponding article to the 11th in a 13-video series on arsenic in the food supply. If you missed any of the first ten videos, watch them here:

You may also be interested in Benefits of Turmeric for Arsenic Exposure.

Only two major questions remain: Should we moderate our intake of white rice or should we minimize it? And, are there unique benefits to brown rice that would justify keeping it in our diet despite the arsenic content? I cover these issues in the final two videos: Is White Rice a Yellow-Light or Red-Light Food? and Do the Pros of Brown Rice Outweigh the Cons of Arsenic?.

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

What White Blood Cell Count Should We Shoot for?

At the start of my video What Does a Low White Blood Cell Count Mean?, you can see what it looks like when you take a drop of blood, smear it between two pieces of glass, and view at it under a microscope: a whole bunch of little, round, red blood cells and a few big, white blood cells. Red blood cells carry oxygen, while white blood cells are our immune system’s foot soldiers. We may churn out 50 billion new white blood cells a day. In response to inflammation or infection, that number can shoot up to a 100 billion or more. In fact, pus is largely composed of: millions and millions of white blood cells.

Testing to find out how many white blood cells we have at any given time is one of the most common laboratory tests doctors order. It’s ordered it hundreds of millions of times a year. If, for example, you end up in the emergency room with abdominal pain, having a white blood cell count above about 10 billion per quart of blood may be a sign you have appendicitis. Most Americans fall between 4.5 and 10, but most Americans are unhealthy. Just because 4.5 to 10 is typical doesn’t mean it’s ideal. It’s like having a “normal” cholesterol level in a society where it’s normal to die of heart disease, our number-one killer. The average American is overweight, so if your weight is “normal,” that’s actually a bad thing.

In fact, having excess fat itself causes inflammation within the body, so it’s no surprise that those who are obese walk around with two billion more white cells per quart of blood. Given that, perhaps obese individuals should have their own “normal” values. As you can see at 2:06 in my video, if someone with a 47-inch waist walks into the ER with a white blood cell count of 12, 13, or even 14, they may not have appendicitis or an infection. That may just be their normal baseline level, given all the inflammation they have in their body from the excess fat. So, normal levels are not necessarily healthy levels.

It’s like smoking. As you can see at 2:31 in my video, if you test identical twins and one smokes but the other doesn’t, the smoker is going to end up with a significantly higher white cell count. In Japan, for example, as smoking rates have steadily dropped, so has the normal white count range. In fact, it’s dropped such that about 8 percent of men who have never smoked would now be flagged as having abnormally low white counts if you used a cut-off of 4. But, when that cut-off of 4 was set, most people were smoking. So, maybe 3 would be a better lower limit. The inflammation caused by smoking may actually be one of the reasons cigarettes increase the risk of heart attacks, strokes, and other inflammatory diseases. So, do people who have lower white counts have less heart disease, cancer, and overall mortality? Yes, yes, and yes. People with lower white blood cell counts live longer. Even within the normal range, every one point drop may be associated with a 20 percent drop in the risk of premature death.

As you can see at 3:39 in my video, there is an exponential increase in risk in men as white count goes up, even within the so-called normal range, and the same is found for women. The white blood cell count is a “stable, well-standardized, widely available and inexpensive measure of systemic inflammation.” In one study, half of the women around 85 years of age who had started out with white counts under 5.6 were still alive, whereas 80 percent of those who started out over 7 were dead, as you can see at 4:05 in my video—and white blood cell counts of 7, 8, 9, or even 10 would be considered normal. Being at the high end of the normal range may place one at three times the risk of dying from heart disease compared to being at the lower end.

The same link has been found for African-American men and women, found for those in middle age, found at age 75, found at age 85, and found even in our 20s and 30s: a 17 percent increase in coronary artery disease incidence for each single point higher.

As you can see at 5:00 in my video, the higher your white count, the worse your arterial function may be and the stiffer your arteries may be, so it’s no wonder white blood cell count is a useful predictor of high blood pressure and artery disease in your heart, brain, legs, and neck. Even diabetes? Yes, even diabetes, based on a compilation of 20 different studies. In fact, it may be associated with everything from fatty liver disease to having an enlarged prostate. And, having a higher white blood cell count is also associated with an increased risk of dying from cancer. So, what would the ideal range be? I cover that in my video What Is the Ideal White Blood Cell Count?.

A higher white blood cell count may be an important predictor for cardiovascular disease incidence and mortality, decline in lung function, cancer mortality, all-cause mortality, heart attacks, strokes, and premature death in general. This is no surprise, as the number of white blood cells we have circulating in our bloodstreams are a marker of systemic inflammation. Our bodies produce more white blood cells day to day in response to inflammatory insults.

We’ve known about this link between higher white counts and heart attacks since the 1970s, when we found that higher heart attack risk was associated with higher white blood cell counts, higher cholesterol levels, and higher blood pressures, as you can see at 0:53 in my video What Is the Ideal White Blood Cell Count?. This has been found in nearly every study done since then. There are decades of studies involving hundreds of thousands of patients showing dramatically higher mortality rates in those with higher white counts. But why? Why does white blood cell count predict mortality? It may be because it’s a marker of inflammation and oxidation in the body. In fact, it may even be a biomarker for how fast we are aging. It may be more than just an indicator of inflammation—it may also be an active player, contributing directly to disease via a variety of mechanisms, including the actual obstruction of blood flow.

The average diameter of a white blood cell is about seven and a half micrometers, whereas our tiniest vessels are only about five micrometers wide, so the white blood cell has to squish down into a sausage shape in order to squeeze through. When there’s inflammation present, these cells can get sticky. As you can see at 2:20 in my video, a white blood cell may plug up a vessel as it exits a small artery and tries to squeeze into a capillary, slowing down or even momentarily stopping blood flow. And, if it gets stuck there, it can end up releasing all of its internal weaponry, which is normally reserved for microbial invaders, and damage our blood vessels. This may be why in the days leading up to a stroke or heart attack, you may find a spike in the white cell count.

Whether white count is just a marker of inflammation or an active participant, it’s better to be on the low side. How can we reduce the level of inflammation in our body? Staying away from even second-hand smoke can help drop your white count about half of a point. Those who exercise also appear to have an advantage, but you don’t know if it’s cause and effect unless you put it to the test. In one study, two months of Zumba classes—just one or two hours a week—led to about a point and a half drop in white count. In fact, that may be one of the reasons exercise is so protective. But is that just because they lost weight?

Fitness and fatness both appear to play a role. More than half of obese persons with low fitness—51.5 percent—have white counts above 6.6, but those who are more fit or who have less fat are less likely to have counts that high, as you can see at 3:47 in my video. Of course, that could just be because exercisers and leaner individuals are eating healthier, less inflammatory diets. How do we know excess body fat itself increases inflammation, increases the white count? You’d have to find some way to get people to lose weight without changing their diet or exercise habit. How’s that possible? Liposuction. If you suck about a quart of fat out of people, you can significantly drop their white count by about a point. Perhaps this should get us to rethink the so-called normal reference range for white blood cell counts. Indeed, maybe we should revise it downward, like we’ve done for cholesterol and triglycerides.

Until now, we’ve based normal values on people who might be harboring significant background inflammatory disease. But, if we restrict it to those with normal C-reactive protein, another indicator of inflammation, then instead of “normal” being 4.5 to 10, perhaps we should revise it closer to 3 to 9.

Where do the healthiest populations fall, those not suffering from the ravages of chronic inflammatory diseases, like heart disease and common cancers? Populations eating diets centered around whole plant foods average about 5, whereas it was closer to 7 or 8 in the United States at the time. How do we know it isn’t just genetic? As you can see at 5:38 in my video, if you take those living on traditional rural African diets, who have white blood cell counts down around 4 or 5, and move them to Britain, they end up closer to 6, 7, or even 8. Ironically, the researchers thought this was a good thing, referring to the lower white counts on the “uncivilized” diet as neutropenic, meaning having too few white blood cells. They noted that during an infection or pregnancy, when more white cells are needed, the white count came right up to wherever was necessary. So, the bone marrow of those eating traditional plant-based diets had the capacity to create as many white cells as needed but “suffers from understimulation.”

As you can see at 6:26 in my video, similar findings were reported in Western plant eaters, with an apparent stepwise drop in white count as diets got more and more plant based, but could there be non-dietary factors, such as lower smoking rates, in those eating more healthfully? What we need is an interventional trial to put it to the test, and we got one: Just 21 days of removing meat, eggs, dairy, alcohol, and junk affected a significant drop in white count, even in people who started out down at 5.7.

What about patients with rheumatoid arthritis who started out even higher, up around 7? As you can see at 7:03 in my video, there was no change in the control group who didn’t change their diet, but there was a 1.5 point drop within one month on whole food plant-based nutrition. That’s a 20 percent drop. That’s more than the drop-in inflammation one might get quitting a 28-year pack-a-day smoking habit. The most extraordinary drop I’ve seen was in a study of 35 asthmatics. After four months of a whole food plant-based diet, their average white count dropped nearly 60 percent, from around 12 down to 5, though there was no control group nor enough patients to achieve statistical significance.

If white blood cell count is such a clear predictor of mortality and is so inexpensive, reliable, and available, why isn’t it used more often for diagnosis and prognosis? Maybe it’s a little too inexpensive. The industry seems more interested in fancy new risk factors it can bill for.

I touch on the health of the rural Africans I discussed in How Not to Die from Heart Disease.


For more on fighting inflammation, see:

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

What About the Trans Fat in Animal Fat?

The years of healthy life lost due to our consumption of trans fats are comparable to the impact of conditions like meningitis, cervical cancer, and multiple sclerosis. But, if “food zealots” get their wish in banning added trans fats, what’s next? I explore this in my video Banning Trans Fat in Processed Foods but Not Animal Fat.

Vested corporate interests rally around these kinds of slippery slope arguments to distract from the fact that people are dying. New York Mayor Bloomberg was decried as a “meddling nanny” for his trans fat ban and attempt to cap soft drink sizes. How dare he try to manipulate consumer choice! But isn’t that what the food industry has done? “Soft drink portion sizes have grown dramatically, along with Americans’ waistlines.” In 1950, a 12-ounce soda was the king-sized option. Now, it’s the kiddie size. Similarly, with trans fats, it was the industry that limited our choice by putting trans fats into everything without even telling us. Who’s the nanny now?

New York City finally won its trans fat fight, preserving its status as a public health leader. “For example, it took decades to achieve a national prohibition of lead paint, despite unequivocal evidence of harm,” but New York City’s Board of Health led the way, banning it “18 years before federal action.”

There’s irony in the slippery slope argument: First, they’ll come for your fries; next, they’ll come for your burger. After the trans fat oil ban, one of the only remaining sources of trans fat is in the meat itself. “Trans fats naturally exist in small amounts in the fat in meat and milk,” as I’ve discussed before in my video Trans Fat in Meat and Dairy. Before the trans fat ban, animal products only provided about one fifth of America’s trans fat intake, but since the U.S. trans fat ban exempts animal products, they will soon take over as the leading source. As you can see at 2:09 in Banning Trans Fat in Processed Foods but Not Animal Fat, now that added trans fats are banned in Denmark, for example, the only real trans fat exposure left is from animal products found in the U.S. dairy, beef, chicken fat, turkey meat, lunch meat, and hot dogs, with trace amounts in vegetable oils due to the refining process.

The question is: Are animal trans fats as bad as processed food trans fats? As you can see at 2:38 in my video, a compilation of randomized interventional trials found that they both make bad cholesterol go up and they both make good cholesterol go down. So, both animal trans fats and processed food trans fats make the ratio of bad to good cholesterol go up—which is bad. Therefore, all trans fats cause negative effects “irrespective of their origin.” The researchers suspect that also removing natural trans fats from the diet could prevent tens of thousands of heart attacks, but unlike processed foods, you can’t remove trans fats from milk and meat because trans fats are there naturally.

The livestock industry suggests that a little bit of their trans fats might not be too bad, but you saw the same everything-in-moderation argument coming from the Institute of Shortening and Edible Oils after industrial trans fats were first exposed as a threat. The bottom line is “that intake of all sources of trans fat should be minimized.” The trans fat in processed foods can be banned, and just adhering to the current dietary guidelines to restrict saturated fat intake, which is primarily found in meat and dairy, would automatically cut trans fat intake from animal fats.

The reason no progress may have been made on animal trans fat reduction in Denmark is because The Danish Nutrition Council that pushed for the trans fat ban was a joint initiative of The Danish Medical Association and The Danish Dairy Board. They recognized that “the economic support from The Danish Dairy Council could be perceived as problematic” from a scientific integrity point of view, but, not to worry—“The Danish Medical Association expanded the Executive Board and the funding members to also include the Danish pork industry, the Danish meat industry, The Poultry and Egg Council and The Danish Margarine Industry Association.”

If people want to eat trans fat, isn’t that their right? Yes, but only if they’re informed about the risks—yet The Food Industry Wants the Public Confused About Nutrition.

For more on the industry pushback, see my video Controversy Over the Trans Fat Ban.

There does not appear to be a safe level of exposure to trans fat—or to saturated fat or dietary cholesterol, for that matter. See Trans Fat, Saturated Fat, and Cholesterol: Tolerable Upper Intake of Zero.


If you find these videos about industry influence on public policy compelling, check out my many others, including:

Note that the concept of raising or lowering HDL (the so-called good cholesterol) playing a causal role in heart disease has come into question. See Coconut Oil and the Boost in HDL “Good” Cholesterol.

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

The Food Industry’s “model of systemic dishonesty”

In 1993, the Harvard Nurses’ Health Study found that a high intake of trans fat may increase the risk of heart disease by 50 percent. That’s where the trans fat story started in Denmark, ending a decade later with a ban on added trans fats in 2003. It took another ten years before the United States even started considering a ban. All the while, trans fats were killing tens of thousands of Americans every year. With so many people dying, why did it take so long for the United States to even suggest taking action? I explore this in my video Controversy Over the Trans Fat Ban.

One can look at the fight over New York City’s trans fat ban for a microcosm of the national debate. Not surprisingly, opposition came from the food industry, complaining about “government intrusion” and “liken[ing] the city to a ‘nanny state.’” “Are trans fat bans…the road to food fascism?”

A ban on added trans fats might save 50,000 American lives every year, which could save the country tens of billions of dollars in healthcare costs, but not so fast! If people eating trans fat die early, think about how much we could save on Medicare and Social Security. Indeed, “smokers actually cost society less than nonsmokers, because smokers die earlier.” So, “we should be careful about making claims about the potential cost-savings of trans fat bans….more research is needed on the effects of these policies, including effects on the food industry.” Yes, we might save 50,000 lives a year, but we can’t forget to think about the “effects on the food industry”!

How about “education and product labeling” rather than “the extreme measure of banning trans fats”? As leading Danish cardiologist “puts it bluntly, ‘Instead of warning consumers about trans fats and telling them what they are, we’ve [the Danes] simply removed them.’” But we’re Americans! “As they say in North America: ‘You can put poison in food if you label it properly.’”

People who are informed and know the risks should be able to eat whatever they want, but that assumes they’re given all the facts, which doesn’t always happen “due to deception and manipulation by food producers and retailers.” And, not surprisingly, it’s the unhealthiest of foods that are most commonly promoted using deceptive marketing. It’s not that junk food companies are evil or want to make us sick. “The reason is one of simple economics”—processed foods simply “offer higher profit margins and are shelf-stable, unlike fresh foods such as fruit and vegetables.” The food industry’s “model of systemic dishonesty,” some argue, “justifies some minimal level of governmental intervention.”

But is there a slippery slope? “Today, trans fats; tomorrow, hot dogs.” Or, what about the reverse? What if the government makes us eat broccoli? This argument actually came up in the Supreme Court case over Obamacare. As Chief Justice Roberts said, Congress could start ordering everyone to buy vegetables, a concern Justice Ginsburg labeled “the broccoli horrible.” Hypothetically, Congress could compel the American public to go plant-based, however, no one can offer the “hypothetical and unreal possibility…of a vegetarian state” as a credible argument. “Judges and lawyers live on the slippery slope of analogies; they are not supposed to ski it to the bottom,” said one legal scholar.

If anything, what about the slippery slope of inaction? “Government initially defaulted to business interests in the case of tobacco and pursued weak and ineffective attempts at education” to try to counter all the tobacco industry lies. Remember what happened? “The unnecessary deaths could be counted in the millions. The U.S. can ill afford to repeat this mistake with diet.”

Once added trans fats are banned, the only major source in the American diet will be the natural trans fats found in animal fat. For more on this, see Banning Trans Fat in Processed Foods but Not Animal Fat and Trans Fat in Meat and Dairy.

Ideally how much trans fat should we eat a day? Zero, and the same goes for saturated fat and cholesterol. See Trans Fat, Saturated Fat, and Cholesterol: Tolerable Upper Intake of Zero, Good, Great, Bad, and Killer Fats, and Lipotoxicity: How Saturated Fat Raises Blood Sugar.


More on industry hysterics and manipulation in:

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

Armpit Shaving and Breast Cancer

Shaving before applying underarm antiperspirants can increase aluminum absorption. Could this explain the greater number of tumors and the disproportionate incidence of breast cancer in the upper outer quadrant of the breast near the armpit?

A famous case report called “The Mortician’s Mystery,” published in the New England Journal of Medicine back in the 1980s, described a man whose testicles started shrinking and breasts started growing. It turns out the mortician failed to wear gloves as he massaged embalming cream onto corpses. It was concluded there must have been an estrogenic compound in the cream that was absorbed through his skin into his body, one of the first such cases described.

This case was cited as inspiration by a group of researchers who came up with a new theory to explain a breast cancer mystery: Why do most breast cancers occur in the upper outer corner of the breast? The standard explanation was simply because that’s where most of the breast tissue is located, as the so-called tail of the breast extends up to the armpit, but that doesn’t explain the fact that it wasn’t always this way. Indeed, there has been a shift toward the appearance of breast cancer in the upper corner of the breast. And, it also doesn’t explain why “greater genomic instability”––chromosome abnormalities––has been “observed…in outer quadrants of the breast,” which may signal precancerous changes. There definitely seems to be something happening to that outer side of the breast, and it’s something relatively new, occurring in the last 50 years or so.

Is it possible that the increasing use of [underarm] antiperspirant which parallels breast cancer incidence could also be an explanation for the greater number of ductal tumours…and disproportionate incidence of breast cancer in the upper outer quadrant” of the breast near the site where stick, spray, or roll-on is applied? I discuss this possibility in my video Antiperspirants and Breast Cancer, where you can see a graph of U.S. breast cancer incidence and antiperspirant/deodorant sales at 1:38.

There is a free flow of lymph fluid back and forth between the breast and the armpit. If you measure aluminum levels in breasts removed during mastectomies, the “aluminum content of breast tissue in the outer regions [near the armpits]…was significantly higher,” presumably due to the “closer proximity to the underarm” area.

This is a concern because, in a petri dish at least, it has been demonstrated that aluminum is a so-called metalloestrogen, having pro-estrogenic effects on breast cancer cells. Long-term exposure of normal breast tissue cells in a test tube to aluminum concentrations in the range of those found in breasts results in precancerous-type changes. Then, as you can see at 2:41 in my video, once the cells have turned, those same concentrations “can increase the migratory and invasive activity of…human breast cancer cells” in a petri dish. This is important because women don’t die from the tumor in the breast itself, “but from the ability of the cancer cells to spread and grow at distant sites,” like the bones, lungs, liver, or brain. But, we don’t care about petri dishes. We care about people.

In 2002, a paper was published in the Journal of the National Cancer Institute in which the underarm antiperspirant habits of 800 breast cancer survivors were compared with those of women who had never gotten breast cancer, the first study of its kind. The finding? No indication of a link between the two.

Based on this study, Harvard Women’s Health Watch assured women that antiperspirants do not cause breast cancer and “women who are worried that antiperspirants might cause breast cancer can finally rest easy.” But two months later, another study was published that found that “frequency and earlier onset of antiperspirant/deodorant usage with underarm shaving was associated with an earlier age of breast cancer diagnosis.” As you can see at 3:56 in my video, it’s as much as 20 years earlier in women using antiperspirant and shaving their armpits more than three times a week. And, the earlier they started before versus after age 16 appeared to move up their breast cancer diagnosis by 10 or 20 years. The researchers concluded that “underarm shaving with antiperspirant/deodorant use may play a role in breast cancer” after all.

But what does shaving have to do with it? Shaving removes more than just armpit hair. It also removes armpit skin; you end up shaving off the top skin layer. And, while there is very little aluminum absorption through intact skin, when you strip off the outer layer with a razor and then rub on an antiperspirant, you get a six-fold increase in aluminum absorption through the skin. Though this is good news for women who don’t shave, the high transdermal, or through-the-skin, aluminum uptake on shaved skin “should compel antiperspirant manufacturers to proceed with the utmost caution.”

Both European safety authorities and the U.S. Food and Drug Administration (FDA) specifically advise against using aluminum antiperspirants on damaged or broken skin. However, shaving before antiperspirant application “can create abrasions in the skin…thereby negating the specific warning by the FDA and EU.” (I’m sure everyone knows about the FDA’s cautionary advice, having read Title 21 Part 350 Subpart C50-5c1 of the Code of Federal Regulations.)

We get so much aluminum in our diet from processed foods—such as anticaking agents in pancake mix, melting agents in American cheese, meat binders, gravy thickeners, baking powder, and candy—that the contribution from underarm antiperspirants would presumably be minimal in comparison. “But everything was turned topsy-turvy in 2004,” when a case was reported of a woman with bone pain and fatigue suffering from aluminum toxicity. Within months of stopping the antiperspirant, which she had been applying daily to her regularly shaved armpits, her aluminum levels came down and her symptoms resolved. Although not everyone absorbs that much aluminum, the case “suggests that caution should be exercised when using aluminum-containing antiperspirants frequently.”

Recently, as you can see at 6:29 in my video, it was shown that women with breast cancer have twice the level of aluminum in their breasts compared with women without breast cancer, though this doesn’t prove cause and effect. Maybe the aluminum contributed to the cancer, or maybe the cancer contributed to the aluminum. Maybe tumors just absorb more aluminum. Subsequent research has suggested this alternative explanation is unlikely. So, where do we stand now?

The latest review on the subject concluded that as a consequence of the new data, given that aluminum can be toxic and we have no need for it, “reducing the concentration of this metal in antiperspirants is a matter of urgency.” Or, at the very least, the label should warn: “Do not use after shaving.” Of course, we could cease usage of aluminum-containing antiperspirants altogether, but then wouldn’t we smell? Ironically, antiperspirants can make us stink worse. They increase the types of bacteria that cause body odor. It’s like the story with antidepressant drugs, which can actually make one more depressed in the long run (as I discuss in my video Do Antidepressant Drugs Really Work?). The more we use antiperspirants, the more we may need them, which is awfully convenient for a billion-dollar industry.

Is there any way to decrease body odor through changes in diet? An early video of mine discusses Body Odor and Diet, and I have some new updated ones coming down the pike!


What else can we do to decrease breast cancer risk? See, for example:

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

What Exercise Authorities Don’t Tell You About Optimal Duration

Physical fitness authorities seem to have fallen into the same trap as the nutrition authorities, recommending what they think may be achievable, rather than simply informing us of what the science says and letting us make up our own minds.

Researchers who accept grants from The Coca-Cola Company may call physical inactivity “the biggest public health problem of the 21st century,” but, in actually, physical inactivity ranks down at number five in terms of risk factors for death in the United States and even lower in terms of risk factors for disability, as you can see at 0:17 in my video How Much Should You Exercise? What’s more, inactivity barely makes the top ten globally. As we’ve learned, diet is our greatest killer by far, followed by smoking.

Of course, that doesn’t mean you can just sit on the couch all day. Exercise can help with mental health, cognitive health, sleep quality, cancer prevention, immune function, high blood pressure, and life span extension, topics I cover in some of my other videos. If the U.S. population collectively exercised enough to shave just 1 percent off the national body mass index, 2 million cases of diabetes, one and a half million cases of heart disease and stroke, and 100,000 cases of cancer might be prevented.

Ideally, how much should we exercise? The latest official “Physical Activity Guidelines for Americans” recommends adults get at least 150 minutes a week of moderate aerobic exercise, which comes out to be a little more than 20 minutes a day. That is actually down from previous recommendations from the Surgeon General, as well as from the Centers for Disease Control and Prevention (CDC) and the American College of Sports Medicine, which jointly recommend at least 30 minutes each day. The exercise authorities seem to have fallen into the same trap as the nutrition authorities, recommending what they think may be achievable, rather than simply informing us what the science says and letting us make up our own minds. They already emphasize that “some” physical activity “is better than none,” so why not stop patronizing the public and just tell everyone the truth?

As you can see at 2:16 in my video, walking 150 minutes a week is better than walking 60 minutes a week, and following the current recommendations for 150 minutes appears to reduce your overall mortality rate by 7 percent compared with being sedentary. Walking for just 60 minutes a week only drops your mortality rate about 3 percent, but walking 300 minutes weekly lowers overall mortality by 14 percent. So, walking twice as long—40 minutes a day compared with the recommended 20 daily minutes—yields twice the benefit. And, an hour-long walk each day may reduce mortality by 24 percent. I use walking as an example because it’s an exercise nearly everyone can do, but the same applies to other moderate-intensity activities, such as gardening or cycling.

A meta-analysis of physical activity dose and longevity found that the equivalent of about an hour a day of brisk walking at four miles per hour was good, but 90 minutes was even better. What about more than 90 minutes? Unfortunately, so few people exercise that much every day that there weren’t enough studies to compile a higher category. If we know 90 minutes of exercise a day is better than 60 minutes, which is better than 30 minutes, why is the recommendation only 20 minutes? I understand that only about half of Americans even make the recommended 20 daily minutes, so the authorities are just hoping to nudge people in the right direction. It’s like the Dietary Guidelines for Americans advising us to “eat less…candy.” If only they’d just give it to us straight. That’s what I try to do with NutritionFacts.org.

Most of the content in my book How Not to Die came from my video research, but this particular video actually sprung from the book. I wanted to include exercise in my Daily Dozen list, but needed to do this research to see what was the best “serving size.”

I wish someone would start some kind of FitnessFacts.org website to review the exercise literature. I’ve got my brain full with the nutrition stuff—though there’s so much good information I don’t have time to review that there could be ten more sites just covering nutritional science!


For more on all that exercise can do for our bodies and minds, see

Some tips for maximizing the benefits:

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

The Crowding Out Strategy to Eating Healthier

It may be more expedient politically to promote an increase in consumption of healthy items rather than a decrease in consumption of unhealthy items, but it may be far less effective.

The World Health Organization has estimated that more than a million deaths “worldwide are linked to low fruit and vegetable consumption.” What can be done about it? I explore this in my video Is it Better to Advise More Plants or Less Junk?

There’s always appealing to vanity. A daily smoothie can give you a golden glow as well as a rosy glow, both of which have been shown to “enhance healthy appearance” in Caucasian, Asian, and African skin tones, as you can see at 0:24 in my video.

What about giving it away for free?

A free school fruit scheme was introduced in Norway for grades 1 through 10. Fruit consumption is so powerfully beneficial that if kids ate only an additional 2.5 grams of fruit a day, the program would pay for itself in terms of saving the country money. How much is 2.5 grams? The weight of half of a single grape. However, that cost-benefit analysis assumed this minuscule increased fruit consumption would be retained through life. It certainly seemed to work while the program was going on, with a large increase in pupils eating fruit, but what about a year after the free fruit program ended? The students were still eating more fruit. They were hooked! Three years later? Same thing. Three years after they had stopped getting free fruit, they were still eating about a third of a serving more, which, if sustained, is considerably more than necessary for the program to pay for itself.

There were also some happy side effects, including a positive spillover effect where not only the kids were eating more fruit, but their parents started eating more, too. And, although the “intention of these programs was not to reduce unhealthy snack intakes,” that’s exactly what appeared to happen: The fruit replaced some of the junk. Increasing healthy choices to crowd out the unhealthy ones may be more effective than just telling kids not to eat junk, which could actually backfire. Indeed, when you tell kids not to eat something, they may start to want it even more, as you can see at 2:20 in my video.

Which do you think worked better? Telling families to increase plants or decrease junk? Families were randomly assigned to one of two groups, either receiving encouragement to get at least two servings of fruits and veggies a day, with no mention of decreasing junk, or being encouraged to get their junk food intake to less than ten servings a week, with no mention of eating more fruits and veggies. What do you think happened? The Increase Fruit and Vegetable intervention just naturally “reduced high-fat/high-sugar intake,” whereas those in the Decrease Fat and Sugar group cut back on junk but didn’t magically start eating more fruits and vegetables.

This crowding out effect may not work on adults, though. As you can see at 3:12 in my video, in a cross-section of over a thousand adults in Los Angeles and Louisiana, those who ate five or more servings of fruits and veggies a day did not consume significantly less alcohol, soda, candy, cookies, or chips. “This finding suggests that unless the excessive consumption of salty snacks, cookies, candy, and sugar-sweetened beverages”—that is, junk—“is curtailed, other interventions…[may] have a limited impact….It may be politically more expedient to promote an increase in consumption of healthy items rather than a decrease in consumption of unhealthy items, but it may be far less effective.” In most public health campaigns, “messages have been direct and explicit: don’t smoke, don’t drink, and don’t take drugs.” In contrast, food campaigns have focused on eat healthy foods rather than cut out the crap. “Explicit messages against soda and low-nutrient [junk] foods are rare.”

In the United States, “if one-half of the U.S. population were to increase fruit and vegetable consumption by one serving each per day, an estimated 20,000 cancer cases might be avoided each year.” That’s 20,000 people who would not have gotten cancer had they been eating their fruits and veggies. The U.S. Department of Agriculture recommends we “fill half [our] plate with colorful fruits and vegetables,” but less than 10 percent of Americans hit the recommended daily target. Given this sorry state of affairs, should we even bother telling people to strive for “5 a day,” or might just saying “get one more serving than you usually do” end up working better? Researchers thought that “the more realistic ‘just 1 more’ goal would be more effective than the very ambitious ‘5 a day’ goal,” but they were wrong.

As you can see at 4:56 in my video, those told to eat one more a day for a week, ate about one more a day for a week, and those told to eat five a day for a week did just that, eating five a day for a week. But here’s the critical piece: One week after the experiment was over, the group who had been told to eat “5 a day” was still eating about a serving more, whereas the “just 1 more” group went back to their miserable baseline. So, more ambitious eating goals may be more motivating. Perhaps this is why “in the US ‘5 a day’ was replaced by the ‘Fruits and Veggies—More Matters’ campaign…in which a daily consumption of 7–13 servings of fruits and vegetables – FVs –  is recommended.” However, if the recommendation is too challenging, people may just give up. So, instead of just sticking with the science, policy makers evidently need to ask themselves questions like “How many servings are regarded as threatening?”


For more on appealing to vanity to improve fruit and vegetable consumption, see my videos Eating Better to Look Better and Beauty Is More Than Skin Deep.

What does the science say about smoothies? See:

The flipside of free fruit programs is to tax instead of subsidize. Learn more by checking out my video Would Taxing Unhealthy Foods Improve Public Health?

For more on the paternalistic attitude that you don’t care enough about your health to be told the truth, see my videos Everything in Moderation? Even Heart Disease? and Optimal Diet: Just Give It to Me Straight, Doc.

I explore this same patronizing attitude when it comes to physical activity in How Much Should You Exercise?

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

 

One Way to Treat Asthma and Autoimmune Diseases with Diet

Cutting two teaspoons of salt’s worth of sodium from one’s daily diet can significantly improve lung function in asthmatics

In the 1960s and 1970s, a mystery was emerging. Why were childhood asthma rates between 2 to 5 percent in the developed world but as low as 0.007 percent in the developing world? For example, in the developing world, instead of 1 in 20 kids affected, or even 1 in 50 kids, it could be more like 1 in 10,000 kids—extremely rare. And, when kids moved from a low-risk area to a high-risk area, their risk went up. What was going on? Were they exposed to something new? Did they leave some protective factor behind?

As I discuss in my video How to Treat Asthma with a Low-Salt Diet, all the way back in 1938, scientists showed they could stop asthma attacks by lowering children’s sodium levels. That was done with a diuretic drug, but subsequent dietary experiments showed that diets high in salt seemed to increase asthmatic symptoms, while “lowering the salt decreased the asthmatic symptoms…” This body of evidence was apparently forgotten…until it was picked up again in the 1980s as a possible explanation for why Western countries had higher asthma rates.

Maybe it was the salt.

As you can see at 1:34 in my video, researchers graphed out childhood death from asthma versus family salt purchases, and it seemed more salt meant more death. Just because a family buys more salt doesn’t necessarily mean the kids are eating more, though. The way to find out how much salt someone is eating is to collect their urine over a 24-hour period and measure the amount of sodium, since how much salt we eat is pretty much how much salt we excrete. The way to test for asthma, called a bronchial challenge test, is to look for an exaggerated response to an inhaled chemical. And, indeed, there was a strong correlation between how their lungs reacted and how much sodium they were taking in. However, there are all sorts of food additives, like preservatives, that can trigger these so-called hypersensitivity reactions, so maybe high sodium intake was just a marker for high processed food intake. Maybe it wasn’t the salt at all.

Or maybe it was other components of the diet. For example, the reason sodium may be a risk factor for another inflammatory disease, rheumatoid arthritis, may be that sodium intake is just a marker for increased fish and other meat intake or decreased fruit and vegetable intake. We needed a study where researchers would take asthmatics, change the amount of salt in their diets, and see what happened—and that’s just what came next.

As you can see at 3:16 in my video, researchers doubled the salt intake of ten asthmatics, and lung sensitivity worsened in nine out of ten. There was no control group, though. Is it possible the subjects would have gotten worse anyway?

In a randomized, double-blind, placebo-controlled trial, researchers put everyone on a low-salt diet, but then gave half of the subjects sustained-release sodium pills to bring their salt intake back up to a more normal level and the other half a placebo. After five weeks, the groups switched regimes for another five weeks. That’s how you can randomize people to a true low-sodium diet without them even realizing it. Genius! So what happened? Asthmatics on the salt got worse. Their lung function got worse, their asthma symptoms got worse, and they had to take more puffs on their inhalers. This study compared asthmatics consuming about three teaspoons’ worth of salt a day to those consuming less than one, so they were effectively able to drop their sodium intake by two teaspoons’ worth of salt, as you can see at 4:04 in my video. If you do a more “pragmatic” trial and only effectively reduce people’s salt intake by a half a teaspoon a day, it doesn’t work.

Even if you are able to cut down your sodium intake enough to get a therapeutic effect, though, it should be considered an adjunct treatment. Do not stop your asthma medications without your doctor’s approval.

Millions suffer from asthma attacks triggered by exercise. Within five minutes of starting to exercise, people can get short of breath and start coughing and wheezing such that lung function significantly drops, as you can see at 0:19 in my video Sodium and Autoimmune Disease: Rubbing Salt in the Wound?. On a high-salt diet, however, the attack is even worse, whereas on a low-salt diet, there’s hardly a significant drop in function at all. To figure out why, researchers had the subjects cough up sputum from their lungs and found that those on the high-salt diet had triple the inflammatory cells and up to double the concentration of inflammatory mediators, as you can see at 0:43 in my video. But why? What does salt intake have to do with inflammation? We didn’t know…until now.

“The ‘Western diet,’ high in saturated fatty acids and salt, has long been postulated as one potential…cause for the increasing incidence of autoimmune diseases in developed countries…” The rapidly increasing incidence of autoimmune diseases may be due to an overactivation of immune cells called T helper 17 (Th17) cells. “The development of…multiple sclerosis, psoriasis, type I diabetes, Sjögren’s syndrome, asthma, and rheumatoid arthritis are all shown to involve Th17-driven inflammation,” and one trigger for the activation of those Th17 cells may be elevated levels of salt in our bloodstream. “The sodium content of processed foods and ‘fast foods’…can be more than 100 times higher in comparison to similar homemade meals.”

And, sodium chloride—salt—appears to drive autoimmune disease by inducing these disease-causing Th17 cells. It turns out there is a salt-sensing enzyme responsible for triggering the formation of these Th17 cells, as you can see at 2:07 in my video.

Organ damage caused by high-salt diets may also activate another type of inflammatory immune cell. A high-salt diet can overwork the kidneys, starving them of oxygen and triggering inflammation, as you can see at 2:17 in my video. The more salt researchers gave people, the greater the activation of inflammatory monocyte cells associated with this salt-induced kidney oxygen deficiency. But that study only lasted two weeks. What happens over the long term?

One of the difficulties in doing sodium experiments is that it’s hard to get free-living folks to maintain a specific salt intake. You can do so-called metabolic ward studies, where people are essentially locked in a hospital ward for a few days and their food intake is controlled, but you can’t do that long term—unless you can lock people in a space capsule. Mars520 was a 520-day space flight simulation to see how people might do on the way to Mars and back. As you can see at 3:17 in my video, the researchers found that those on a high-salt diet “displayed a markedly higher number of monocytes,” which are a type of immune cell you often see increased in settings of chronic inflammation and autoimmune disorders. This may “reveal one of the consequences of excess salt consumption in our everyday lives,” since that so-called high salt intake may actually just be the average salt intake. Furthermore, there was an increase in the levels of pro-inflammatory mediators and a decrease in the level of anti-inflammatory mediators, suggesting that a “high-salt diet had a potential to bring about an excessive immune response,” which may damage the immune balance, “resulting in either difficulties on getting rid of inflammation or even an increased risk of autoimmune disease.”

What if you already have an autoimmune disease? In the study titled “Sodium intake is associated with increased disease activity in multiple sclerosis,” researchers followed MS patients for a few years and found that those patients eating more salt had three to four times the exacerbation rate, were three times more likely to develop new MS lesions in their brains, and, on average, had 8 more lesions in their brain—14 lesions compared to 6 in the low-salt group. The next step is to try treating patients with salt reduction to see if they get better. But, since reducing our salt intake is a healthy thing to do anyway, I don’t see why we have to wait.


What else can we do for asthma? See:

Have you heard that salt reduction was controversial? That’s what the processed food industry wants you to think. Check out the science in:

What are some of the most powerful dietary interventions we have for autoimmune disease? See, for example:

In health,

Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

Medical Meat Bias

When famed surgeon Michael DeBakey was asked why his studies published back in the 1930s linking smoking and lung cancer were ignored, he had to remind people about what it was like back then. We were a smoking society. Smoking was in the movies, on airplanes. Medical meetings were held in “a heavy haze of smoke.” Smoking was, in a word, normal. Even the congressional debates over cigarettes and lung cancer took place in literal smoke-filled rooms. (This makes me wonder what’s being served at the breakfast buffets of the Dietary Guidelines Committee meetings these days.)

I’ve previously talked about a famous statistician by the name of Ronald Fisher, who railed against what he called “propaganda…to convince the public that cigarette smoking is dangerous.” “Although Fisher made invaluable contributions to the field of statistics, his analysis of the causal association between lung cancer and smoking was flawed by an unwillingness to examine the entire body of data available…” His smokescreen may have been because he was a paid consultant to the tobacco industry, but also because he was himself a smoker. “Part of his resistance to seeing the association may have been rooted in his own fondness for smoking,” which makes me wonder about some of the foods nutrition researchers may be fond of to this day.

As I discuss in my video Don’t Wait Until Your Doctor Kicks the Habit, it always strikes me as ironic when vegetarian researchers are forthright and list their diet as a potential conflict of interest, whereas not once in the 70,000 articles on meat in the medical literature have I ever seen a researcher disclose her or his nonvegetarian habits––because it’s normal. Just like smoking was normal.

How could something that’s so normal be bad for you? And, it’s not as if we fall over dead after smoking one cigarette. Cancer takes decades to develop. “Since at that time most physicians smoked and could not observe any immediate deleterious effects, they were skeptical of the hypothesis and reluctant to accept even the possibility of such a relation”—despite the mountain of evidence.

It may have taken 25 years for the Surgeon General’s report to come out and longer still for mainstream medicine to get on board, but now, at least, there are no longer ads encouraging people to “Inhale to your heart’s content!” Instead, today, there are ads from the Centers for Disease Control and Prevention fighting back.

For food ads, we don’t have to go all the way back to old ads touting “Meat…for Health Defense” or “Nourishing Bacon,” or featuring doctors prescribing meat or soda, or moms relieved that “Trix are habit-forming, thank heavens!” You know things are bad when the sanest dietary advice comes from cigarette ads, as in Lucky Strike’s advertisements proclaiming “More Vegetables––Less Meat” and “Substitute Oatmeal for White Flour.” (You can see these vintage ads at 2:34 in my video.)

In modern times, you can see hot dogs and sirloin tips certified by the American Heart Association, right on their packaging. And, of all foods, which was the first to get the Academy of Nutrition and Dietetics’ “Kids Eat Right” logo on its label? Was it an apple? Broccoli, perhaps? Nope, it was a Kraft prepared cheese product.

Now, just as there were those in the 1930s, 40s, and 50s at the vanguard trying to save lives, today, there are those transforming ads about what you can do with pork butt into ads about what the pork can do to your butt: “Hot Dogs Cause Butt Cancer—Processed meats increase colorectal cancer risk” reads an ad for the Physicians Committee for Responsible Medicine’s “Meat Is the New Tobacco” campaign, which you can see at 3:56 in my video. As Dr. Barnard, PCRM president, tried to convey in an editorial published in the American Medical Association’s Journal of Ethics, “Plant-based diets are the nutritional equivalent of quitting smoking.”

How many more people have to die before the Centers for Disease Control encourages people not to wait for open-heart surgery to start eating healthfully?

Just as we don’t have to wait until our doctor stops smoking to give up cigarettes ourselves, we don’t have to wait until our doctor takes a nutrition class or cleans up his or her diet before choosing to eat healthier. No longer do doctors hold a professional monopoly on health information. There’s been a democratization of knowledge. So, until the system changes, we have to take personal responsibility for our health and for our family’s health. We can’t wait until society catches up with the science again, because it’s a matter of life and death.

Dr. Kim Allan Williams, Sr., became president of the American College of Cardiology a few years back. He was asked why he follows his own advice to eat a plant-based diet. “I don’t mind dying,” Dr. Williams replied. “I just don’t want it to be my fault.”


I find this to be such a powerful concept that I have come at it from different angles. For other takes, check out Taking Personal Responsibility for Your Health and How Smoking in 1959 Is Like Eating in 2019. Are the health effects of smoking really comparable to diet, though? Check out Animal Protein Compared to Cigarette Smoking.

The food industry certainly uses the same kind of misinformation tactics to try to confuse consumers. See, for example:

In health,
Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations:

Does Aspartame Cause Lymphoma?

The approval of aspartame has a controversial history. The Commissioner of the U.S. Food and Drug Administration (FDA) concluded that “there is a reasonable certainty that human consumption of aspartame: (1) …will not pose a risk of brain damage resulting in mental retardation, endocrine [hormonal] dysfunction, or both; and (2) will not cause brain tumors.” However, the FDA’s own Public Board of Inquiry withdrew their approval over cancer concerns. “Further, several FDA scientists advised against the approval of aspartame, citing…[the aspartame company’s] own brain tumor tests…” Regardless, the Commissioner approved aspartame before he left the FDA and went on to enjoy a thousand-dollar-a-day consultancy position with the aspartame company’s PR firm. Then, the FDA actually prevented the National Toxicology Program (NTP) from doing further cancer testing. As I discuss in my video Does Aspartame Cause Cancer?, we were then left with people battling over different rodent studies, some of which showed increased cancer risk, while others didn’t.

This reminds me of the saccharin story. That artificial sweetener caused bladder cancer in rats but not mice, leaving us “to determine whether humans are like the rat or like the mouse.” Clearly, we had to put the aspartame question to the test in people, but the longest human safety study lasted only 18 weeks. We needed better human data.

Since the largest rat study highlighted lymphomas and leukemias, the NIH-AARP study tracked blood cancer diagnoses and found that “[h]igher levels of aspartame intake were not associated with the risk of…cancer.” Although the NIH-AARP study was massive, it was criticized for only evaluating relatively short-term exposure. Indeed, people were only studied for five years, which is certainly better than 18 weeks, but how about 18 years?

All eyes turned to Harvard, where researchers had started following the health and diets of medical professionals before aspartame had even entered the market. “In the most comprehensive long-term [population] study…to evaluate the association between aspartame intake and cancer risk in humans,” they found a “positive association between diet soda and total aspartame intake and risks of [non-Hodgkin’s lymphoma] and multiple myeloma in men and leukemia in both men and women,” as you can see at 2:12 in my video. Why more cancer in men than women? A similar result was found for pancreatic cancer and diet soda, but not soda in general. In fact, the only sugar tied to pancreatic cancer risk was the milk sugar, lactose. The male/female discrepancy could have simply been a statistical fluke, but the researchers decided to dig a little deeper.

Aspartame is broken down into methanol, which is turned into formaldehyde, “a documented human carcinogen,” by the enzyme alcohol dehydrogenase. The enzyme that detoxifies regular alcohol is the very same enzyme that converts methanol to formaldehyde. Is it possible men just have higher levels of this enzyme than women? Yes, which is why women get higher blood alcohol levels than men after drinking the same amount of alcohol. If you look at liver samples from men and women, you can see significantly greater enzyme activity in the men, so perhaps higher conversion rates from aspartame to formaldehyde explain the increased cancer risk in men. How do we test this?

Ethanol—regular alcohol—competes with methanol for this same enzyme’s attention. In fact, regular alcohol is actually “used as an antidote for methanol poisoning.” So, if this formaldehyde theory is correct, men who don’t drink alcohol or drink very little may have higher formaldehyde conversion rates from aspartame. And, indeed, consistent with this line of reasoning, the men who drank the least amounts of alcohol appeared to have the greatest cancer risk from aspartame.
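For readers who want to see that mechanism in symbols: this is textbook competitive inhibition, where ethanol inflates the enzyme's effective Michaelis constant for methanol by a factor of (1 + [ethanol]/Ki), slowing formaldehyde production. The sketch below uses entirely made-up rate constants, chosen only to illustrate the direction of the effect.

```python
# Textbook competitive-inhibition (Michaelis–Menten) sketch: ethanol and
# methanol competing for alcohol dehydrogenase. Every constant below is an
# illustrative placeholder, not a measured value.

def methanol_oxidation_rate(methanol: float, ethanol: float,
                            vmax: float = 1.0, km: float = 0.5,
                            ki: float = 0.05) -> float:
    """Relative rate of methanol -> formaldehyde conversion, with ethanol
    acting as a competitive inhibitor:
        v = Vmax*[S] / (Km*(1 + [I]/Ki) + [S])
    """
    return vmax * methanol / (km * (1 + ethanol / ki) + methanol)

for ethanol in (0.0, 0.5, 5.0):  # no, light, and heavy drinking (arbitrary units)
    rate = methanol_oxidation_rate(methanol=0.1, ethanol=ethanol)
    print(f"ethanol {ethanol:4.1f}  ->  formaldehyde production rate {rate:.3f}")
# The rate falls as ethanol rises, since the enzyme is kept busy with ethanol
# instead, which is consistent with ethanol's use as an antidote for
# methanol poisoning.
```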

A third cohort study has since been published and found no increased lymphoma risk associated with diet soda during a ten-year follow-up period. So, no risk was detected in the 18-week study, the 5-year study, or the 10-year study—only in the 18-year study. What should we make of all this?

Some have called for a re-evaluation of the safety of aspartame. The horse is kind of out of the barn at this point, with 34 million pounds of aspartame produced annually, but that doesn’t mean we have to eat it, and pregnant women and children, perhaps, have the most reason to avoid it.


For more information on the effects of aspartame, watch my videos Aspartame and the Brain and Aspartame-Induced Fibromyalgia. Interested in learning more about the effects of consuming diet soda? See, for example:

What about Splenda? Or monk fruit sweetener? I have videos on those, too—watch Effect of Sucralose (Splenda) on the Microbiome and Is Monk Fruit Sweetener Safe?.

I also do a comparison of the most popular sweeteners on the market, including stevia and xylitol, in my video A Harmless Artificial Sweetener.

Perhaps the best candidate is erythritol, which you can learn about in my video Erythritol May Be a Sweet Antioxidant. That said, it’s probably better if we get away from all intense sweeteners, artificial or not. See my video Unsweetening the Diet for more on this.

In health,
Michael Greger, M.D.

PS: If you haven’t yet, you can subscribe to my free videos here and watch my live presentations: