24. The Owner's Manual: Repairing the Damage and Restoring Health
Prior to 1900, almost all of the corn eaten in the US was processed using the Native American alkali or lime method (called nixtamalization). Shortly after 1900, corn processing changed to the degermination method, in which the germ (corn embryo) was removed prior to milling the grain. The main reason for the change was that the oils in the germ tended to degrade quickly, so the shelf life of the ground corn was short. The rancid oils gave the corn a bad smell. (To experience the smell of rancid corn oil, open an old bag of corn chips and breathe deeply.) Unfortunately, the germ also contains all of the beneficial oils and nutrients other than starch.
Almost immediately, an interesting disease broke out (mostly) in the southern United States among the poorest people in the region: pellagra.[i] This disease is identified by the 3 D’s: Dermatitis, Diarrhea, and Delirium, which are followed by a fourth D: Death. Victims with advanced cases of pellagra were often committed to insane asylums as a result of the delirium, and the Southern asylums quickly became overcrowded.
With a death rate of about 40%, as many as 10,000 people were dying each year by 1915, and as many as 100,000 deaths from some 3 million cases occurred over the course of about 30 years. Officials at every level looked high and low for the culprit, which they felt had to be a pathogen, but no infectious agent could be associated with the disease. In no small part, the insistence of politicians and administrators on looking only for an infectious source was the largest obstacle to solving the problem.
The Southern states were still recovering from the disastrous cultural and economic effects of the Civil War and politicians were very sensitive to any implication that the South was anything but modern and equal to states in the North. Thus, any suggestion that pellagra could be a result of malnutrition in regions that were actually net producers of food was considered an insult and was summarily dismissed as an underlying cause of the problem.
However, the source of the problem, eventually discovered and only later accepted, was the basic diet of poor Southerners: the 3M diet of meat, molasses, and meal. The typical affordable diet among the poor was corn meal (starch) and cane syrup (sugar) supplemented by fatty pork when it was available. Ultimately, the cause of pellagra was found to be a near absence of niacin (vitamin B3), a direct result of the shift to the degermination method of processing corn, which eliminated the most common natural source of niacin (and of tryptophan, which the body can convert to niacin) in the 3M diet.
The pellagra epidemic is a good example of a subtle dietary change that had profound effects on public health. The change was no more than a modernization of the way we handled one common food component (corn) in our diet. Of course, as soon as the source of the problem was recognized, we stopped processing corn in that manner and the problem with pellagra disappeared. No, I’m just kidding. We really didn’t change anything. We still process most corn in that manner, but we make sure to add vitamins back into the meal. In fact, the reason so many of our grain-based foods are labeled as “enriched”, such as almost everything containing wheat flour, is precisely because of this legacy of pellagra and the removal of important nutrients for more efficient processing and a longer shelf life.
This, in fact, may be a greater source of our current health problems than any other single cause: the food industry is convinced (and fervently wishes to convince us) that food supplemented with nutrients is the same as the real deal. Michael Pollan has written extensively on this problem of looking at “food” as a combination of nutrients.[ii]
All of us are guilty of thinking that taking a supplement (for example, a daily vitamin) is more or less equivalent to balancing an otherwise unbalanced diet. We accept the constant barrage of advice from the food, drug, and supplement industries, which suggest that merely adding a missing component is the same as eating actual food and that taking a medicine that replaces a function is the same as having the natural function. And now we are adding the microbiome as just another layer on top of the artificial edifice we call our modern diet.
The Big Picture is the Big Problem
As we try to get a handle on food and the microbiome and what that means for our health, we find there is an additional twist that we must recognize and respect. We are trying to juggle the multiple variables involved, but they all seem to be changing at the same time; the sand is shifting under our feet. We have modified every aspect of our social, cultural, and natural environment, but we are drifting further and further away from what our bodies understand naturally. While we may feel that making such adjustments is easy, our physiological and microbiome selves may not be nearly as amenable to some of these changes.
In some way or another most of the changes to our environment have been changes in chemistry. The microbiome is a chemical world; the bacteria in the microbiome live in and understand a chemical world. We, as humans, do not. We try hard to recognize which chemicals are directly harmful to our physiology, but we do not understand and will not understand until it is way too late just how chemicals affect us in indirect ways.
By the time we identify a complex health syndrome, we are typically too far into the weeds to be able to tease apart the many different factors that have contributed to the disease. It takes years to recognize the subtle environmental changes that lead to health complications, and by the time we do, if we do, we have made many additional changes to the environment. The good news is that the microbiome can probably tell us within hours to weeks which chemical changes are negative. We have to listen, though. And even more than that, we have to let the microbiome tell us which chemical changes are positive, and this is a focus we have neglected until now.
The fad diets, including the FDA/USDA guidelines, are all focused on food with the goal of avoiding “the bad stuff” and emphasizing “the good stuff.” Quite often, we think of a “good” diet as one that doesn’t have overtly “bad” things in it rather than approaching the issue in exactly the opposite way. The definition of good and bad has been notoriously difficult to determine. Pinpointing obviously “bad” foods is like labeling a particular bacterium as “bad”: that is, it’s entirely contextual.
Until recently, the American Medical Association had worked assiduously (and struggled mightily) to create and modify recommendations regarding the best practices for avoiding and reducing a serious health problem in America. Finally, the AMA threw in the towel and now just says, “we have no recommendation”. The bad guy? Dietary cholesterol.
After decades of chasing this hoodlum to collect evidence of wrongdoing, the medical and science detectives have failed to get a conviction. Why? Because cholesterol is a natural and absolutely necessary component of our system. Our bodies make cholesterol, and if we don’t eat enough, our bodies compensate by making more. There are no effective ways to modify the diet to consistently and predictably lower blood cholesterol, and yet we know that high blood cholesterol is linked to cardiovascular disease. So, is there no solution? Are we doomed?
The AMA says simply: exercise more. Why? Because exercise changes the context. Exercise changes the internal physiological environment in a hundred different direct and indirect ways. And people who exercise routinely also change their lifestyle and that means what they eat, how much, and when. And the quality of the food goes up when we are paying attention to how food makes us feel and when we are taking responsibility for our health. Similarly, when we are conscious of our health, it is easier to follow the basic admonition: “excess is bad, moderation is good”.
The failure of the AMA to design written-in-stone recommendations for managing blood cholesterol should be seen as a positive outcome. It reflects a shift (albeit minor) in the medical world away from attempting to explain our health as a function of single variables. Of course, there are single variables that greatly affect our health, such as pathogens like the Ebola virus and the Bubonic Plague bacterium. But diseases that take years to manifest (such as cardiovascular diseases) and that only some people get are likely the result of a great many environmental factors. The recommendation to change your lifestyle is a recommendation to change your context, and that means changing not one but a great many variables in a positive way. While many people who were hoping for a silver bullet cure for high cholesterol may not think so, I see the AMA recommendation as a success story.
Humans are complex. We are more complex than pure carnivores and pure herbivores. We are omnivores; we live in and experience a variable and unpredictable dietary environment. Michael Pollan has described this “dilemma” faced by omnivores in choosing food as a source of confusion for our society. However, I suggest that omnivory allows us to live in that variable and unpredictable environment, but requires us to maintain a tremendously flexible digestive capacity. We can eat all manner of foods that are not explicitly toxic, but that resiliency and flexibility demand a rapid response system, one that is capable of adjusting to novelty.
So, for humans, are there good and bad foods? We have been told daily (and for decades) that there are definitely bad foods. Yes, fats are bad! OK, but shouldn’t cultures that eat oily meats as a major part of their diet have died out? Well then, dairy fats! OK, but then cheese should have brought down the French. Ummm, coconut oil? Hmmm, then all Pacific islands should be uninhabited. Carbs from grains? Then all Middle Easterners are doomed.
Every culture has a food emphasis that has at one time or another been considered “bad”, but those cultures live on healthily. The Western world is obsessed with food choices because we are eliminating basic foods, historical choices, and nutritional quality. More to the point, we are being told that the food substitutes are equal to the real thing and we are being sold foods that appear to be the real thing, but aren’t.
The bottom line is that naturally occurring food is food. Food that we have been exposed to for hundreds and thousands of years is food. “Food” that has emerged in the past 70-100 years might not be food. And we are confused about the differences. There are two items that we must deal with regarding food. Both are crucial to living a healthy life in the modern world. First, we have to understand who we are when it comes to food. Second, we have to be able to recognize food.
[i] Alfred Jay Bollet. 1992. Politics and pellagra: The epidemic of pellagra in the U.S. in the early twentieth century. Yale Journal of Biology and Medicine 65: 211-221. (A detailed summary of the pellagra epidemic.)
[ii] Michael Pollan. 2008. In Defense of Food. Penguin Press.