Here is a credible idea: “We should not eat things that we are maladapted to eat.” On the face of it, this makes perfect sense: if we are maladapted to eating something (or doing anything, for that matter), then clearly it will do us harm. That’s true pretty much by definition. Let’s call this argument the adaptational health argument.
Now, let’s shift the emphasis. The meaning of the statement is actually the same, but when restated, it reveals a problem: “We should only eat things that we are adapted to eat.”
Ah. Now, the shoe is on the other foot! What exactly are we adapted to eat? We have to eat something. How do we choose? What if we have been eating things that we are not adapted to eat?
Stated this way, this dictum has become the basis for at least three different prescriptive diets: the “Paleo Diet,” the “Raw Diet,” and the “Blood Type Diet” (plus numerous variations and at least one recommendation for the healthiest footwear). All three diets attempt to answer the question, “what are we adapted to eat?” in a way that sounds reasonable, but these prescriptions are not taken seriously by most biologists or doctors. Where is the problem?
I would like to argue that these diets are actually examples of the Naturalistic Fallacy, and that they merely use evolutionary adaptation as a blind for the fallacy. Now, when I say Naturalistic Fallacy, I must apologize to philosophers, for whom the term means something somewhat different. For them, this fallacy has more to do with the “is-ought problem”, which is related but not identical to the concept I wish to discuss. No, the fallacy I’m referring to is the fallacy that natural things are inherently safer or more “healthy” than unnatural things (whatever “healthy” means).
Herbal medicine is a prime example of the naturalistic fallacy at work. There is nothing inherently wrong with using herbs to alleviate various health conditions, but they are not necessarily better than using man-made medicines. I will discuss why in a moment, but first, perhaps I should convince you of the truth of my statement. Consider the following substances and herbs:
Arsenic (an element)
Poison Ivy (a plant)
Carbon monoxide (produced in most fires, even natural ones)
Carbon dioxide (produced as a natural waste product by all animals)
Ammonia (produced as a waste product by fish)
Curare (a poison extracted from South American plants)
Lead (an element)
Mercury (an element)
Snake venom (usually produced by snakes)
Ephedra (an herb that causes heart palpitations)
Ergot (a fungus)
Alcohol (methanol and ethanol, produced by yeast)
Many of the above are deadly. Most will cause severe health issues. None is something you would want to ingest (or breathe) in significant quantities.
All are natural.
By contrast, many man-made things are safe enough to ingest simply because they are inert in the human body (e.g. silicone – once it solidifies, anyway).
The naturalistic fallacy has another problem: what exactly is natural? Meat is natural. Is it still natural when we cook it? Alcohol is natural (it’s found in rotten fruit), but is it natural when we distill it? Willow bark, a useful herbal remedy, is natural. Is its active component, salicin, still natural if we extract it in a tea? When we ingest salicin, our bodies naturally break it down into salicylic acid. Is salicylic acid natural if we make it in some other way? Finally, aspirin is salicylic acid with a couple of additional carbons and hydrogens (and one atom of oxygen) attached to it. It may not be natural, but it has been shown to be a beneficial – even life-saving – remedy when used correctly.
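For the chemically curious, the relationship between salicylic acid and aspirin can be summed up in one reaction (a sketch of the standard laboratory synthesis, which acetylates salicylic acid with acetic anhydride):

salicylic acid (C7H6O3) + acetic anhydride ((CH3CO)2O) → aspirin (C9H8O4) + acetic acid (CH3COOH)

Counting atoms, aspirin (C9H8O4) has exactly two more carbons, two more hydrogens, and one more oxygen than salicylic acid (C7H6O3) – yet that small, man-made change turns a natural compound into a drug that is easier on the stomach.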
We can even play the same game with water. Pure rain water is natural. Is distilled water natural? It’s made by the same process, but its production is managed by humans. What about water that is produced by burning hydrogen?
2H2 + O2 → 2H2O
Where is the dividing line?
The adaptational health argument attempts to answer these questions by imposing an evolutionary definition. “Natural” is what we are adapted to eat, drink or do. Natural things are healthy because we are adapted to them.
Although it is a tautology, this version of the naturalistic fallacy has one advantage: it allows us to eliminate some obvious counter-examples from the list of natural things above. For example, the adaptational health argument suggests that poison ivy, although natural, isn’t healthy for us because our ancestors never became adapted to eating it. If early humans ate enough poison ivy, perhaps they might have become adapted to it and we would be eating it today – but they didn’t, so we don’t.
Unfortunately, there are several problems with this line of reasoning. The first is that our terms are hopelessly ill-defined. We need to know what we were adapted to long ago, before we started doing artificial things. We want to know what our remote ancestors ate, but how remote do we have to get? Do we need to study neolithic humans? Paleolithic? What about “Cro-Magnon” humans?
Similarly, we want to exclude certain “artificial” dietary habits, but which inventions do we need to exclude? Some say that the important thing is to avoid the products of agriculture (which is usually interpreted to mean wheat), while others say even cooking food is a bad idea. But the use of fire – especially for cooking – goes all the way back to Homo erectus! How can we know enough about the eating habits of a 1.5-million-year-old species to be able to design a whole modern diet?
And, of course, we are assuming that “our” ancestors all ate the same things. I’m willing to bet that people from the paleolithic era of South-East Asia had a different diet from the stone age humans of Northern Europe. Or Central Africa. Or Australia. Which paleolithic diet should we follow? What if our heritage is mixed?
There is one diet that attempts to address this last point. In the blood type diet, it is postulated that different cultures became adapted to different foods. Since different populations also have different proportions of the various possible blood types (A+, B-, AB+, O-, etc.), it is suggested that blood type predicts the diet a person should eat. There are two problems here. First of all, there is no causal connection between blood type and diet. The effects of blood type seem to be limited almost entirely to the compatibility of blood transfusions or organ transplants. Secondly, not everybody in a given culture has the same blood type – and in fact there is a great deal of variation within any culture. Say I have a blood type that is most prevalent in Native Hawaiians. So what? Should I stop eating a European, bread-based diet because of this? Some Native Hawaiians have a blood type that is common in Europe. Does this mean that Native Hawaiian cuisine has been hurting them all along?
The next problem is our assumption of health. Even if we know exactly what our ancestors ate, we still need to establish that these same people led long, healthy, happy lives. What was the average age of survival of a paleolithic person? Did they have any persistent health problems? Were they able to resist disease? Did they always feel full of vim and vigor, or did they just struggle through each day as best they could on what was available? Were they happy, or depressed by soul-crushing stress and monotony?
If we put these objections together, we come to the realization that we cannot actually know what we are adapted to unless we study the specific effects of specific foods on specific people. In other words, we need to give people different kinds of food and see what makes them healthy. But in order to do this, we don’t need a theory of adaptational health! All we need to do is look at which foods seem to make us healthy now, regardless of whether they made our ancestors healthy back then.
All of the above is fun to rant about, but (in the immortal words of Arlo Guthrie), that’s not what I came to talk about.
I came to talk about evolution.
The problems I’ve discussed so far are, mostly, methodological problems and inconvenient facts. They don’t attack the core rationale of the adaptational health program.
Adaptational health claims are based on a misunderstanding of evolution. No, I misspoke. They are based on at least three misunderstandings of evolution.
First of all, it is assumed that evolution must have stopped when agriculture (or cooking, or whatever “artificial” influence is being attacked) was invented. Alternatively, it is sometimes assumed that we have been living our modern life for far too short a time to have adapted to it yet. There are two problems here. First of all, evolution does not stop just because we are affecting our own environment, and there is no reason to think that all evolution happens so slowly that we are still waiting to catch up 10,000 or a million years later. If Homo erectus cooked food, and modern humans have evolved substantially since then (we no longer look like Homo erectus, for example), then certainly there has been enough time for us to adapt to eating cooked food.
Secondly, if we do a bit of a thought experiment, we’ll see that the idea of us not adapting to the use of agriculture (or cooking, or whatever) is kinda hard to believe. Imagine an early band of humans, going about their primitive lives, perfectly adapted to their environment. Along comes some genius who says, “Hey! We should grow wheat!” (or “We should cook this food!” or “We should only eat giraffes on Tuesdays because we all have type B+ blood”), and everyone in the clan switches over to this new way of doing things. What happens if they are not adapted?
Well, they don’t do as well as they had before. That’s what adapted means. It means that they have been shaped by natural selection to be more fit for their environment than they had been before. If their fitness dropped off because they had started farming (or whatever) then they would not be reproducing effectively, and would probably be out-competed by people who didn’t grow wheat (or whatever). That’s true by definition. In any event, they probably wouldn’t keep up the new practices for long.
The second misunderstanding is that adaptation is only an evolutionary event. I don’t mean to get semantic here, but there is another way to use the word adaptation: we can use it to mean plain old “adjustment.” We can adjust to new circumstances; our physiologies are somewhat variable. For example, you may not be used to living in a house where the temperature is kept low (perhaps to save money), but your body would get used to it if you had to. You would adapt your behavior, and you would get used to the difference in sensation. Similarly, it is not clear that there is any one perfect diet – even for a single person. Your body can make many of the compounds it needs from raw materials (amino acids, glucose, vitamins), so if someone starts eating a little less meat, or replaces apples with papayas, the body can usually shift its resources around to compensate. There is no reason to think that our ancestors could not compensate for any changes in the diet that they were “adapted” to (especially since they seem to have survived), and as I said above, no reason to think that we still cannot compensate ten thousand years later.
The third misunderstanding is the assumption that any organism is perfectly “adapted” to its environment in the first place. Once again, this assertion fails on two fronts. Nothing is perfectly adapted; you don’t have to be perfect to survive and reproduce. More importantly, innovation often comes from a feature that an organism has “by accident” and which only later gets improved by natural selection. For example, a bird might have a tiny little hook on the end of its beak that it does not normally use. Perhaps the hook forms as an accidental effect of the bird’s embryonic development. But one day, the bird might encounter a new food source that is easier to catch or open or eat if it uses that little hook. If the new food source persists, then the hook will undergo selection until it gets bigger or sharper or otherwise more effective. Similarly, our adventurous ancestors (whichever ones we are arguing about today) might have had an “accidental” ability to eat and digest their innovative diet. Of course, we are dealing with speculation here. It’s very hard to know what they were accidentally able to eat. But it’s also hard to know what they were unable to eat, so we’re back to the need to individually and rigorously test each assertion of healthfulness on modern humans.
Does all of this mean that we should go ahead and eat our pizza and French fries without fear? Does evolution have no place in health and medicine? Of course not. We have plenty of evidence that a diet of grease and salt is bad for us even without evolutionary theory. These observations can be explained, post-hoc, by understanding what our ancestors were adapted to eating, and the theory can help guide us toward new hypotheses. For example, we know from world-wide health studies that eating sugar in large quantities can lead to diabetes. It is interesting to note, however, that before the advent of modern agriculture (with greenhouses and long-range shipping) sweet, sugary fruits were only available seasonally. Also, we can observe that some of our primate cousins like to gorge on fruits while they are in season, and then return to eating whatever else is available when the season ends. Perhaps this means that we are able to eat sugar, but not as part of a steady diet. Taking a break from sugar for several months a year might be beneficial. I don’t know, but it might be worth investigating.
What I do know is that it’s tricky to use our ancestors’ diet to dictate what we should or shouldn’t be eating. You can’t argue from “what was” to “what ought to be.”
Which, come to think of it, is pretty close to the original meaning of the Naturalistic Fallacy…