Eggs are bad for you. Wait, eggs are good for you! Fat is bad. Wait, fat is good and carbs are bad! Skipping breakfast causes weight gain. Wait, skipping breakfast (intermittent fasting) is great for weight loss and metabolic health.
It’s enough to make you crazy, right? These are just a few of the many contradictory nutrition claims that have been made in the media over the past decade, and it’s no wonder people are more confused about what to eat than ever before.
Everyone has an opinion on the optimal human diet—from your personal trainer to your UPS driver, from your nutritionist to your doctor—and they’re all convinced they are right. Even the “experts” disagree. And they can all point to at least some studies that support their view. On the surface, at least, all of these studies seem credible, since they’re published in peer-reviewed journals and come out of respected institutions like the Harvard School of Public Health.
This has led to massive confusion among both the general public and health professionals, a proliferation of diet books and fad approaches, and a (justifiably) growing mistrust in public health recommendations and media reporting on nutrition.
Unfortunately, millions of dollars and decades of scientific research haven’t added clarity—if anything, they have further muddied the waters. Why? Because, as you’ll learn below, we’ve been asking the wrong questions, and we’re using the wrong methods.
If you’re confused about what to eat and frustrated by the contradictory headlines that are constantly popping up in your news feed, you’re not alone. The current state of nutritional research, and how the media reports on it, virtually guarantees confusion.
In this article, my goal is to step back and look at the question of what we should eat through a variety of lenses, including ancestral health, archaeology, anthropology, evolutionary biology, anatomy and physiology, and biochemistry—rather than relying exclusively on observational nutrition research, which, as I’ll explain below, is highly problematic (and that’s saying it nicely).
Armed with this information, you’ll be able to make more informed choices about what you eat and what you feed your family members.
Let’s start with the question that is on everyone’s mind …
What Is the Optimal Human Diet?
There isn’t one.
Note the emphasis on “one.”
When I explain this to people, they immediately understand. It makes sense to them that we shouldn’t all be following the exact same diet.
Yet that is exactly what public health recommendations and dietary guidelines assume, and I would argue that this fallacy is both the greatest source of confusion and the most significant obstacle to answering our key questions about nutrition.
Why? Because although human beings share a lot in common, we’re also different in many ways: we have different genes, gene expression, health status, activity levels, life circumstances, and goals.
Imagine two different people:
- A 55-year-old, sedentary male office worker who is 60 pounds overweight and has prediabetes and hypertension
- A 23-year-old, female Olympic athlete who is training for three hours a day, in fantastic health, and is attempting to build muscle for a competition
Should they eat exactly the same diet? Of course not.
Our Differences Matter When It Comes to Diet
Although that may be an extreme example, it’s no less true that what works for a young, single male CrossFit enthusiast who is getting plenty of sleep and isn’t under much stress won’t work for a mother of three who also works outside the home and is burning the candle at both ends.
These differences—in our genes, behavior, lifestyle, gut microbiome, etc.—influence how we process both macronutrients (protein, carbohydrates, and fat) and micronutrients (vitamins, minerals, and trace minerals), which in turn determine our response to various foods and dietary approaches. For example:
- People with lactase persistence—a genetic adaptation that allows them to digest lactose, the sugar in milk, into adulthood—are likely to respond better to dairy products than people who don’t have this adaptation.
- Populations with historically high starch intake produce more salivary amylase than populations with low starch intake. (1)
- Changes to gut microbiota can help with the assimilation of certain nutrients. Studies of Japanese people have found, for example, that their gut bacteria produce specific enzymes that help them break down seaweed, which can be otherwise difficult for humans to digest. (2)
- Organ meats and shellfish are extremely nutrient dense and a great choice for most people—but not for someone with hemochromatosis, a genetic disorder that leads to aggressive iron storage, since these foods are so rich in iron.
- Large, well-controlled studies (involving up to 350,000 participants) have found that, on average, higher intakes of saturated fat are not associated with a higher risk of heart disease. (3) But is this true for people with certain genes that make them “hyper-absorbers” of saturated fat and lead to a significant increase in LDL particle number (a marker associated with greater risk of cardiovascular disease)?
This is just a partial list, but it’s enough to make the key point: there are important differences that determine what an optimal diet is for each of us, yet those differences are rarely explored in nutrition studies. Research on diet focuses almost exclusively on top-down, population-level recommendations, and since a given dietary approach yields variable results in different people, this keeps us stuck in confusion and controversy.
It has also kept us stuck in what Gyorgy Scrinis has called “the ideology of nutritionism,” which he defines as follows: (4)
Nutritionism is the reductive approach of understanding food only in terms of nutrients, food components, or biomarkers—like saturated fats, calories, glycemic index—abstracted out of the context of foods, diets, and bodily processes.
In other words, it is a focus on quantity, not quality.
Nutrition research has assumed that a carbohydrate is a carbohydrate, a fat is a fat, and a protein is a protein, no matter what type of food they come packaged in. If one person eats 50 percent of calories from fat in the form of donuts, pizza, candy, and fast food and another person eats 50 percent of calories from fat in the form of whole foods like meat, fish, avocados, nuts, and seeds, they will still be lumped together in the same “50 percent of calories from fat” group in most studies.
Most people are shocked to learn that this is how nutrition research works. It doesn’t take a trained scientist to understand why this would be problematic.
But Aren’t There Some Foods That Are Better for All Humans to Eat (And Not Eat)?
I just finished explaining why there’s no “one-size-fits-all” approach to diet, but that doesn’t mean there aren’t core nutrition principles that apply to everyone.
For example, I think we can all agree that a steady diet of donuts, chips, candy, soda, and other highly processed and refined foods is unhealthy. And most people would agree that a diet based on whole, unprocessed foods is healthy.
It’s the middle ground where we get into trouble. Is meat good or bad? If it’s bad, does that apply to all meats, or just processed meat or red meat? What about saturated fat? Should humans consume dairy products?
A better question than “What is the optimal human diet?” then, might be “What is a natural human diet?” or, more specifically, “What is the range of foods that human beings are biochemically, physiologically, and genetically adapted to eat?”
In theory, there are two ways to answer this question:
- We can look at evolutionary biology, archaeology, medical anthropology, and comparative anatomy and physiology to determine what a natural human diet is.
- We can look at it from a biochemical perspective: what essential and nonessential nutrients contribute to human health (and where are they found in foods), how various functional components of food influence our body at the cellular and molecular level, and how certain compounds in foods—especially those prevalent in the modern, industrialized diet—damage our health via inflammation, disruption of the gut microbiome, hormone imbalance, and other mechanisms.
Let’s take a closer look through each of these lenses.
The Evolutionary Perspective
Human beings, like all other organisms in nature, evolved in a particular environment, and that evolutionary process dictated our biology and physiology as well as our nutritional needs.
Archaeological Evidence for Meat Consumption
Isotope analysis from archaeological studies suggests that our hominin ancestors have been eating meat for at least 2.5 million years. (5) There is also wide agreement that, going even further back in time, our primate ancestors likely ate a diet similar to that of modern chimpanzees, which we now know hunt and eat vertebrates. (6) The fact that chimpanzees and other primates evolved complex behavior like using tools and hunting in packs indicates the importance of animal foods in their diet—and ours.
Anatomical Evidence for Meat Consumption
The structure and function of the digestive tract of all animals can tell us a lot about their diet, and the same is true for humans. The greatest portion (45 percent) of the total gut volume of our primate relatives is the large intestine, which is good for breaking down fiber, seeds, and other hard-to-digest plant foods. In humans, the greatest portion of our gut volume (56 percent) is the small intestine, which suggests we’re adapted to eating more bioavailable and energy-dense foods, like meat and cooked starches, that are easier to digest.
Some advocates of plant-based diets have argued that humans are herbivores because of our blunt nails, small mouth opening, flat incisors and molars, and relatively dull canine teeth—all of which are characteristics of herbivorous animals. But this argument ignores the fact that we evolved complex methods of procuring and processing food, from hunting to cooking to using sharp tools to rip and tear flesh. These methods/tools take the place of anatomical features that serve the same function.
Humans have relatively large brains and small guts compared to our primate relatives. Most researchers believe that consuming meat and fish is what drove this shift, because animal foods are more energy dense and easier to digest than plant foods. (7)
Genetic Changes Suggestive of Adaptation to Animal Foods
Most mammals stop producing lactase, the enzyme that breaks down lactose, the sugar in milk, after they’re weaned. But in about one-third of humans worldwide, lactase production persists into adulthood. This allows those humans to obtain nutrients and calories from dairy products without becoming ill. If we were truly herbivores, never meant to eat animal foods at all, we would not have developed this genetic adaptation.
Studies of Contemporary Hunter–Gatherers
Studies of contemporary hunter–gatherer and other traditional populations like the Maasai, Inuit, Kitavans, Tukisenta, !Kung, Aché, Tsimané, and Hadza have shown that, without exception, they consume both animal and plant foods, and they go to great lengths to obtain plant or animal foods when they’re in short supply.
For example, in one analysis of field studies of 229 hunter–gatherer groups, researchers found that animal food provided the dominant source of calories (68 percent) compared to gathered plant foods (32 percent). (8) Only 14 percent of these societies got more than 50 percent of their calories from plant foods.
Another report on 13 field studies of the last remaining hunter–gatherer tribes carried out in the early and mid-20th century found similar results: animal food comprised 65 percent of total calories on average, compared with 35 percent from plant foods. (9)
The proportion of animal vs. plant foods and the macronutrient ratios consumed vary, but an ancestral population following a completely vegetarian or vegan diet has never been discovered.
The Lifespan of Our Paleolithic Ancestors
Critics of Paleo or ancestral diets often claim that they are irrelevant because our Paleolithic ancestors all died at a young age. This common myth has been debunked by anthropologists. (10) While average lifespan among hunter–gatherers is, and was, lower than ours today, that average is heavily skewed by high rates of infant mortality (due to a lack of emergency medical care and other factors) in these populations.
The anthropologists Gurven and Kaplan studied lifespan in extant hunter–gatherers and found that, if they survive childhood, their lifespans are roughly equivalent to our own in the industrialized world: 68 to 78 years. (11) This is notable because hunter–gatherers today survive only in isolated and marginal environments like the:
- Kalahari Desert
- Amazon rainforest
- Arctic circle
What’s more, in many cases hunter–gatherers reach these ages without acquiring the chronic diseases that are so common in Western countries. They’re less likely to have heart disease, diabetes, dementia and Alzheimer’s, and many other debilitating, chronic conditions.
For example, one study of the Tsimané people in Bolivia found that they have a prevalence of atherosclerosis 80 percent lower than ours in the United States and that nine in 10 Tsimané adults aged 40 to 94 had completely clean arteries and no risk of heart disease. (12) They also found that the average 80-year-old Tsimané male had the same vascular age as an American in his mid-50s. (Did you notice that this study included adults up to age 94? So much for the idea that hunter–gatherers all die when they’re 30!)
When you put all of this evidence together, it suggests the following themes:
- Meat and other animal products have been part of the natural human diet for at least 2.5 million years
- All ancestral human populations that have been studied ate both plants and animals
- Human beings can survive on a wide variety of foods and macronutrient ratios within the general template of plants and animals they consumed
For a deeper dive on this topic, I recommend the following articles:
- The Diet We’re Meant to Eat, Part 1: Evolution & Hunter–Gatherers
- The Diet We’re Meant to Eat, Part 2: Physiological & Biological Evidence
- Hunter–gatherers enjoy long, healthy lives
- Eating meat led to smaller stomachs, bigger brains
The Biochemical Perspective
Understanding ancestral diets and their relationship to the health of hunter–gatherer populations is a good starting place, but on its own, it doesn’t prove that such diets are the best option for modern humans.
To know that, we need to examine this question from a biochemical perspective. We need to know what nutrients are essential to human health, where they are found in food, and how various components of the diet and compounds in foods affect our physiology—both positively and negatively.
The good news is, there are tens of thousands of studies in this category. Collectively, they point to one overarching principle:
Nutrient density is arguably the most important concept to understand when it comes to answering the question, “What should humans eat?”
The human body requires approximately 40 different micronutrients for normal metabolic function.
There are two types of nutrients in food: macronutrients and micronutrients. Macronutrients are the three food substances required in large amounts in the human diet, namely:

- Protein
- Carbohydrate
- Fat
Micronutrients, on the other hand, are vitamins, minerals, and other compounds required by the body in small amounts for normal physiological function.
The term “nutrient density” refers to the concentration of micronutrients and amino acids, the building blocks of proteins, in a given food. While carbohydrates and fats are important, the body can synthesize them for a limited time when dietary intake is insufficient (with the exception of the essential omega-6 and omega-3 fatty acids). Micronutrients and the essential amino acids found in protein, on the other hand, cannot be manufactured by the body and must come from the diet.
With this in mind, what are the most nutrient-dense foods? There are several studies that have attempted to answer this question. In the most comprehensive one, which I’ll call the Maillot study, researchers looked at seven major food groups and 25 subgroups, characterizing the nutrient density of these foods based on the presence of 23 qualifying nutrients. (13)
Maillot and colleagues found that the most nutrient-dense foods were (score in parentheses):
- Organ meats (754)
- Shellfish (643)
- Fatty fish (622)
- Lean fish (375)
- Vegetables (352)
- Eggs (212)
- Poultry (168)
- Legumes (156)
- Red meats (147)
- Milk (138)
- Fruits (134)
- Nuts (120)
As you can see, eight of the 12 most nutrient-dense categories of foods are animal foods. All types of meat and fish, vegetables, fruit, nuts, and dairy were more nutrient-dense than whole grains, which received a score of only 83. Meat and fish, veggies, and fruit were more nutrient dense than legumes, which were slightly more nutrient dense than dairy and nuts.
There are a few caveats to the Maillot analysis:
- It penalized foods for being high in saturated fat and calories
- It did not consider bioavailability
- It only considered essential nutrients
Caloric Density and Saturated Fat
In the conventional perspective, nutrient-dense foods are defined as those that are high in nutrients but relatively low in calories. However, recent evidence (which I’ll review below) has found that saturated fat doesn’t deserve its bad reputation and can be part of a healthy diet. Likewise, some foods that are high in calories (like red meat or full-fat dairy) are rich in key nutrients, and, again, can be beneficial when part of a whole-foods diet. Had saturated fat and calories not been penalized, foods like red meat, eggs, dairy products, and nuts and seeds would have appeared even higher on the list.
Bioavailability

Bioavailability is a crucial factor that is rarely considered in studies of nutrient density. It refers to the portion of a nutrient that is actually absorbed in the digestive tract, which is almost always lower than the amount the food contains. For example, the bioavailability of calcium from spinach is only 5 percent. (14) Of the 115 mg of calcium present in a serving of spinach, only about 6 mg is absorbed. This means you’d have to consume 16 cups of spinach to get the same amount of bioavailable calcium as one glass of milk!
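The arithmetic here is simple enough to sketch in code. The spinach figures come from the text above; the milk figures (roughly 300 mg of calcium per glass, of which about 32 percent is absorbed) are typical published estimates, used here only for illustration:

```python
# Bioavailable nutrient = amount present in the food x fraction absorbed.

def bioavailable_mg(amount_mg: float, absorption_fraction: float) -> float:
    """Milligrams of a nutrient actually absorbed from one serving."""
    return amount_mg * absorption_fraction

# Spinach: 115 mg calcium per cup, ~5% absorbed (figures from the text).
spinach = bioavailable_mg(115, 0.05)   # ~5.8 mg per cup

# Milk: ~300 mg calcium per glass, ~32% absorbed (illustrative estimates).
milk = bioavailable_mg(300, 0.32)      # ~96 mg per glass

cups_of_spinach = milk / spinach
print(f"~{cups_of_spinach:.0f} cups of spinach per glass of milk")
```

With these estimates the script lands at roughly 16 to 17 cups, in the same ballpark as the figure quoted above.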
The bioavailability of protein is another essential component of nutrient density. Researchers use a measure called the Protein Digestibility Corrected Amino Acid Score (PDCAAS), which combines the amino acid profile of a protein with a measure of how much of the protein is absorbed during digestion to assess protein bioavailability. The PDCAAS rates proteins on a scale of 0 to 1, with values close to 1 representing more complete and better-absorbed proteins than ones with lower scores.
On the scale, animal proteins have much higher scores than plant proteins; casein, egg, milk, whey, and chicken have scores of 1, indicating excellent amino acid profiles and high absorption, with turkey, fish, and beef close behind. Plant proteins, on the other hand, have much lower scores; legumes, on average, score around 0.70, rolled oats score 0.57, lentils and peanuts are 0.52, tree nuts are 0.42, and whole wheat is 0.42.
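For the curious, the PDCAAS formula itself is straightforward: the food’s most limiting essential amino acid, expressed as a fraction of a reference pattern, multiplied by the protein’s true digestibility and capped at 1. Here is a minimal sketch using a simplified FAO-style reference pattern and a hypothetical wheat-like protein; the amino acid figures are illustrative, not measured values:

```python
# PDCAAS = (limiting amino acid ratio) x (true digestibility), capped at 1.0.

def pdcaas(aa_mg_per_g: dict, reference: dict, digestibility: float) -> float:
    """Score a protein by its most limiting amino acid and its digestibility."""
    limiting_ratio = min(aa_mg_per_g[aa] / reference[aa] for aa in reference)
    return min(1.0, limiting_ratio * digestibility)

# Simplified subset of an FAO-style reference pattern (mg per g protein).
reference = {"lysine": 58, "threonine": 34, "methionine+cysteine": 25}

# Hypothetical wheat-like protein: lysine is the limiting amino acid.
wheat_like = {"lysine": 25, "threonine": 30, "methionine+cysteine": 38}

print(round(pdcaas(wheat_like, reference, digestibility=0.96), 2))  # -> 0.41
```

With these illustrative numbers the score lands near the whole-wheat value quoted above, and a protein that meets or exceeds the reference pattern with full digestibility scores 1.0.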
Thus, had bioavailability been considered in the Maillot study, animal foods would have scored even higher, and plant foods like legumes would have scored lower.
Essential vs. Nonessential Nutrients
The Maillot study—and a similar analysis from Harvard University chemist Dr. Mat LaLonde—only considered essential nutrients. In a nutritional context, the term “essential” doesn’t just mean “important,” it means necessary for life. We need to consume essential nutrients from the diet because our bodies can’t make them on their own.
Focusing on essential nutrients makes sense, since we can’t live without them. That said, over the past few decades many nonessential nutrients have been identified that are important to our health, even if they aren’t strictly essential—compounds like carotenoids, flavonoids and other polyphenols, plant sterols, and prebiotic fibers.
Many of these nonessential nutrients are found in fruits and vegetables. Had these nutrients been included in the nutrient density analyses, fruits and vegetables would likely have scored higher than they did.
What Can We Conclude from the Biochemical Perspective?
How much of the diet should come from animals, and how much from plants? The answer will vary based on individual needs. If we look at evolutionary history, we see that humans obtained, on average, about 65 percent of calories from animal foods and 35 percent from plant foods, but the specific ratios varied depending on geography and other factors.
That does not mean that two-thirds of what you put on your plate should be animal foods! Remember, calories are not the same as volume (what you put on your plate). Meat and animal products are much more calorie-dense than plant foods. One cup of broccoli contains just 30 calories, compared to 338 calories for a cup of beef steak.
This means that even if you’re aiming for 50 to 70 percent of calories from animal foods, plant foods will typically take up between two-thirds and three-quarters of the space on your plate.
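The calorie-to-volume conversion is easy to check. Using the article’s calorie densities (30 kcal per cup of broccoli, 338 kcal per cup of steak) and letting broccoli stand in for all plant foods—an extreme case, since broccoli is unusually low in calories—the plate split comes out even more plant-heavy than the range quoted above:

```python
# Calorie densities from the text (kcal per cup).
BROCCOLI_KCAL_PER_CUP = 30
STEAK_KCAL_PER_CUP = 338

def plant_volume_fraction(animal_calorie_share: float) -> float:
    """Fraction of plate volume occupied by plants, per one cup of steak."""
    steak_kcal = STEAK_KCAL_PER_CUP  # one cup of steak on the plate
    plant_kcal = steak_kcal * (1 - animal_calorie_share) / animal_calorie_share
    broccoli_cups = plant_kcal / BROCCOLI_KCAL_PER_CUP
    return broccoli_cups / (broccoli_cups + 1)

for share in (0.5, 0.65, 0.7):
    print(f"{share:.0%} calories from animal -> "
          f"{plant_volume_fraction(share):.0%} of the plate is plants")
```

With broccoli as the stand-in, 50 to 70 percent of calories from animal foods still leaves plants covering more than 80 percent of the plate by volume; denser plant foods (potatoes, nuts, legumes) pull that figure down toward the two-thirds to three-quarters range.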
(Side note: this is why I’ve always rejected the notion of Paleo as an “all-meat” diet; a more accurate descriptor would be a plant-based diet that also contains animal products).
When we consider the importance of both essential and nonessential nutrients, it also becomes clear that both plant and animal foods play an important role, because they are rich in different nutrients. Dr. Sarah Ballantyne broke this down in part three of her series “The Diet We’re Meant to Eat: How Much Meat versus Veggies.” Plant foods are particularly rich in:
- Vitamin C
- Carotenoids (lycopene, beta-carotene, lutein, zeaxanthin)
- Diallyl sulfide (from the allium class of vegetables)
- Flavonoids (anthocyanins, flavan-3-ols, flavonols, proanthocyanidins, procyanidins, kaempferol, myricetin, quercetin, flavanones)
- Plant sterols and stanols
- Isothiocyanates and indoles
- Prebiotic fibers (soluble and insoluble)
Animal foods, on the other hand, are particularly rich in:

- Vitamin B12
- Heme iron
- Preformed vitamin A (retinol)
- High-quality protein
- Vitamin K2
- Vitamin D
- DHA (docosahexaenoic acid)
- EPA (eicosapentaenoic acid)
- CLA (conjugated linoleic acid)
For a deeper dive on these subjects, check out the following articles:
- What Is Nutrient Density and Why Is It Important?
- The Diet We’re Meant to Eat, Part 3: How Much Meat versus Veggies
Focus Your Diet on Nutrient Density
Whether we look through the lens of evolutionary biology and history or modern biochemistry, we arrive at the same conclusion:
Anthropology and archaeology suggest that it’s possible for humans to thrive on a variety of food combinations and macronutrient ratios within the basic template of whole, unprocessed animal and plant foods.
For example, the Tukisenta of Papua New Guinea consumed almost 97 percent of calories in the form of sweet potatoes, and the traditional Okinawans also had a very high intake of carbohydrate and low intake of animal protein and fat. On the other hand, cultures like the Maasai and Inuit consumed a much higher percentage of calories from animal protein and fat, especially at certain times of year.
How much animal vs. plant food you consume should depend on your specific preferences, needs, and goals. For most people, a middle ground appears to work best, with between 35 and 50 percent of calories from animal foods and between 50 and 65 percent from plant foods. (Remember, we’re talking about calories, not volume.)
Now I’d like to hear from you. What is your “optimal human diet”? Have you experimented with different ratios of animal vs. plant foods? What works for you? Let me know in the comments section.