About 1.8 million years ago, evolution experienced a game-changer. Something happened to kick-start hominid adaptation and thrust it out of its arboreal, australopithecine heritage and headlong into the robust new genus Homo (that's "us," for those of you at home). It's one of the most abrupt shifts in body plan ever seen in the fossil record: away from a chimp-like "ape-man" toward something with far bigger brains, radically smaller guts and mouths, longer legs, flatter feet and the ability to make tools.
The precise nature of that game-changer is among the most elusive prizes in all of paleontology and anthropology. So, it's surprising that for the last 50 years or so, there's really only been one hypothesis to explain it.
Most researchers agree that the key adaptive change was the pairing of larger brains with smaller guts and mouths, an idea known as the expensive tissue hypothesis. Because big brains have such high glucose demands, the body has to find a way to fuel its smarts without sacrificing too much from other vital systems. The digestive system, the pathway for the brain's energy, would thus need to be more efficient the bigger the brain. The smaller mouths, teeth and guts of genus Homo compared to earlier hominids indicate that we evolved the ability to efficiently digest energy-dense foods in short amounts of time, freeing up more metabolic energy for brain development and maintenance.
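A quick back-of-the-envelope makes the trade-off concrete. Using ballpark figures commonly cited in the expensive tissue literature (going back to Aiello and Wheeler's 1995 paper) -- roughly 20% of resting metabolism devoted to the brain in humans versus about 9% in chimpanzees, against a human resting budget of around 1,400 kcal/day -- the arithmetic looks something like this (illustrative numbers, not Wrangham's own calculations):

$$
\underbrace{0.20 \times 1400}_{\text{human brain share}} \approx 280~\text{kcal/day}
\qquad \text{vs.} \qquad
\underbrace{0.09 \times 1400}_{\text{chimp-like share}} \approx 125~\text{kcal/day}
$$

Since overall resting metabolism didn't scale up to match, that extra ~150 kcal/day has to be clawed back from some other expensive tissue, and the gut is the obvious donor. Hence the pairing: bigger brains, smaller guts.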
For 50 years, anthropological and paleontological consensus has held that the gateway to this dramatic shift was an increase in meat-eating by our ancestors, popularized as the Man-The-Hunter model. This model placed the mastery of fire at between 400 and 100 Ka (thousand years ago), and assumed that before this time, our Homo ancestors ate all their food -- both animal and plant -- raw.
Problem was, that assumption carried with it a troubling conundrum. If genus Homo ate more meat than other hominids and ate most of its meat raw for most of its evolutionary history, how was it that our teeth, jaws and guts were so inefficient at chewing and digesting the raw meat of wild game (or, for that matter, the tough cellulose fibers of plant cell walls)? The question was usually settled with the assumption that we relied on soft meats -- organs, brains, viscera, and so on -- and only made a regular practice of eating muscle after the taming of fire.
This meme stuck with us for almost 50 years, most recently becoming the basis of claims by "paleo-dieters" that humans are specifically adapted to meat-eating -- that it's in our genes, so to speak. So powerful was this idea that no one seriously challenged it -- until now.
Richard Wrangham, Ruth Moore Professor of Biological Anthropology at Harvard University, wants to change that. He first began publicly questioning the Man-The-Hunter hypothesis about 10 years ago with a compelling counter-argument: that the game-changer in human evolution wasn't increased meat-eating but the invention of cooking; that the energy-dense food to which we adapted wasn't raw organ meat, but cooked foods of various kinds -- and mostly, at the beginning, cooked tubers, roots, corms and other underground storage organs of plants.
Wrangham has developed his hypothesis into a recent book -- Catching Fire: How Cooking Made Us Human -- that I finally got around to reading last night, during a rare break from studying. And I must say, it's a damned impressive and elegant argument.
Wrangham's hypothesis flies in the face of almost a half-century's worth of paleontological and anthropological doctrine, as it places the invention of cooking about 1 million years earlier than even the most generous previous hypotheses. As noted, most of the literature and research assumes that food was eaten raw prior to roughly 400 to 100 Ka. So, Wrangham's new hypothesis is daring, to say the least.
But challenging established consensus doesn't make a hypothesis wrong. Wrangham marshals compelling logic to defend his case, and provides persuasive physical evidence to back it up. He highlights the flaws of the Man-The-Hunter hypothesis, showing that many of its supposedly established findings were in fact merely untested assumptions.
Like any good scientist, Wrangham admits the evidence for his case, while persuasive, is not yet conclusive enough to warrant overturning the consensus. But it's provocative enough in both its predictive and explanatory power to warrant serious investigation, not least because it has the potential to settle a long-standing dispute about the exact role of meat-eating in human evolution.
And it's a solution that some raw vegans and raw "paleo-dieters" alike won't take much of a shine to.
Both of those camps assume that humans evolved to eat raw foods. They take this assumption (though often misunderstood and misapplied) directly from the scientific literature, so I don't hold it against them. But it has helped muddy the debate about "optimal" human diets and their associated ethics.
It turns out the answer to the question, "what is the most natural diet for humans?", may not be exclusively either meat or plants eaten raw, but whichever we prefer, so long as it's cooked!
Which means the decision for veganism over carnivory (or vice versa) remains what it's always been: an ethical choice, not a biological obligation.