Poor diet may have behavioral consequences

A lack of zinc in your diet may be making you aggressive.

(From the Greenpasture.org archive)

A fascinating article in Psychology Today focused on the origin of violent behavior. Was it nature or nurture? Could the underlying cause lie in one’s upbringing, or in one’s genes? Or could it be some type of nutritional imbalance?

Taking the nutritional stance was William Walsh, Ph.D. and his team at the Health Research Institute in Illinois. Walsh and his colleagues published a study in Physiology & Behavior (1997) where they compared the results of blood tests given to 135 “assaultive” young males—who were between 3 and 20 years of age—to those of 18 in the control group without any history of violence. The results were staggering: The violent males had higher copper and lower zinc levels than the control group. The higher the copper and lower the zinc, the more aggressive and violent the behavior.

When the aggressive young males were treated with therapeutic doses of zinc, their aggressive episodes were substantially lessened.


[Read More Here]


Paleo is the key to health, fitness, and looking good naked. And smarts! Part III

Reblogged from Gnolls.org

My take-home point from this article:

The result of OFT (optimal foraging theory) is, as one might hope, common sense: our ancestors would have eaten the richest, most accessible foods first.


Our Story So Far

  • It is not enough to state that the availability of high-quality food allowed our ancestors’ brains to increase in volume from ~400cc to ~1500cc between 2.6-3 MYA and 100-200 KYA. We must explain the selection pressures that caused our brains to more than triple in size—instead of simply allowing us to increase our population, or to become faster or stronger.
  • To gloss over this explanation is a teleological error. It assumes that evolution has a purpose, which is to create modern humans.
  • Climate change is most likely a factor—but it is insufficient, by itself, to create this selection pressure.
  • The Paleolithic is an age defined by the use of stone tools (“industries”) to assist in hunting and gathering. It began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry.
  • The Pleistocene began exactly 2.588 MYA and ended 11,700 BP, and is defined by the age of specific rock (or ice) formations.
  • Therefore, if we wish to locate an event precisely in time, we need to speak in terms of geological time—the Pliocene and Pleistocene epochs. If we wish to identify an event relative to human technological capability, we need to speak of cultural time—the Paleolithic age.
  • Sexual selection is a fascinating subject, but I see no need to invoke it to explain the increase in hominid brain size from the start of the Paleolithic to the rise of anatomically modern humans.

A Timeline Of Facts, A Narrative To Join Them

The factual knowledge we have about human behavior (including diet) during the Pleistocene is limited by the physical evidence we’ve discovered so far—which becomes thinner the farther back in time we go. Therefore, any narrative we construct from these facts must necessarily remain contingent on future discoveries.

However, the evidence we have strongly supports the currently accepted hypothesis for the evolution of human intelligence. I’ll do my best to compress several semesters of anthropology and evolutionary theory into a timeline that tells our ancestors’ story.

First, a key concept: in order to explain a more than tripling of brain size over nearly 3 million years, a single event is not sufficient. It’s not enough to say “Hunting is hard, so we had to get smarter.” We must postulate a sequence of events—one which creates the most parsimonious narrative from the physical evidence.

“Parsimonious” means “stingy” or “frugal”. It is frequently used by scientists as part of the phrase “the most parsimonious hypothesis/theory/explanation”, which means “the explanation which requires the least speculation and depends on the fewest unknowns.” (Also see: Occam’s razor.)

Before we start our narrative, we must define one more term: optimal foraging theory.

Optimal Foraging Theory

Optimal foraging theory (OFT) is a simple concept: “…Decisions are made such that the net rate of energy capture is maximized.” (Sheehan 2004)

This is because efficiency—obtaining more food for less effort—is rewarded by natural selection. Efficient foragers survive better during difficult times, and they spend less time exposed to the risks of foraging. This leaves them more likely to survive, and with more time to seek mates, raise offspring, or simply rest.

In the simplest case, herbivores select the most nutritious plants, and predators select the fattest, slowest herbivores. However, many complicated behaviors result from application of this simple rule. Two examples: for herbivores, leaving the herd costs energy and makes being eaten by a carnivore more likely; for predators, unsuccessful hunts cost energy and make starvation more likely.

Due to time and space constraints, we’re barely scratching the surface of OFT. This article provides a brief introduction, and Wikipedia goes into more detail—including many refinements to the basic model. For an in-depth exploration, including several interesting and complex behaviors resulting entirely from its real-world application, read this textbook chapter (PDF).

The result of OFT is, as one might hope, common sense: our ancestors would have eaten the richest, most accessible foods first.
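That prey-ranking logic can be made concrete with a small sketch of OFT’s classic diet-breadth (prey-choice) model. This is illustrative only: the prey names, encounter rates, energy yields, and handling times below are hypothetical numbers, not data from any study cited here.

```python
# A minimal sketch of the diet-breadth (prey-choice) model from optimal
# foraging theory. Prey types are ranked by profitability (energy per
# unit handling time); a type enters the diet only while its
# profitability exceeds the net rate of energy capture already achieved
# from higher-ranked prey (Holling's disc equation).

def optimal_diet(prey):
    ranked = sorted(prey, key=lambda p: p["energy"] / p["handling"], reverse=True)
    diet = []
    total_e = total_h = rate = 0.0
    for p in ranked:
        if p["energy"] / p["handling"] <= rate:
            break  # pursuing this prey would lower the overall rate
        diet.append(p["name"])
        total_e += p["encounters"] * p["energy"]    # encounter rate x energy
        total_h += p["encounters"] * p["handling"]  # encounter rate x handling time
        rate = total_e / (1.0 + total_h)            # net rate of energy capture
    return diet, rate

# Hypothetical prey menu (encounter rates per hour of search,
# energy in kcal, handling time in minutes):
prey = [
    {"name": "root vegetables", "encounters": 0.5,  "energy": 300,  "handling": 10},
    {"name": "nuts",            "encounters": 0.3,  "energy": 500,  "handling": 20},
    {"name": "small game",      "encounters": 0.05, "energy": 4000, "handling": 60},
]

diet, rate = optimal_diet(prey)
```

With these made-up numbers, only the richest prey type enters the optimal diet: anything less profitable than the rate already being achieved is better ignored, which is exactly the “richest, most accessible foods first” intuition.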

Our Story Begins On Two Legs: Ardipithecus ramidus

Our story begins in an African forest during the Pliocene epoch, 4.4 million years ago. (Our ancestors have already parted ways with the ancestors of chimpanzees and bonobos. This occurred perhaps 6.5 MYA, in the late Miocene.)

The Miocene epoch lasted from 23 MYA to 5.3 MYA. The Pliocene epoch lasted from 5.33 to 2.59 MYA, and the Pleistocene lasted from 2.59 MYA to 11,700 BP.

It’s important to note that many different hominins existed throughout the Pliocene and Pleistocene. We aren’t absolutely certain which were directly ancestral to modern humans, and which represented stem species that subsequently died out…but the fossil record is complete enough that we’re unlikely to dig up anything which radically changes this narrative.

Though there are fascinating fossil finds which date even earlier (e.g. Orrorin), we’ll begin with Ardipithecus ramidus, a resident of what is now Ethiopia in the mid-Pliocene, 4.4 MYA. Today it’s the Afar desert—but in the Pliocene, its habitat was a lush woodland which occasionally flooded.

What Ardipithecus ramidus might have looked like. Click the picture for a BBC article.

“Ardi” was about four feet tall, with a brain the size of a modern chimpanzee (300-350cc). She was most likely what we call a facultative biped, meaning that she walked on four legs while in trees, and on two legs while on the ground: though her pelvis was adapted to walking upright, her big toe was still opposable and she had no arches, leaving her feet better adapted to gripping trees than to walking or running.

You can learn much more about Ardi at Discovery.com’s extensive and informative (though Flash-heavy and somewhat hyperbolic) website. For those with less patience or slow Internet connections, this NatGeo article contains a discussion of Ardi’s importance and possible means of locomotion. (Warning: both contain some highly speculative evolutionary psychology.)

From the evidence, we know that there must have been selection pressure to sacrifice tree-climbing ability in exchange for improved bipedal locomotion—most likely due to an increased ability to take advantage of ground-based foods. Though evidence is thin, its discoverers think (based on its teeth) that Ardi consumed a similar diet to its successor Australopithecus anamensis—nuts, root vegetables, insects, mushrooms, and some meat. (This supports the idea that Ardi ate more ground-based food, such as root vegetables and mushrooms, and less tree-based food, such as fruit.) And stable isotope analysis of its tooth enamel confirms that Ardipithecus was a forest species, lacking significant dietary input from grasses or animals that ate grasses.

Fruit Is For The Birds (And The Bats, And The Chimps): Australopithecus anamensis

Our next data point comes just a few hundred thousand years later.

“Early Humans Skipped Fruit, Went for Nuts”
Discovery News, November 9, 2009

Macho and colleague Daisuke Shimizu analyzed the teeth of Australopithecus anamensis, a hominid that lived in Africa 4.2 to 3.9 million years ago.

Based on actual tooth finds, Shimizu produced sophisticated computer models showing multiple external and internal details of the teeth. One determination was immediately clear: Unlike chimpanzees, which are fruit specialists, the hominid couldn’t have been much of a fruit-lover.

“Soft fleshy fruits tend to be acidic and do not require high bite forces to be broken down,” explained Macho. “The enamel microstructure of A. anamensis indicates that their teeth were not well equipped to cope with acid erosion, but were well adapted to masticate an abrasive and hard diet.”

The researchers therefore believe this early human ate nuts, root vegetables, insects—such as termites—and some meat. While they think certain flowering plants known as sedges might have been in the diet, Lucy and her relatives were not properly equipped for frequent leaf-chewing.

(Hat tip to Asclepius for the reference.)

Here’s the original paper:

Journal of Human Evolution Volume 57, Issue 3, September 2009, Pages 241–247
Dietary adaptations of South African australopiths: inference from enamel prism attitude
Gabriele A. Macho, Daisuke Shimizu

Unfortunately, as all we have yet found of Australopithecus anamensis are pieces of a jawbone and teeth, a fragment of a humerus, and a partial tibia (and those not even from the same individual!) we don’t know its cranial capacity. We do know that its range overlapped that of Ardipithecus—but since remains have also been found in transitional environments, it may have not been a pure forest-dweller.

Either way, it appears that our ancestors had been selected away from a fruit-based diet, and towards an omnivorous diet more compatible with savanna-dwelling, even before they left the forest.

Our Story Continues…With Footprints

This brings us to an unusual fossil find…the Laetoli footprints, left in volcanic ash 3.7 MYA, cemented by rainfall, and preserved by subsequent ashfalls. Their form and spacing show that the hominins who made them were fully bipedal: their feet had arches and an adducted big toe, and they walked at or near human walking speed.

A footprint at Laetoli
Excavating the Laetoli footprints

“Adducted” means “closer to the midline”. It means their big toe was close to their other toes, like a modern human—quite unlike the widely spaced, opposable big toe of Ardipithecus.

And though we’re not completely sure, it is generally accepted that the footprints were made by Australopithecus afarensis, the next player in our story. Here’s the original paper by Leakey and Hay, for those interested:

Nature Vol. 278, 22 March 1979, pp. 317-323
Pliocene footprints in the Laetolil Beds at Laetoli, northern Tanzania
Leakey, M. D. and Hay, R. L.

In summary, it’s clear from what we know of Ardipithecus, and Australopithecus anamensis, that bipedalism long preceded our ancestors’ move into savanna and grassland habitats. This makes sense: a clumsily-waddling knuckle-walker would stand no chance outside the safety of the forest, whereas a bipedal ape can survive in the forest so long as it retains some ability to climb trees—a talent even humans haven’t completely lost.

Furthermore, our dietary shift towards ground-based foods, and away from fruit, also preceded our ancestors’ move into savanna and grassland habitats.

Finally, and most importantly, both of these changes preceded the massive increase in our ancestors’ brain size.

The story of our origins will continue next week!

Live in freedom, live in beauty.


Paleo is the key to health, fitness, and looking good naked. And smarts! Part II.

Reblogged from Gnolls.org

Let’s Get Oriented In Time: What Does “Paleolithic” Mean?

Since we’ve been talking about the “paleo diet” for years, and this series explores the increased brain size and behavioral complexity that took place during the Paleolithic, I think it’s important to understand exactly what the term “Paleolithic” means. Yes, everyone knows that it happened a long time ago—but how long? And how is the Paleolithic different from the Pleistocene? What do all these terms mean, anyway?

First, Some Common Archaeology Terms And Abbreviations

BP = years Before Present. “The artifact was dated to 6200 BP.”
KYA (or ka) = thousands of years Before Present. “The bones were dated to 70 KYA.”
MYA (or ma) = millions of years Before Present. “The Permo-Triassic extinction occurred 250 MYA.”
industry = a technique that produced distinct and consistent tools throughout a span of archaeological time. Examples: the Acheulean industry, the Mousterian industry.


Oldowan choppers. They don’t look like much—but they were much better than fingernails or teeth at scraping meat off of bones.

The word itself is a straightforward derivation from Greek. “Paleo-” means “ancient”, and “-lithic” means “of or relating to stone”, so “Paleolithic” is just a sophisticated way to say “old rocks”. Its beginning is defined by the first stone tools known to be made by hominids, dated to approximately 2.6 MYA—the Oldowan industry—and it ends between 20,000 and 5,000 BP, with technology generally agreed to be transitional towards agriculture (the “Mesolithic” industries).


The Paleolithic age is further divided:

  • Lower Paleolithic: 2.6 MYA – 300 KYA. Defined by the Oldowan and Acheulean industries.
  • Middle Paleolithic: 300 KYA – 30 KYA. Defined primarily by the Mousterian and Aterian industries.
  • Upper Paleolithic: 50 KYA – between 20 and 5 KYA. Defined by a host of complex industries. (Click here for more information, including links to all the above terms.)

The reason for the imprecise ending of the Upper Paleolithic (and the overlap between Paleolithic stages) is not because there is doubt about the dates of such recent artifacts…it is because the Paleolithic is a technological boundary, not a temporal boundary, and is defined by the suite of tools in use. So for the first cultures to transition towards agriculture, the Paleolithic ended approximately 20 KYA (and was succeeded by the Mesolithic), whereas other cultures used Paleolithic technology until perhaps 5000 BP.

It’s also important to keep in mind that there are continuing definitional squabbles, particularly with the Mesolithic and Neolithic. What constitutes a Mesolithic culture vs. an Epipaleolithic culture? If a culture never takes up farming, is it still Neolithic if it uses similar tools and technology?

I don’t like to spend too much time in this morass, because it’s not an interesting argument—it’s just a failure to agree on definitions. However, it is always true that Paleolithic cultures were hunter-gatherers. Furthermore, it is almost always true that Neolithic cultures were farmers. (There are a few cases where nomadic cultures adopted Neolithic technology, such as pottery.)

So when we are speaking of a “Paleolithic diet”, we are speaking of a diet nutritionally analogous to the diet we ate during the Paleolithic age—the age during which selection pressure caused our ancestors to evolve from 3’6″, 65# australopithecines with 400cc brains into tall, gracile, big-brained, anatomically modern humans with 1400cc brains. (A figure which has decreased by roughly 10% during the last 5000 years.)

No, we can’t just ‘eat like a caveman’: the animals are mostly extinct and the plants have been bred into different forms. I discuss the issue at length in this article: The Paleo Identity Crisis: What Is The Paleo Diet, Anyway?


Now Let’s Orient Ourselves In Geological Time

In contrast to archaeological ages, the Pleistocene is a geological term (an “epoch”), defined precisely in time as beginning 2.588 MYA and ending 11,700 BP. It’s preceded by the Pliocene epoch (5.332 to 2.588 MYA) and followed by the Holocene epoch (11,700 BP – present).

You’ll see a lot of sources that claim the Pleistocene began 1.6 or 1.8 MYA. This is because the definition was changed in 2009 to its present date of 2.588 MYA, so as to include all of the glaciations to which I referred in Part I.

(More specifically, geological time divisions are defined by a “type section”, which is a specific place in a specific rock formation, and which is dated as precisely as possible given available technology.)

Remember, these are all just names…changing the name doesn’t alter the events of the past.

To give some idea of the time scales involved, our last common ancestor with chimps and bonobos lived perhaps 6.5 MYA, the dinosaurs died out 65.5 MYA, and Pangaea broke up 200 MYA.

Note that the middle timeline of the illustration below zooms in on the end of the top timeline, and the bottom timeline zooms in on the end of the middle timeline. Also note that the time period we’re exploring takes up one tiny box in the lower right, so small that the word “Pleistocene” doesn’t even fit inside it!

Geological timeline of the Earth, from The Economist. Click the image for a larger and more legible version, and an interesting article from The Economist.

For a slightly deeper look into the significance of each geological period, I highly recommend you click here for a graphical, interactive timeline. And here’s a long explanation of the terminology: ages, epochs, eons, and so on.

Summary: Paleolithic or Pleistocene?

The Paleolithic began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry. Since it’s defined by our knowledge of hominid tool use, these dates could change in the future.

The Pleistocene began exactly 2.588 MYA and ended 11,700 BP. These dates are defined by our best estimates of the age of two specific pieces of rock (or ice) somewhere on the Earth.

So though the two terms are measuring nearly identical spans of time, they’re defined by two completely different phenomena…and since we’re speaking of human development, it is appropriate to use the term defined by human artifacts—the Paleolithic age.

Did Sexual Selection Drive The Australopithecus -> Homo Transition?

Evolutionary psychology is great fun to read about…but the problem with extrapolating it back into the Lower and Middle Paleolithic is that it’s pure speculation. The entire fossil record of this era of hominids can be itemized on one Wikipedia page, and I think it’s extremely risky to draw behavioral conclusions so far beyond the physical evidence.

More importantly, though, it’s unnecessary to invoke sexual selection in order to explain the growth in human brain size.

“Even if the survivalist theory could take us from the world of natural history to our capacities for invention, commerce, and knowledge, it cannot account for the more ornamental and enjoyable aspects of human culture: art, music, sports, drama, comedy, and political ideals.”
-Geoffrey Miller, “The Mating Mind”

While this may very well be true, the first known archaeological evidence of art (blocks of ocher engraved with abstract designs) is dated to just 75,000 years ago, at Blombos Cave in South Africa—long after our ancestors first became anatomically modern c. 200,000 years ago. (Venus figurines are much more recent: the earliest is dated to 35 KYA.)


The first known art: carved red ocher. Click the image for more information about Blombos Cave.



The term “anatomically modern humans” refers to ancestral humans whose remains fall within the range of variations exhibited by humans today. We refer to such humans as the subspecies Homo sapiens sapiens.

Note that as with all fossil classifications, “anatomically modern” is a judgment call. There was no instant transition: a beetle-browed, heavy-limbed, archaic Homo sapiens did not suddenly give birth to Salma Hayek, and there are indeed many transitional fossils with a mix of archaic and modern features, usually known as “Early Modern Humans”.

Furthermore, the behavior of the few remaining African hunter-gatherer tribes, such as the Hadza and the Ju/wasi, supports the interpretation that sexual selection simply reinforced the same selection pressures as natural selection:

Human Nature 15:364-375.
Mate Preferences Among Hadza Hunter-Gatherers
Frank W. Marlowe

“Women placed more value on men being good foragers (85% of those women said “good hunter”) than on any other trait.”

National Geographic, December 2009
“The Hadza”
Michael Finkel

“Onwas joked to me that a Hadza man cannot marry until he has killed five baboons. […] Ngaola is quiet and introspective and a really poor hunter. He’s about 30 years old and still unmarried; bedeviled, perhaps, by the five-baboon rule.”

The Old Way: A Story Of The First People
Elizabeth Marshall Thomas

“A young man may not marry until he has killed a big game animal (preferably a large antelope, although a duiker or a steenbok will also suffice) and proved himself a hunter.”
“His [/Gunda’s] victim had been only a duiker, but a duiker is plenty big enough to qualify a boy for marriage.”
“He [≠Toma] had few living relatives and no close ones, and thus could offer her no in-laws who could help her if the need arose, but he was an excellent hunter. This would appeal to any girl. So !U nagged her parents until they consented to the marriage.”

In conclusion: the evidence is that sexual selection, if it was an important force, was providing the same selection pressure as natural selection—and that the behaviors most attributed to sexual selection postdate our evolutionary transformation into anatomically modern humans. Furthermore, it seems prudent not to invoke a factor for which our evidence is entirely speculative when there are other factors sufficient to explain our ancestors’ transformation.

Therefore, while sexual selection is a fascinating subject worthy of discussion, I don’t see a need to invoke it as a separate force to explain the increase in hominid brain size and behavioral complexity from the beginning of the Paleolithic (2.6 MYA) to the time of anatomically modern humans (200-100 KYA).

Live in freedom, live in beauty.


Paleo is the key to health, fitness, and looking good naked. And smarts!

I’m going to go ahead and just re-blog this link I used in last week’s Missing Link(s), because I think it’s vitally important to understand our evolution and how we became what we are today.

It’s also important to understand why I promote the Paleo lifestyle so passionately: It leads to vibrant health that follows through to old age. It prevents disease, sickness, injury, and poor gene expression. And it makes you look and feel like a million bucks! I believe pretty much everyone should at least try Paleo, all out, 100%, for at least 30 days. See if you don’t agree with me. Because this is what got us here. It’s what makes us thrive!

Keep in mind, this is the first part to a series. More details to come. Anyway, enough with the introduction…

Big brains require an explanation. How did humans become smarter, not just more numerous?

How did we get from this:

Australopithecus afarensis (reconstruction)


To both this…

Hadzabe hunting Marabou storks on the shore of Lake Eyasi, Tanzania

And this?

Shibuya Crossing, Tokyo

That’s more than a tripling of brain size—and an astounding increase in cultural complexity—in under 3 million years.

I’ve previously written about the currently accepted explanation, in this article: “Why Humans Crave Fat.” Here are a few bullet points:

  • Chimpanzees consume about one McDonald’s hamburger’s worth of meat each day during the dry season—mostly from colobus monkeys, which they hunt with great excitement and relish.
  • Kleiber’s Law states that metabolic rate scales with only the 3/4 power of body mass, so animals of similar mass have similar total energy budgets. Therefore, in order for our brains to grow and use more energy, something else had to shrink and use less energy.
  • It takes a much larger gut, and much more energy, to digest plant matter than it does to digest meat and fat. This is why herbivores have large, complicated guts with extra chambers (e.g. the rumen and abomasum), and carnivores have smaller, shorter, less complicated guts.
  • The caloric and nutritional density of meat allowed our mostly-frugivorous guts to shrink so that our brains could expand—and our larger brains allowed us to become better at hunting, scavenging, and making tools to help us hunt and scavenge. This positive feedback loop allowed our brains to grow from perhaps 400cc (“Lucy”, Australopithecus afarensis) to over 1500cc (late Pleistocene hunters).
  • In support of this theory, the brains of modern humans, eating a grain-based agricultural diet, have shrunk by 10% or more as compared to late Pleistocene hunters and fishers.

(For a more detailed explanation, including links, references, and illustrations, read the original article.)
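The scaling argument in those bullets can be sketched numerically. This is a back-of-the-envelope illustration, not data from the article: the constant 70 kcal/day per kg^0.75 is the conventional mammalian fit for Kleiber’s Law, and the 20% brain share is the figure commonly cited for resting humans.

```python
# Kleiber's Law: basal metabolic rate scales with the 3/4 power of body
# mass. Because the exponent is less than 1, a bigger body does NOT buy
# a proportionally bigger energy budget, which is why an expensive organ
# like the brain has to be paid for by shrinking something else.

def kleiber_bmr(mass_kg):
    """Approximate mammalian basal metabolic rate, in kcal/day."""
    return 70 * mass_kg ** 0.75

human = kleiber_bmr(70)       # roughly 1700 kcal/day for a 70 kg human
brain_budget = 0.20 * human   # the brain alone claims ~20% of that at rest

# Doubling body mass raises the budget by only 2^0.75 ~ 1.68x, not 2x:
assert kleiber_bmr(140) / kleiber_bmr(70) < 2
```

The sub-linear exponent is the crux: extra calories can fund a bigger population or a bigger body, but a disproportionately large brain requires an offsetting saving elsewhere, such as a smaller gut.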

The Teleological Error

When discussing human evolution, it’s easy to fall into the error of teleology—the idea that evolution has a purpose, of which intelligence (specifically, self-conscious intelligence recognizable to our modern philosophical traditions, and producing something recognizable to us as ‘civilization’) is the inevitable expression and end result.

Geology and archaeology prove this is not so. For instance, 140 million years of saurian dominance (far more than the 65 million years mammals have so far enjoyed) apparently failed to produce any dinosaur civilizations: they simply became bigger, faster, and meaner until the K-T asteroid hit.

Thus endeth the reign of the dinosaurs.


Likewise, the increased availability of rich, fatty, nutrient- and calorie-dense meat (enabled in large part by the usage of stone tools to deflesh bones, first practiced by our ancestors at least 2.6 million years ago, or MYA) does not, by itself, explain the more than threefold increase in human brain size which began with the Pleistocene epoch, 2.6 MYA. When a climate shift brings more rain and higher, lusher grass to the African savanna, we don’t get smarter wildebeest, or even larger wildebeest. We get more wildebeest. Neither does this increase in the prey population seem to produce smarter hyenas and lions…it produces more hyenas and lions.

Contrary to their reputation, spotted hyenas are excellent hunters, and kill more of their own prey than lions do. (Many “lion kills” were actually killed by hyenas during the night—whereupon the lions steal the kill, gorge themselves, and daybreak finds the hyenas “scavenging” the carcass they killed themselves.) One 140-pound hyena is quite capable of taking down a wildebeest by itself.

So: if the ability to deflesh bones with stone tools allowed australopithecines to obtain more food, why didn’t that simply result in an increase in the Australopithecus population? Why would our ancestors have become smarter, instead of just more numerous?

The answer, of course, lies in natural selection.

Natural Selection Requires Selection Pressure

I don’t like the phrase “survival of the fittest”, because it implies some sort of independent judging. (“Congratulations, you’re the fittest of your generation! Please accept this medal from the Darwinian Enforcement Society.”)

“Natural selection” is a more useful and accurate term, because it makes no explicit judgment of how the selection occurs, or what characteristics are selected for. Some animals live, some animals die…and of those that live, some produce more offspring than others. This is a simple description of reality: it doesn’t require anyone to provide direction or purpose, nor to judge what constitutes “fitness”.

“Natural selection” still implies some sort of active agency performing the selection (I picture a giant Mother Nature squashing the slow and stupid with her thumb)—but it’s very difficult to completely avoid intentional language when discussing natural phenomena, because otherwise we’re forced into clumsy circumlocutions and continual use of the passive voice.

(And yes, natural selection operates on plants, bacteria, and Archaea as well as on animals…it’s just clumsy to enumerate all the categories each time.)

Finally, I’m roughly equating brain size with intelligence throughout this article. This is a meaningless comparison across species, and not very meaningful for comparing individuals at a single point in time…but as behavioral complexity seems to correlate well with brain size for our ancestors throughout the Pleistocene, we can infer a meaningful relationship.

Therefore, we can see that “The availability of calorie- and nutrient-rich meat allowed our ancestors’ brains to increase in size” is not the entire story. The additional calories and nutrients could just as well have allowed us to become faster, stronger, or more numerous. For our ancestors’ brain size to increase, there must have been positive selection pressure for big brains, because big brains are metabolically expensive.

While at rest, our brains use roughly 20% of the energy required by our entire body!

In other words, the hominids with smaller brains were more likely to die, or to not leave descendants, than the hominids with larger brains.

What could have caused this selection pressure?

Ratcheting Up Selection Pressure: Climate Change and Prey Extinction

Just as “natural selection” is simply a description of reality, “selection pressure” is also a description of reality. It’s the combination of constraints that cause natural selection—by which some animals live, some die, and some reproduce more often and more successfully than others.

The selection pressure applied by one’s own species to reproductive choices—usually mate choice by females—is often called “sexual selection.” Sexual selection is, strictly speaking, part of natural selection, but it’s frequently discussed on its own because it’s so interesting and complex.

In this essay, I’m speaking primarily of the non-sexual selection parts of natural selection, for two reasons. First, because this article would expand to an unreadable size, and second, because understanding the influence of sexual selection in the Pleistocene would require an observational knowledge of behavior. Lacking time machines, anything we write is necessarily speculation.

In order for selection pressure to change, the environment of a species must change. I believe there are two strong candidate forces that would have selected for intelligence during the Pleistocene: climate change and prey extinction.

The Incredible Oscillating Polar Ice Caps: Understanding Pleistocene Climate

I’ve discussed Pleistocene climate change at length before. (Note: the Pleistocene epoch began approximately 2.6 MYA.)

“Unlike the long and consistently warm eons of the Jurassic and Cretaceous (and the Paleocene/Eocene), the Pleistocene was defined by massive climatic fluctuations, with repeated cyclic “ice ages” that pushed glaciers all the way into southern Illinois and caused sea level to rise and fall by over 100 meters, exposing and hiding several important bridges between major land masses.” –“How Glaciers Might Have Made Us Human”

Here is a chart of the estimated average surface temperature of the Earth, starting 500 MYA and ending today. Note the logarithmic time scale!


To appreciate the magnitude and severity of Pleistocene climatic oscillation, note the tiny dip in temperature towards the right labeled “Little Ice Age”. This minor shift froze over the Baltic Sea and the Thames River, caused Swiss villages to be destroyed by glaciers, wiped out the Greenland Norse colonies, and caused famines in Europe that killed from 10% to 33% of the population, depending on the country.

Furthermore, the climate was changing very quickly by geological standards. Let’s zoom in on the Quaternary period (2.6 MYA – present), of which the Pleistocene forms the overwhelming majority (until about 11,800 years ago):

Five million years of temperature estimates from ice cores.

Note that massive 41,000-year climatic oscillations, each far greater than the Little Ice Age, began approximately 2.7 MYA—and the first known stone tools made by hominids (the Oldowan industry) are dated to 2.6 MYA.

Coincidence? Perhaps not.

Genetic vs. Cultural Change

The behavior of most animals (and all plants) is primarily determined by genetic factors (“instinct”, “innate behavior”)—so in order to adapt to a changing environment, selection pressure must be exerted over many generations. For a short-lived species that produces a new generation every year, or every few years, it might be possible to adapt to a 41,000-year climate cycle via natural selection.

However, for a long-lived species like humans, with generations measured in decades, genetic change is most likely too slow to fully adapt. We would have had to move in search of the conditions we were already adapted to…

…or we would have had to alter our behavior in cultural time, not genetic time.
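To put rough numbers on this contrast, here is a back-of-the-envelope sketch. (The 41,000-year cycle length is from the text above; the ~25-year human generation time is my own assumed figure, not one the article gives.)

```python
# Back-of-the-envelope: how many generations does each species get
# per 41,000-year climate cycle in which to adapt via natural selection?
CYCLE_YEARS = 41_000  # glacial cycle length, per the text

generation_years = {
    "annual breeder": 1,   # a short-lived species: new generation every year
    "human": 25,           # assumed human generation time (my estimate)
}

for species, gen in generation_years.items():
    print(f"{species}: {CYCLE_YEARS // gen:,} generations per cycle")
# annual breeder: 41,000 generations per cycle
# human: 1,640 generations per cycle
```

Tens of thousands of generations give selection plenty of room to work; a couple of thousand give it far less—which is exactly why cultural adaptation, operating within a single generation, matters so much here.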

Culture is the ability to transfer knowledge between generations, without waiting for natural selection to kill off those unable to adapt—and it requires both general-purpose intelligence and the ability to learn and teach. While space does not permit a full discussion of these issues, I recommend the PBS documentary “Ape Genius” for an entertaining look at the differences between modern human and modern chimpanzee intelligence and learning. (And I can’t resist noting that spotted hyenas outperform chimpanzees on intelligence tests that require cooperation: more information here and here, abstract of original paper here.)

You can watch the full video of “Ape Genius” here if you are a US resident. (If not, you’ll have to find a US-based proxy server.)

However, climate change is insufficient by itself to cause the required selection pressure. The overwhelming majority of known species survived these changes—including the glacial cycles of the past 740,000 years which scoured North America down to southern Illinois on eight separate occasions—because they could approximate their usual habitat by moving. Even plants can usually disperse their seeds over enough distance to keep ahead of glaciers.

Therefore, to fully explain the selection pressures that led to modern intelligence, we must look farther…to the consequences of intelligence itself.

Look for Part 2—in which we’ll explore the relevance of all this to modern diet, nutrition, and far more—next week.

Live in freedom, live in beauty.



Easy Steps to Control Your Genes. Don’t be a victim of your genes!

It seems that the average American has a lot of bad inflammation, hormone imbalance, and poor gene expression floating around right now, causing all kinds of problems that we are programmed to just accept as “facts of life”.

Really, we are in control of most of these problems. Once you realize how interconnected each bodily system is, they get pretty simple to fix!

We can influence gene expression to a far greater degree than previously thought possible.

I’m going to utilize a few quotes from one of my favorite bloggers, Mark Sisson, to introduce this concept:

“The take home message here is that you can literally reprogram your genes to live a long, healthy, productive, happy and energetic life. You can either sit idly by and end up a victim of poor gene expression, or you can take control of the signals you send your body (through diet, movement, stress management and many other lifestyle behaviors) and become the best version of you possible. “

“while your genes are “fixed”, the expression of those genes – the amount of proteins they cause to be made, whether or not they are even switched on or off at all – depends on the “environment,” the circumstances surrounding those genes. Diet, exercise, exposure to toxic chemicals (or fresh air), medicines, even the thoughts you think (which generate actual chemical signals) all influence gene expression – positively and/or negatively, depending on the choice.”

Basically, it comes down to hormones and hormone expression. Hormones are fairly easy to manipulate, and hormones control the “on-off” switching of genes. For example, when you eat sugar, the hormone insulin is secreted, and over time gene expression moves in a direction that produces more insulin. A diet high in sugar tends to cause your system to secrete more insulin, leading to down-regulation of insulin receptors, which down-regulates lipase and other fat-burning enzymes, which in turn increases the production of pro-inflammatory cytokines.

When you change to a diet low in sugars and rich in healthy fats, those or other genes are directed to reduce inflammatory expression, down-regulate insulin-producing metabolic machinery, up-regulate insulin receptors, and rebuild cell membranes to reflect the presence of better building materials. Research in gene expression is exploding right now, examining both the impact of environmental factors and the promise of epigenetic therapies. The connection between insulin resistance and genetic expression (particularly in relation to exercise) was raised in last week’s comments. Diet and toxin exposure have been shown to influence gene expression in laboratory studies. Here are a few study abstracts to pique your interest: PubMed 1, 2, 3.

We know that genes are controlled by hormones. We also know that poor hormone expression tends to perpetuate poor gene expression, and that positive hormone expression leads to good gene expression. It simply comes down to those parts of our environment we have the most control over: what we put into our bodies, how we deal with stress, and how we move around. Once we get that part right, most everything falls into place!

To spark your interest, here’s some of the latest research into gene expression:
  • Researchers recently compared intestinal gene expression in breastfed and formula-fed infants. The intestinal tract acts as a primary site for immune response, particularly in infants, whose bodies must quickly learn to adapt to foreign foods outside the sterile womb environment. Glitches in intestinal (and related immune) development can cause food allergies, asthma, and inflammatory bowel disease. Of particular note, gene expression regulating cellular response to oxygen deprivation was more pronounced in breastfed babies, suggesting a possible reason why breastfed infants have a lower SIDS risk.
  • Prenatal exposure to common environmental toxins can induce epigenetic changes that put a child at more risk for later cancer than post-birth exposure does. The study focused particularly on polycyclic aromatic hydrocarbons (PAHs), which are associated with oil and coal burning.

The take home message here is that you can literally reprogram your genes to live a long, healthy, productive, happy and energetic life. You can either sit idly by and end up a victim of poor gene expression, or you can take control of the signals you send your body (through diet, movement, stress management and many other lifestyle behaviors) and become the best version of you possible. [I put this quote in twice on purpose]