Low-income women / Texas pulls Planned Parenthood funding / abortion rights / evolutionary thoughts

Leave it to Texas to prove its manliness by taking away some women’s rights.

From the Burlington Free Press:

“Texas officials are vowing to cut off funding for Planned Parenthood after a federal court sided with the state in a challenge over a new law that bans clinics affiliated with abortion providers from getting money through a health program for low-income women.”

From NYmag.com:

“Texas’ authority to directly regulate the content of its own program necessarily includes the power to limit the identifying marks that program grantees are authorized to use. Identifying marks represent messages,” the judges wrote. “If the organizations participating in the WHP are authorized to use marks associated with the pro-abortion point of view — like the Planned Parenthood mark — Texas’ choice to disfavor abortion is eviscerated, just as it would be if the organizations promoted abortion through pamphlets or video presentations.”

Now, apart from the abortion issue itself, this move specifically targets low-income women, AND it’s an idiotic one for Texas from a financial standpoint.

The normal cost of a first-trimester abortion runs between $350 and $550, depending on subsidies, the method used, and other variables such as cost of living. That’s without insurance paying for any of it.

If a low-income woman needs welfare, cash AND food stamps benefits (this info may be outdated by a year or two), a family of four gets about $930/month. HANDED TO THEM. Multiply that by 12 months and 18 years, and you’ve spent over $200,000. Additional children are an even greater burden for low-income families. It’s also not the best way to raise a child: kids in those circumstances are more likely to be disadvantaged.
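A quick back-of-the-envelope check of that math (a sketch which assumes the ~$930/month figure stays flat for all 18 years and ignores inflation and benefit changes):

```python
# Rough lifetime cost of cash + food stamp benefits for a family of four,
# using the ~$930/month figure cited above. Assumes the benefit level
# stays flat for all 18 years (it won't, but this is just a sanity check).
monthly_benefit = 930        # dollars per month
months_per_year = 12
years_supported = 18         # until the child reaches adulthood

total = monthly_benefit * months_per_year * years_supported
print(f"Total paid out over {years_supported} years: ${total:,}")
# -> Total paid out over 18 years: $200,880
```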

Long term, Texas is being stupid/bull-headed/ignorant. Short term, they’re a bunch of dudes who don’t believe women should have any rights.

Now, into the abortion issue.

We’re animals, through-and-through. The only thing separating us from the animals in the wild is our ability to reason.

Infanticide has been found in many species, including humans and other primates, cats, dogs, whales, rodents, insects and fish. http://www.ratbehavior.org/infanticide.htm gives the following reasons for infanticide: to gain food; to gain increased access to physical resources like food, nesting sites or space; to avoid caring for unrelated offspring; to bias the sex ratio of the litter. Adult males may kill a female’s young to increase their own chances of mating. Infanticide may also be due to aggression or to disturbances in the physical or social environment. For example, when female voles, mink and other mammals are in a state of psychological stress, they may eat their young.

It’s pretty much natural and normal to kill your offspring if you can’t properly care for it, or if it would excessively burden the quality of life of the “tribe”. It’s definitely your right to do with your body what you choose. I know I’m opening a can of worms here, but to deny our nature and where we came from, to ignore the evolutionary perspective, is pretty much to deny the fact that you are human.

If you want to get into our rights to kill, you’ll have to dig into whether or not it’s our right to kill other animals, for sport or any other reason, including preparing land for farming, building things, etc.

We kill to eat. We kill to survive, to sustain our quality of living. It’s natural. Meat: it’s what gave us big brains, and gave us the ability to reason at all. Killing made us Homo sapiens.

Is it any different to kill a baby cow than it is to kill a baby human? At what point can you say “this animal has a certain cognitive ability, therefore we ought not to kill it”? Does that mean dolphins or chimps are off-limits? Does their cognitive ability measure up close enough to ours where we can impose laws on them, to prevent them from killing their offspring? Where do you draw the line? Why are we different?

Related to and in support of this topic: Ethics of meat eating.

At what point can it be said that behavior wholly subsumed in the nature of a species of animal can be wrong, unethical to practice? To even ask the question requires an introspective, intelligent conscience—the qualitative aspect of our being that differentiates us from other animals. Because otherwise, the question we’re asking demands first that we identify and explain how ethics could arise external to our own natural experience, from some super-existent realm sporting an external authority that trumps our own individual authority over our own behavior. In simpler terms: we are ethical beings. Ethics, a sense of right and wrong, is as much a part of what makes us human as the consumption of other animals along the way made us human. It’s all baked into the cake: meat gave us the nutritional density to evolve big brains, big brains gave us the intelligence to introspect, and conscious introspection gave us ethics. Eating meat made us ethical beings. As such, eating the flesh of non-ethical beings can’t logically be unethical.

Read the whole essay [here]

So have at it in the comments section…


This is a must-read

Rapid human evolution in the past 10,000 years.

[I think it’s clear at this point that modern Europeans, and many other populations with long-term ancestral Neolithic diets, carry meaningful genetic adaptations to the Neolithic diet. However, there’s a major caveat here. The presence of adaptation does not imply that we’re completely adapted to the Neolithic diet – we may only be partially there…]

Read the rest

The most important guest post of all: Why it’s inherently ethical for humans to be able to eat meat.

At what point can it be said that behavior wholly subsumed in the nature of a species of animal can be wrong, unethical to practice? To even ask the question requires an introspective, intelligent conscience—the qualitative aspect of our being that differentiates us from other animals. Because otherwise, the question we’re asking demands first that we identify and explain how ethics could arise external to our own natural experience, from some super-existent realm sporting an external authority that trumps our own individual authority over our own behavior. In simpler terms: we are ethical beings. Ethics, a sense of right and wrong, is as much a part of what makes us human as the consumption of other animals along the way made us human. It’s all baked into the cake: meat gave us the nutritional density to evolve big brains, big brains gave us the intelligence to introspect, and conscious introspection gave us ethics. Eating meat made us ethical beings. As such, eating the flesh of non-ethical beings can’t logically be unethical.

Read the whole essay [here]

Hunter-Gatherer origins of hook-up culture vs. modern westernized social standards

One of the oldest hunter-gatherer societies still in existence, the !Kung, provides enlightening views on ancestral human sexual selection.

Mankind has spent 99% of his existence living the life of a hunter-gatherer; therefore, by getting a glimpse into the thought processes of Nisa [the main subject of this book], we simultaneously shed light on who we were at the beginning of time and how little we’ve changed despite the brief appearance of the modern conveniences of civilization.

The often-repeated theorem of evolutionary biologists, that we could not possibly have populated this planet by starting off as faithful monogamous pair bonds, is brought into clear view by Nisa’s revelations. The sexual strategies employed by our highly social ancestors were the result of hundreds of thousands of years of refinement via sexual selection. The prevalence of bawdy sexual behavior and a prevailing hookup culture on many college campuses attests to the fact that despite our western conveniences, our westernized religions, and our PC indoctrination, we have changed very little, if at all, since our emergence 100,000 years ago.

We see these two relationship dynamics arise within hunter-gatherer society:

In psychoanalytic literature, a Madonna–whore complex is the inability to maintain sexual arousal within a committed, loving relationship.[1] First identified by Sigmund Freud, this psychological complex is said to develop in men who see women as either saintly Madonnas or debased prostitutes. Men with this complex desire a sexual partner who has been degraded (the whore) while they cannot desire the respected partner (the Madonna).[2] Freud wrote: “Where such men love they have no desire and where they desire they cannot love.”[3] Clinical psychologist Uwe Hartmann, writing in 2009, stated that the complex “is still highly prevalent in today’s patients”.[2]

The view of women as either Madonnas or whores limits women’s sexual expression, offering two mutually exclusive ways to construct a sexual identity.[4] The duality implies that women must assume subservient roles, either as madonnas to be protected or as whores to be punished by men.[5]

The original experiments with rats applied the following protocol:[6] A male rat was placed into an enclosed large box with four or five female rats in heat. He immediately began to mate with all the female rats again and again until eventually, he became exhausted. The females continued nudging and licking him, yet he did not respond. When a novel female was introduced into the box, he became alert and began to mate once again with the new female. This phenomenon is not limited to common rats.[7] The Coolidge effect is attributed to an increase in dopamine levels and the subsequent effect upon an animal’s limbic system.[8]

Human males experience a post-ejaculatory refractory period after sex. They are temporarily incapable of engaging in sex with the same female after ejaculation and require time to recover full sexual function. In popular reference, the Coolidge effect is the well-documented phenomenon that the post-ejaculatory refractory period is reduced or eliminated if a novel female becomes available.[9] This effect is cited by evolutionary biologists as one reason why males are more likely to desire sex with a greater number and variety of partners than females,[9] though of course sometimes human females are known to copulate with multiple and novel partners as well.

While the Coolidge effect is usually seen demonstrated by males—that is, males displaying renewed excitement with a novel female—Lester and Gorzalka developed a model to determine whether or not the Coolidge effect also occurs in females. Their experiment, which used hamsters instead of rats, found that it does occur to a lesser degree in females.[3][4]

The fact that these hunter-gatherer humans so effectively articulate these relationship dynamics indicates that this may just be an almost inescapable side effect of long-term relationships, no matter what social norms dictate.

Paleo is the Key to smarts. Big brains require an explanation. Part IV.

Reblogged from Gnolls.org

In Part III, we established the following:

  • Bipedalism among human ancestors is associated with a dietary shift away from soft, sugar-rich fruit, and toward hard, fibrous, ground-based foods like nuts, root vegetables, insects, and mushrooms. (And perhaps some meat, though the evidence is inferential.)
  • Both bipedalism and this dietary shift occurred while our ancestors were still forest-dwellers—before we moved into savanna and grassland habitats.
  • Both bipedalism and this dietary shift preceded the massive increase in our ancestors’ brain size.
  • Therefore, neither fruit, nor potatoes, nor walking upright made us human.

Once again, I am giving what I believe to be the current consensus interpretation of the evidence…and where no consensus exists, I offer what I believe to be the most parsimonious interpretation.

(This is a multi-part series. Go back to Part I, Part II, Part III.)

A Quick Recap

4.4 million years ago, Ardipithecus ramidus still had a brain the size of a modern chimpanzee, but was a facultative biped partially adapted to a ground-based diet. By 4.1 MYA, Australopithecus anamensis had been selected for more complete dietary adaptation:

Science 2 October 2009: Vol. 326 no. 5949 pp. 69, 94-99
Paleobiological Implications of the Ardipithecus ramidus Dentition
Gen Suwa, Reiko T. Kono, Scott W. Simpson, Berhane Asfaw, C. Owen Lovejoy, Tim D. White

“Ar. ramidus lacks the postcanine megadontia of Australopithecus. Its molars have thinner enamel and are functionally less durable than those of Australopithecus but lack the derived Pan pattern of thin occlusal enamel associated with ripe-fruit frugivory. The Ar. ramidus dental morphology and wear pattern are consistent with a partially terrestrial, omnivorous/frugivorous niche.”

And the Laetoli footprints show that hominins were fully bipedal by 3.7 MYA, though we have no evidence for brain size until…

Australopithecus afarensis: Upright Gait, Smaller Body, Bigger Brain

Australopithecus afarensis lived from approximately 3.9 to 2.9 MYA. (Once again, these are human-drawn distinctions between a continuum of hominin fossils.) It was slightly shorter than Ardipithecus (3’6″) and weighed much less: 65# versus 110#. The famous “Lucy” fossil is about 40% of an A. afarensis skeleton from 3.2 MYA.

One interpretation of what Lucy might have looked like.

Additionally, its back had a double curve similar to a modern human’s; its arms were shorter than Ardipithecus’; its knees support an upright gait; and its feet had arches like ours—meaning that it was fully bipedal, and that A. afarensis is very likely the hominin which made the Laetoli footprints.

This is a recent finding: only last year did its discoverers announce that they had found a foot bone from A. afarensis which appears to settle this long-simmering question.

Science 11 February 2011: Vol. 331 no. 6018 pp. 750-753
Complete Fourth Metatarsal and Arches in the Foot of Australopithecus afarensis
Carol V. Ward, William H. Kimbel, and Donald C. Johanson

“A complete fourth metatarsal of A. afarensis was recently discovered at Hadar, Ethiopia. It exhibits torsion of the head relative to the base, a direct correlate of a transverse arch in humans. The orientation of the proximal and distal ends of the bone reflects a longitudinal arch. Further, the deep, flat base and tarsal facets imply that its midfoot had no ape-like midtarsal break. These features show that the A. afarensis foot was functionally like that of modern humans and support the hypothesis that this species was a committed terrestrial biped.”

Most importantly, A. afarensis’ brain was much larger than Ardipithecus’: 380-430cc versus 300-350cc. This means that selection pressure was favoring bigger brains as early as 4 million years ago, while allowing our ancestors’ bodies to shrink dramatically.
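To make that comparison concrete, here is a rough sketch using the midpoints of the figures above. (Real encephalization quotients correct for body mass with a power law; this simple ratio only shows the direction of the trend.)

```python
# Brain and body change from Ardipithecus to A. afarensis, using the
# midpoints of the figures quoted above (illustrative only; real
# encephalization quotients apply a power-law body-mass correction).
ardi_brain_cc, ardi_body_lb = 325, 110   # Ardipithecus ramidus
afar_brain_cc, afar_body_lb = 405, 65    # Australopithecus afarensis

print(f"Brain change: {afar_brain_cc / ardi_brain_cc - 1:+.0%}")  # about +25%
print(f"Body change:  {afar_body_lb / ardi_body_lb - 1:+.0%}")    # about -41%
print(f"Brain cc per pound of body: "
      f"{ardi_brain_cc / ardi_body_lb:.1f} -> {afar_brain_cc / afar_body_lb:.1f}")
# Roughly 3.0 -> 6.2: relative to body size, brain volume about doubled.
```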

Now we’re getting to the meat of the problem. What could have caused this selection pressure?

“Is It Just Me, Lucy, Or Is It Getting Colder?”

During the Pliocene (5.3-2.6 MYA), the Earth’s climate—though far warmer than today’s—became cooler, drier, and more seasonal (see the temperature graphs and detailed explanation in Part I), a multi-million-year trend which began with the Middle Miocene Disruption around 14.5 MYA. Consequently, African forests were shrinking, and savannas and grasslands were growing in their place.

With less forest available to live in, some number of our ancestors faced a stark choice: adapt to living outside the forest, or die out. Those that stayed in the trees became what we know today as chimpanzees and bonobos. Those that eventually left became our ancestors—the hominins.

PNAS August 17, 2004 vol. 101 no. 33 12125-12129
High-resolution vegetation and climate change associated with Pliocene Australopithecus afarensis
R. Bonnefille, R. Potts, F. Chalié, D. Jolly, and O. Peyron

“Through high-resolution pollen data from Hadar, Ethiopia, we show that the hominin Australopithecus afarensis accommodated to substantial environmental variability between 3.4 and 2.9 million years ago. A large biome shift, up to 5°C cooling, and a 200- to 300-mm/yr rainfall increase occurred just before 3.3 million years ago, which is consistent with a global marine δ18O isotopic shift.

Our results show that a diversity of biomes was available to A. afarensis. Recovery of hominin fossils through the entire stratigraphic range suggests no marked preference by A. afarensis for any single biome, including forest. Significant cooling and biome change had no obvious effect on the presence of this species through the sequence, a pattern of persistence shared by other Pliocene mammal taxa at Hadar and elsewhere (6, 27, 32). We hypothesize that A. afarensis was able to accommodate to periods of directional cooling, climate stability, and high variability.”

As we found in Part I, and as we’ve seen by the chimp-sized brains of Ardipithecus, shrinking habitat does not explain increased brain size by itself—but it does provide an incentive to find ways to live in marginal habitat, or entirely different biomes. And it’s clear that bipedalism would be an advantage in forest margins and open forests, where direct travel from tree to tree wasn’t possible. In addition, more light reaching the ground would mean more food available on the ground, versus up in the tree canopy—so bipedal ground-dwelling would have been a good survival strategy in forest habitat that was marginal for a tree-dweller.

My interpretation of the evidence is that bipedalism did not cause brain expansion, but it was a necessary precondition. It allowed our ancestors to expand beyond the forest margin—and it freed up our ancestors’ hands for other tasks, such as…

How Bipedalism Enables Tool Use, Re-Use, and Manufacture

Facultative bipeds, which cannot walk on two legs for very long, can’t carry tools around with them: they must make a tool out of whatever materials exist near the point of use, and discard it soon after. Therefore, the tools they make must remain relatively simple, since they can’t spend too much time making single-use items—and it greatly constrains the raw materials they can use. (Yes, I’m ignoring any hypothesis that gives Ardipithecus ramidus the ability to construct backpacks.)

In contrast, full bipeds can carry around their tools in anticipation of needing them, and can keep them for future use. Therefore, they can spend the time and effort to make complex, reusable tools—and they can use any raw materials they have access to, not just those near the point of use.

We know that modern chimpanzees make spears, termite sticks, and other wooden tools—but is there evidence for tool use previous to the Oldowan industry, 2.6 MYA?

Recall that the Oldowan industry marks the beginning of the Paleolithic age, and happens to coincide with the beginning of the Pleistocene epoch. (If these terms are confusing you, I explain them in Part II.)

 

Rocks, Meat, and Marrow in the Pliocene

 

Nature 466, 857–860 (12 August 2010) — doi:10.1038/nature09248
Evidence for stone-tool-assisted consumption of animal tissues before 3.39 million years ago at Dikika, Ethiopia
Shannon P. McPherron, Zeresenay Alemseged, Curtis W. Marean, Jonathan G. Wynn, Denné Reed, Denis Geraads, René Bobe, Hamdallah A. Béarat

“On the basis of low-power microscopic and environmental scanning electron microscope observations, these bones show unambiguous stone-tool cut marks for flesh removal and percussion marks for marrow access. … Established 40Ar–39Ar dates on the tuffs that bracket this member constrain the finds to between 3.42 and 3.24 Myr ago, and stratigraphic scaling between these units and other geological evidence indicate that they are older than 3.39 Myr ago.”

It’s fair to say that no one knows what to do with this particular piece of evidence, so it tends to simply get ignored or dismissed. What we know is that the researchers found several ungulate and bovid bones, dated to 3.4 MYA, which were scraped and struck by rocks. The scrapes are not natural, nor are they from the teeth of predators, and they appear to date from the same time as the bones.

One of the bones at Dikika. The reality of paleontology is far less exciting than the hypotheses it generates.

Unfortunately, no stone tools or fossil hominins were found there, so we can’t say for sure who made the marks. But the simplest interpretation is that a hominid used a rock to scrape meat off of the bones of large prey animals, and to break them open for marrow.

It is likely that the reason this evidence isn’t more widely accepted is that the researchers make one huge assumption: that the scrape marks were made by deliberately fashioned stone tools, 800,000 years before the first evidence we have of stone tool manufacture—even though no such tools were found.

I believe the most parsimonious interpretation is that the scrape marks were indeed made by Australopithecus afarensis using one of the naturally-occurring volcanic rocks found in abundance in the area. Given the slow pace of technological change (millions of years passed between major changes in stone tool manufacture, and that’s for later hominins with much larger brains than A. afarensis), it would be extremely surprising if naturally-occurring sharp rocks hadn’t been used for millions of years before any hominin thought to deliberately make them sharper—

It’s Not Just The Discovery…It’s The Teaching And The Learning

—and, more importantly, before their children were able to learn the trick, understand why it was important, and pass it on to their own children.

Those of you who were able to watch the documentary “Ape Genius”, to which I linked in Part I, understand that intelligence isn’t enough to create culture. In order for culture to develop, the next generation must learn behavior from their parents and conspecifics, not by discovering it themselves—and they must pass it on to their own children. Chimpanzees can learn quite a few impressive skills…but they have little propensity to teach others, and young chimps apparently don’t understand the fundamental concept that “when I point my finger, I want you to pay attention to what I’m pointing at, not to me.”

So: the developmental plasticity to learn is at least as important as the intelligence to discover. Otherwise, each generation has to make all the same discoveries all over again. It is theorized that this plasticity is related to our less-aggressive nature compared to chimpanzees…but that’s a whole other topic for another time.

In conclusion, the Dikika evidence pushes meat-eating and stone tool-using (though not stone tool-making) back to at least 3.4 MYA, well into the Pliocene. And though we’re not sure whether that meat was obtained by hunting, scavenging, or both, we can add it to the other foods that we’re reasonably sure formed its diet to produce the following menu:

The Paleo Diet For Australopithecus afarensis

Eat all you can find of:

  • Nuts
  • Root vegetables
  • Insects
  • Mushrooms
  • Meat (particularly bone marrow)

Eat sparingly:

  • Fruit (your tooth enamel won’t withstand the acids)
  • Foliage (your teeth aren’t shaped correctly for leaf-chewing)

In other words, A. afarensis was most likely eating a diet within the existing range of modern ancestral diets—3.4 million years ago.

The only major addition to this diet previous to the appearance of anatomically modern humans is the gathering of shellfish, known from middens dated to 140 KYA at Blombos Cave.

Our Takeaway (so far)

  • Our ancestors’ dietary shift towards ground-based foods, and away from fruit, did not cause an increase in our ancestors’ brain size.
  • Bipedalism was necessary to allow an increase in our ancestors’ brain size, but did not cause the increase by itself.
  • Bipedalism allowed A. afarensis to spread beyond the forest, and freed its hands to carry tools. This coincided with a 20% increase in brain size from Ardipithecus, and a nearly 50% drop in body mass.
  • Therefore, the challenges of obtaining food in evolutionarily novel environments (outside the forest) most likely selected for intelligence, quickness, and tool use, and de-emphasized strength.
  • By 3.4 MYA, A. afarensis was most likely eating a paleo diet: recognizable, edible, and nutritious to modern humans.
  • The only new item was large animal meat (including bone marrow), which is more calorie- and nutrient-dense than any other food on the list—especially in the nutrients (e.g. animal fats, cholesterol) which make up the brain.
  • Therefore, the most parsimonious interpretation of the evidence is that the abilities to live outside the forest, and thereby to somehow procure meat from large animals, provided the selection pressure for larger brains during the middle and late Pliocene.

Live in freedom, live in beauty.

JS


Paleo is the key to health, fitness, and looking good naked. And smarts! Part III

Reblogged from Gnolls.org

My take-home point from this article:

The result of OFT (optimal foraging theory) is, as one might hope, common sense: our ancestors would have eaten the richest, most accessible foods first.

 

Our Story So Far

  • It is not enough to state that the availability of high-quality food allowed our ancestors’ brains to increase in volume from ~400cc to ~1500cc between 2.6-3 MYA and 100-200 KYA. We must explain the selection pressures that caused our brains to more than triple in size—instead of simply allowing us to increase our population, or to become faster or stronger.
  • To gloss over this explanation is a teleological error. It assumes that evolution has a purpose, which is to create modern humans.
  • Climate change is most likely a factor—but it is insufficient, by itself, to create this selection pressure.
  • The Paleolithic is an age defined by the use of stone tools (“industries”) to assist in hunting and gathering. It began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry.
  • The Pleistocene began exactly 2.588 MYA and ended 11,700 BP, and is defined by the age of specific rock (or ice) formations.
  • Therefore, if we wish to locate an event precisely in time, we need to speak in terms of geological time—the Pliocene and Pleistocene epochs. If we wish to identify an event relative to human technological capability, we need to speak of cultural time—the Paleolithic age.
  • Sexual selection is a fascinating subject, but I see no need to invoke it to explain the increase in hominid brain size from the start of the Paleolithic to the rise of anatomically modern humans.

A Timeline Of Facts, A Narrative To Join Them

The factual knowledge we have about human behavior (including diet) during the Pleistocene is limited by the physical evidence we’ve discovered so far—which becomes thinner the farther back in time we go. Therefore, any narrative we construct from these facts must necessarily remain contingent on future discoveries.

However, the evidence we have strongly supports the currently accepted hypothesis for the evolution of human intelligence. I’ll do my best to compress several semesters of anthropology and evolutionary theory into a timeline that tells our ancestors’ story.

First, a key concept: in order to explain a more than tripling of brain size over nearly 3 million years, a single event is not sufficient. It’s not enough to say “Hunting is hard, so we had to get smarter.” We must postulate a sequence of events—one which creates the most parsimonious narrative from the physical evidence.

“Parsimonious” means “stingy” or “frugal”. It is frequently used by scientists as part of the phrase “the most parsimonious hypothesis/theory/explanation”, which means “the explanation which requires the least speculation and depends on the fewest unknowns.” (Also see: Occam’s razor.)

Before we start our narrative, we must define one more term: optimal foraging theory.

Optimal Foraging Theory

Optimal foraging theory (OFT) is a simple concept: “…Decisions are made such that the net rate of energy capture is maximized.” (Sheehan 2004)

This is because efficiency—obtaining more food for less effort—is rewarded by natural selection. Efficient foragers survive better during difficult times, and they spend less time exposed to the risks of foraging. This leaves them more likely to survive, and with more time to seek mates, raise offspring, or simply rest.

In the simplest case, herbivores select the most nutritious plants, and predators select the fattest, slowest herbivores. However, many complicated behaviors result from application of this simple rule. Two examples: for herbivores, leaving the herd costs energy and makes being eaten by a carnivore more likely; for predators, unsuccessful hunts cost energy and make starvation more likely.

Due to time and space constraints, we’re barely scratching the surface of OFT. This article provides a brief introduction, and Wikipedia goes into more detail—including many refinements to the basic model. For an in-depth exploration, including several interesting and complex behaviors resulting entirely from its real-world application, read this textbook chapter (PDF).

The result of OFT is, as one might hope, common sense: our ancestors would have eaten the richest, most accessible foods first.
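For the curious, here is a minimal sketch of the classic “diet breadth” (prey choice) model from OFT. Nothing below comes from the article itself, and all the numbers are invented for illustration: foods are ranked by profitability (energy per unit of handling time), and a lower-ranked food joins the optimal diet only if its profitability beats the rate the forager already achieves without it.

```python
# Classic "diet breadth" (prey choice) model from optimal foraging theory.
# Each food: (name, energy per item in kcal, handling minutes per item,
# encounter rate in items per minute of searching). All numbers invented.
foods = [
    ("nuts",       30,  2.0, 0.20),
    ("roots",      60,  6.0, 0.10),
    ("insects",     5,  0.5, 0.50),
    ("mushrooms",  15,  1.0, 0.15),
    ("carcass",  2000, 90.0, 0.002),
    ("foliage",     2,  2.0, 1.00),
]

# Rank foods by profitability: energy gained per minute of handling.
foods.sort(key=lambda f: f[1] / f[2], reverse=True)

def rate(diet):
    """Long-run energy capture rate (kcal/min): gains over search + handling."""
    gain = sum(lam * e for _, e, h, lam in diet)
    time = 1 + sum(lam * h for _, e, h, lam in diet)  # 1 min of search time
    return gain / time

# Include foods in rank order while each inclusion improves the overall rate.
diet = []
for name, e, h, lam in foods:
    if e / h > rate(diet):            # the classic inclusion rule
        diet.append((name, e, h, lam))

print("Optimal diet:", [f[0] for f in diet], f"-> {rate(diet):.1f} kcal/min")
# Foliage's profitability (1 kcal/min) is below the ~8 kcal/min the forager
# already earns, so it is ignored: eat the richest, most accessible foods first.
```

The punchline matches the common-sense result above: a food drops out of the diet not because it becomes scarce, but because time spent on it is better spent on richer options.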

Our Story Begins On Two Legs: Ardipithecus ramidus

Our story begins in an African forest during the Pliocene epoch, 4.4 million years ago. (Our ancestors have already parted ways with the ancestors of chimpanzees and bonobos. This occurred perhaps 6.5 MYA, in the late Miocene.)

The Miocene epoch lasted from 23 MYA to 5.3 MYA. The Pliocene epoch lasted from 5.33 to 2.59 MYA, and the Pleistocene lasted from 2.59 MYA to 11,700 BP.

It’s important to note that many different hominins existed throughout the Pliocene and Pleistocene. We aren’t absolutely certain which were directly ancestral to modern humans, and which represented stem species that subsequently died out…but the fossil record is complete enough that we’re unlikely to dig up anything which radically changes this narrative.

Though there are fascinating fossil finds which date even earlier (e.g. Orrorin), we’ll begin with Ardipithecus ramidus, a resident of what is now Ethiopia in the mid-Pliocene, 4.4 MYA. Today it’s the Afar desert—but in the Pliocene, its habitat was a lush woodland which occasionally flooded.

What Ardipithecus ramidus might have looked like. Click the picture for a BBC article.

“Ardi” was about four feet tall, with a brain the size of a modern chimpanzee (300-350cc). She was most likely what we call a facultative biped, meaning that she walked on four legs while in trees, and on two legs while on the ground: though her pelvis was adapted to walking upright, her big toe was still opposable and she had no arches, leaving her feet better adapted to gripping trees than to walking or running.

You can learn much more about Ardi at Discovery.com’s extensive and informative (though Flash-heavy and somewhat hyperbolic) website. For those with less patience or slow Internet connections, this NatGeo article contains a discussion of Ardi’s importance and possible means of locomotion. (Warning: both contain some highly speculative evolutionary psychology.)

From the evidence, we know that there must have been selection pressure to sacrifice tree-climbing ability in exchange for improved bipedal locomotion—most likely due to an increased ability to take advantage of ground-based foods. Though evidence is thin, its discoverers think (based on its teeth) that Ardi consumed a similar diet to its successor Australopithecus anamensis—nuts, root vegetables, insects, mushrooms, and some meat. (This supports the idea that Ardi ate more ground-based food, such as root vegetables and mushrooms, and less tree-based food, such as fruit.) And stable isotope analysis of its tooth enamel confirms that Ardipithecus was a forest species, lacking significant dietary input from grasses or animals that ate grasses.

Fruit Is For The Birds (And The Bats, And The Chimps): Australopithecus anamensis

Our next data point comes just a few hundred thousand years later.

“Early Humans Skipped Fruit, Went for Nuts”
Discovery News, November 9, 2009

Macho and colleague Daisuke Shimizu analyzed the teeth of Australopithecus anamensis, a hominid that lived in Africa 4.2 to 3.9 million years ago.

Based on actual tooth finds, Shimizu produced sophisticated computer models showing multiple external and internal details of the teeth. One determination was immediately clear: Unlike chimpanzees, which are fruit specialists, the hominid couldn’t have been much of a fruit-lover.

“Soft fleshy fruits tend to be acidic and do not require high bite forces to be broken down,” explained Macho. “The enamel microstructure of A. anamensis indicates that their teeth were not well equipped to cope with acid erosion, but were well adapted to masticate an abrasive and hard diet.”

The researchers therefore believe this early human ate nuts, root vegetables, insects—such as termites—and some meat. While they think certain flowering plants known as sedges might have been in the diet, Lucy and her relatives were not properly equipped for frequent leaf-chewing.

(Hat tip to Asclepius for the reference.)

Here’s the original paper:

Journal of Human Evolution Volume 57, Issue 3, September 2009, Pages 241–247
Dietary adaptations of South African australopiths: inference from enamel prism attitude
Gabriele A. Macho, Daisuke Shimizu

Unfortunately, as all we have yet found of Australopithecus anamensis are pieces of a jawbone and teeth, a fragment of a humerus, and a partial tibia (and those not even from the same individual!), we don’t know its cranial capacity. We do know that its range overlapped that of Ardipithecus—but since remains have also been found in transitional environments, it may not have been a pure forest-dweller.

Either way, it appears that our ancestors had been selected away from a fruit-based diet, and towards an omnivorous diet more compatible with savanna-dwelling, even before they left the forest.

Our Story Continues…With Footprints

This brings us to an unusual fossil find…the Laetoli footprints, left in volcanic ash 3.7 MYA, cemented by rainfall, and preserved by subsequent ashfalls. Their form and spacing show that the hominins who made them were fully bipedal: their feet had arches and an adducted big toe, and they walked at or near human walking speed.

A footprint at Laetoli. Excavating the Laetoli footprints.

“Adducted” means “closer to the midline”. It means their big toe was close to their other toes, like a modern human—quite unlike the widely spaced, opposable big toe of Ardipithecus.

And though we’re not completely sure, it is generally accepted that the footprints were made by Australopithecus afarensis, the next player in our story. Here’s the original paper by Leakey and Hay, for those interested:

Nature Vol. 278, 22 March 1979, pp. 317-323
Pliocene footprints in the Laetolil Beds at Laetoli, northern Tanzania
Leakey, M. D. and Hay, R. L.

In summary, it’s clear from what we know of Ardipithecus, and Australopithecus anamensis, that bipedalism long preceded our ancestors’ move into savanna and grassland habitats. This makes sense: a clumsily-waddling knuckle-walker would stand no chance outside the safety of the forest, whereas a bipedal ape can survive in the forest so long as it retains some ability to climb trees—a talent even humans haven’t completely lost.

Furthermore, our dietary shift towards ground-based foods, and away from fruit, also preceded our ancestors’ move into savanna and grassland habitats.

Finally, and most importantly, both of these changes preceded the massive increase in our ancestors’ brain size.

The story of our origins will continue next week!

Live in freedom, live in beauty.

JS


Paleo is the key to health, fitness, and looking good naked. And smarts! Part II.

Reblogged from Gnolls.org

Let’s Get Oriented In Time: What Does “Paleolithic” Mean?

Since we’ve been talking about the “paleo diet” for years, and this series explores the increased brain size and behavioral complexity that took place during the Paleolithic, I think it’s important to understand exactly what the term “Paleolithic” means. Yes, everyone knows that it happened a long time ago—but how long? And how is the Paleolithic different from the Pleistocene? What do all these terms mean, anyway?

First, Some Common Archaeology Terms And Abbreviations

BP = years Before Present. “The artifact was dated to 6200 BP.”
KYA (or ka) = thousands of years Before Present. “The bones were dated to 70 KYA.”
MYA (or ma) = millions of years Before Present. “The Permo-Triassic extinction occurred 250 MYA.”
industry = a technique that produced distinct and consistent tools throughout a span of archaeological time. Examples: the Acheulean industry, the Mousterian industry.

 

Oldowan choppers. They don’t look like much—but they were much better than fingernails or teeth at scraping meat off of bones.

The word itself is a straightforward derivation from Greek. “Paleo-” means “ancient”, and “-lithic” means “of or relating to stone”, so “Paleolithic” is just a sophisticated way to say “old rocks”. Its beginning is defined by the first stone tools known to be made by hominids, dated to approximately 2.6 MYA—the Oldowan industry—and it ends between 20,000 and 5,000 BP, with technology generally agreed to be transitional towards agriculture (the “Mesolithic” industries).

 

The Paleolithic age is further divided:

  • Lower Paleolithic: 2.6 MYA – 300 KYA. Defined by the Oldowan and Acheulean industries.
  • Middle Paleolithic: 300 KYA – 30 KYA. Defined primarily by the Mousterian and Aterian industries.
  • Upper Paleolithic: 50 KYA – between 20 and 5 KYA. Defined by a host of complex industries. (Click here for more information, including links to all the above terms.)

The reason for the imprecise ending of the Upper Paleolithic (and the overlap between Paleolithic stages) is not because there is doubt about the dates of such recent artifacts…it is because the Paleolithic is a technological boundary, not a temporal boundary, and is defined by the suite of tools in use. So for the first cultures to transition towards agriculture, the Paleolithic ended approximately 20 KYA (and was succeeded by the Mesolithic), whereas other cultures used Paleolithic technology until perhaps 5000 BP.

It’s also important to keep in mind that there are continuing definitional squabbles, particularly with the Mesolithic and Neolithic. What constitutes a Mesolithic culture vs. an Epipaleolithic culture? If a culture never takes up farming, is it still Neolithic if it uses similar tools and technology?

I don’t like to spend too much time in this morass, because it’s not an interesting argument—it’s just a failure to agree on definitions. However, it is always true that Paleolithic cultures were hunter-gatherers. Furthermore, it is almost always true that Neolithic cultures were farmers. (There are a few cases where nomadic cultures adopted Neolithic technology, such as pottery.)

So when we are speaking of a “Paleolithic diet”, we are speaking of a diet nutritionally analogous to the diet we ate during the Paleolithic age—the age during which selection pressure caused our ancestors to evolve from 3’6″, 65# australopithecines with 400cc brains into tall, gracile, big-brained, anatomically modern humans with 1400cc brains. (A figure which has decreased by roughly 10% during the last 5000 years.)

No, we can’t just ‘eat like a caveman’: the animals are mostly extinct and the plants have been bred into different forms. I discuss the issue at length in this article: The Paleo Identity Crisis: What Is The Paleo Diet, Anyway?

 

Now Let’s Orient Ourselves In Geological Time

In contrast to archaeological ages, the Pleistocene is a geological term (an “epoch”), defined precisely in time as beginning 2.588 MYA and ending 11,700 BP. It’s preceded by the Pliocene epoch (5.332 to 2.588 MYA) and followed by the Holocene epoch (11,700 BP – present).

You’ll see a lot of sources that claim the Pleistocene began 1.6 or 1.8 MYA. This is because the definition was changed in 2009 to its present date of 2.588 MYA, so as to include all of the glaciations to which I referred in Part I.

(More specifically, geological time divisions are defined by a “type section”, which is a specific place in a specific rock formation, and which is dated as precisely as possible given available technology.)

Remember, these are all just names…changing the name doesn’t alter the events of the past.

To give some idea of the time scales involved, our last common ancestor with chimps and bonobos lived perhaps 6.5 MYA, the dinosaurs died out 65.5 MYA, and Pangaea broke up 200 MYA.

Note that the middle timeline of the illustration below zooms in on the end of the top timeline, and the bottom timeline zooms in on the end of the middle timeline. Also note that the time period we’re exploring takes up one tiny box in the lower right, so small that the word “Pleistocene” doesn’t even fit inside it!

Geological timeline of the Earth, from The Economist. Click the image for a larger and more legible version, and an interesting article from The Economist.

For a slightly deeper look into the significance of each geological period, I highly recommend you click here for a graphical, interactive timeline. And here’s a long explanation of the terminology: ages, epochs, eons, and so on.

Summary: Paleolithic or Pleistocene?

The Paleolithic began approximately 2.6 MYA, with the first known stone tools, and ended between 20 KYA and 5 KYA, depending on when the local culture adopted a Mesolithic or Neolithic industry. Since it’s defined by our knowledge of hominid tool use, these dates could change in the future.

The Pleistocene began exactly 2.588 MYA and ended 11,700 BP. These dates are defined by our best estimates of the age of two specific pieces of rock (or ice) somewhere on the Earth.

So though the two terms are measuring nearly identical spans of time, they’re defined by two completely different phenomena…and since we’re speaking of human development, it is appropriate to use the term defined by human artifacts—the Paleolithic age.

Did Sexual Selection Drive The Australopithecus -> Homo Transition?

Evolutionary psychology is great fun to read about…but the problem with extrapolating it back into the Lower and Middle Paleolithic is that it’s pure speculation. The entire fossil record of this era of hominids can be itemized on one Wikipedia page, and I think it’s extremely risky to draw behavioral conclusions so far beyond the physical evidence.

More importantly, though, it’s unnecessary to invoke sexual selection in order to explain the growth in human brain size.

“Even if the survivalist theory could take us from the world of natural history to our capacities for invention, commerce, and knowledge, it cannot account for the more ornamental and enjoyable aspects of human culture: art, music, sports, drama, comedy, and political ideals.”
-Geoffrey Miller, “The Mating Mind”

While this may very well be true, the first known archaeological evidence of art (blocks of ocher engraved with abstract designs) is dated to just 75,000 years ago, at Blombos Cave in South Africa—long after our ancestors first became anatomically modern c. 200,000 years ago. (Venus figurines are much more recent: the earliest is dated to 35 KYA.)

 

The first known art: carved red ocher. Click the image for more information about Blombos Cave.

 

 

The term “anatomically modern humans” refers to ancestral humans whose remains fall within the range of variations exhibited by humans today. We refer to such humans as the subspecies Homo sapiens sapiens.

Note that as with all fossil classifications, “anatomically modern” is a judgment call. There was no instant transition: a beetle-browed, heavy-limbed, archaic Homo sapiens did not suddenly give birth to Salma Hayek, and there are indeed many transitional fossils with a mix of archaic and modern features, usually known as “Early Modern Humans”.

Furthermore, the behavior of the few remaining African hunter-gatherer tribes, such as the Hadza and the Ju/wasi, supports the interpretation that sexual selection simply reinforced the same selection pressures as natural selection:

Human Nature 15:364-375.
Mate Preferences Among Hadza Hunter-Gatherers
Frank W. Marlowe

“Women placed more value on men being good foragers (85% of those women said “good hunter”) than on any other trait.”

National Geographic, December 2009
“The Hadza”
Michael Finkel

“Onwas joked to me that a Hadza man cannot marry until he has killed five baboons. […] Ngaola is quiet and introspective and a really poor hunter. He’s about 30 years old and still unmarried; bedeviled, perhaps, by the five-baboon rule.”

The Old Way: A Story Of The First People
Elizabeth Marshall Thomas

“A young man may not marry until he has killed a big game animal (preferably a large antelope, although a duiker or a steenbok will also suffice) and proved himself a hunter.”
     …
“His [/Gunda’s] victim had been only a duiker, but a duiker is plenty big enough to qualify a boy for marriage.”
     …
“He [≠Toma] had few living relatives and no close ones, and thus could offer her no in-laws who could help her if the need arose, but he was an excellent hunter. This would appeal to any girl. So !U nagged her parents until they consented to the marriage.”

In conclusion: the evidence is that sexual selection, if it was an important force, was providing the same selection pressure as natural selection—and that the behaviors most attributed to sexual selection postdate our evolutionary transformation into anatomically modern humans. Furthermore, it seems prudent not to invoke a factor for which our evidence is entirely speculative when there are other factors sufficient to explain our ancestors’ transformation.

Therefore, while sexual selection is a fascinating subject worthy of discussion, I don’t see a need to invoke it as a separate force to explain the increase in hominid brain size and behavioral complexity from the beginning of the Paleolithic (2.6 MYA) to the time of anatomically modern humans (200-100 KYA).

Live in freedom, live in beauty.

JS

Paleo is the key to health, fitness, and looking good naked. And smarts!

I’m going to go ahead and just re-blog this link I used in last week’s Missing Link(s), because I think it’s vitally important to understand our evolution and how we became what we are today.

It’s also important to understand why I promote the Paleo lifestyle so passionately: it leads to vibrant health that follows through to old age. It prevents disease and poor gene expression. It prevents sickness and injury. And it makes you look and feel like a million bucks! I believe pretty much everyone should at least try Paleo, all out, 100%, for 30 days AT LEAST. See if you don’t agree with me. Because this is what got us here. It’s what makes us thrive!

Keep in mind, this is the first part of a series. More details to come. Anyway, enough with the introduction…

Big brains require an explanation. How did humans become smarter, not just more numerous?

How did we get from this:

Australopithecus afarensis (reconstruction)

 

To both this…

Hadzabe hunting Marabou storks on the shore of Lake Eyasi, Tanzania.

And this?

Shibuya Crossing, Tokyo

That’s more than a tripling of brain size—and an astounding increase in cultural complexity—in under 3 million years.

I’ve previously written about the currently accepted explanation, in this article: “Why Humans Crave Fat.” Here are a few bullet points:

  • Chimpanzees consume about one McDonald’s hamburger’s worth of meat each day during the dry season—mostly from colobus monkeys, which they hunt with great excitement and relish.
  • Kleiber’s Law states that metabolic rate scales at only the 3/4 power of body mass, so animals of similar mass have similar metabolic rates. Therefore, in order for our brains to grow and use more energy, something else had to shrink and use less energy. (See the sketch below this list.)
  • It takes a much larger gut, and much more energy, to digest plant matter than it does to digest meat and fat. This is why herbivores have large, complicated guts with extra chambers (e.g. the rumen and abomasum), and carnivores have smaller, shorter, less complicated guts.
  • The caloric and nutritional density of meat allowed our mostly-frugivorous guts to shrink so that our brains could expand—and our larger brains allowed us to become better at hunting, scavenging, and making tools to help us hunt and scavenge. This positive feedback loop allowed our brains to grow from perhaps 400cc (“Lucy”, Australopithecus afarensis) to over 1500cc (late Pleistocene hunters).
  • In support of this theory, the brains of modern humans, eating a grain-based agricultural diet, have shrunk by 10% or more as compared to late Pleistocene hunters and fishers.

(For a more detailed explanation, including links, references, and illustrations, read the original article.)
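Here is the sketch promised in the list above, putting rough numbers on Kleiber’s Law. The 3/4 exponent is from the article; the ~70 kcal/day coefficient is Kleiber’s classic fit for mammals and should be treated as approximate:

```python
# Kleiber's Law: basal metabolic rate scales with the 3/4 power of body mass.
# BMR ~ 70 * mass_kg ** 0.75 kcal/day (Kleiber's classic fit; approximate).
def kleiber_bmr(mass_kg: float) -> float:
    return 70 * mass_kg ** 0.75

for name, kg in [("mouse", 0.02), ("human", 65.0), ("elephant", 4000.0)]:
    bmr = kleiber_bmr(kg)
    print(f"{name:9s} {kg:8.2f} kg  ~{bmr:8.0f} kcal/day  "
          f"({bmr / kg:6.1f} kcal/kg/day)")
# Doubling mass raises total energy use by only 2 ** 0.75 ~ 1.68x, so the
# per-kilogram budget shrinks as animals get bigger. With total energy fixed
# by body size, a bigger brain has to be paid for by shrinking something else.
```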

The Teleological Error

When discussing human evolution, it’s easy to fall into the error of teleology—the idea that evolution has a purpose, of which intelligence (specifically, self-conscious intelligence recognizable to our modern philosophical traditions, and producing something recognizable to us as ‘civilization’) is the inevitable expression and end result.

Geology and archaeology prove this is not so. For instance, 140 million years of saurian dominance (far more than the 65 million years mammals have so far enjoyed) apparently failed to produce any dinosaur civilizations: they simply became bigger, faster, and meaner until the K-T asteroid hit.

Thus endeth the reign of the dinosaurs.

 

Likewise, the increased availability of rich, fatty, nutrient- and calorie-dense meat (enabled in large part by the usage of stone tools to deflesh bones, first practiced by our ancestors at least 2.6 million years ago, or MYA) does not, by itself, explain the over threefold increase in human brain size which began with the Pleistocene epoch, 2.6 MYA. When a climate shift brings more rain and higher, lusher grass to the African savanna, we don’t get smarter wildebeest, or even larger wildebeest. We get more wildebeest. Neither does this increase in the prey population seem to produce smarter hyenas and lions…it produces more hyenas and lions.

Contrary to their reputation, spotted hyenas are excellent hunters, and kill more of their own prey than lions do. (Many “lion kills” were actually killed by hyenas during the night—whereupon the lions steal the kill, gorge themselves, and daybreak finds the hyenas “scavenging” the carcass they killed themselves.) One 140-pound hyena is quite capable of taking down a wildebeest by itself.

So: if the ability to deflesh bones with stone tools allowed australopithecines to obtain more food, why didn’t that simply result in an increase in the Australopithecus population? Why would our ancestors have become smarter, instead of just more numerous?

The answer, of course, lies in natural selection.

Natural Selection Requires Selection Pressure

I don’t like the phrase “survival of the fittest”, because it implies some sort of independent judging. (“Congratulations, you’re the fittest of your generation! Please accept this medal from the Darwinian Enforcement Society.”)

“Natural selection” is a more useful and accurate term, because it makes no explicit judgment of how the selection occurs, or what characteristics are selected for. Some animals live, some animals die…and of those that live, some produce more offspring than others. This is a simple description of reality: it doesn’t require anyone to provide direction or purpose, nor to judge what constitutes “fitness”.

“Natural selection” still implies some sort of active agency performing the selection (I picture a giant Mother Nature squashing the slow and stupid with her thumb)—but it’s very difficult to completely avoid intentional language when discussing natural phenomena, because otherwise we’re forced into clumsy circumlocutions and continual use of the passive voice.

(And yes, natural selection operates on plants, bacteria, and Archaea as well as on animals…it’s just clumsy to enumerate all the categories each time.)

Finally, I’m roughly equating brain size with intelligence throughout this article. This is a meaningless comparison across species, and not very meaningful for comparing individuals at a single point in time…but as behavioral complexity seems to correlate well with brain size for our ancestors throughout the Pleistocene, we can infer a meaningful relationship.

Therefore, we can see that “The availability of calorie- and nutrient-rich meat allowed our ancestors’ brains to increase in size” is not the entire story. The additional calories and nutrients could just as well have allowed us to become faster, stronger, or more numerous. For our ancestors’ brain size to increase, there must have been positive selection pressure for big brains, because big brains are metabolically expensive.

While at rest, our brains use roughly 20% of the energy required by our entire body!

In other words, the hominids with smaller brains were more likely to die, or to not leave descendants, than the hominids with larger brains.
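To put a number on that 20% (a sketch; the ~1,600 kcal/day resting metabolic rate is a typical modern-adult estimate I’m assuming, not a figure from this article):

```python
# Rough resting energy cost of the modern human brain.
resting_kcal_per_day = 1600   # assumed typical adult resting metabolic rate
brain_share = 0.20            # brain's share of resting energy use (from above)
brain_mass_share = 0.02       # brain is roughly 2% of body mass

brain_kcal = resting_kcal_per_day * brain_share
print(f"Brain cost at rest: ~{brain_kcal:.0f} kcal/day")         # ~320 kcal/day
print(f"Energy share vs. mass share: {brain_share / brain_mass_share:.0f}x")
# ~10x: per kilogram, brain tissue burns about ten times its fair share.
```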

What could have caused this selection pressure?

Ratcheting Up Selection Pressure: Climate Change and Prey Extinction

Just as “natural selection” is simply a description of reality, “selection pressure” is also a description of reality. It’s the combination of constraints that cause natural selection—by which some animals live, some die, and some reproduce more often and more successfully than others.

The selection pressure applied by one’s own species to reproductive choices—usually mate choice by females—is often called “sexual selection.” Sexual selection is, strictly speaking, part of natural selection, but it’s frequently discussed on its own because it’s so interesting and complex.

In this essay, I’m speaking primarily of the non-sexual selection parts of natural selection, for two reasons. First, because this article would expand to an unreadable size, and second, because understanding the influence of sexual selection in the Pleistocene would require an observational knowledge of behavior. Lacking time machines, anything we write is necessarily speculation.

In order for selection pressure to change, the environment of a species must change. I believe there are two strong candidate forces that would have selected for intelligence during the Pleistocene: climate change and prey extinction.

The Incredible Oscillating Polar Ice Caps: Understanding Pleistocene Climate

I’ve discussed Pleistocene climate change at length before. (Note: the Pleistocene epoch began approximately 2.6 MYA.)

“Unlike the long and consistently warm eons of the Jurassic and Cretaceous (and the Paleocene/Eocene), the Pleistocene was defined by massive climatic fluctuations, with repeated cyclic “ice ages” that pushed glaciers all the way into southern Illinois and caused sea level to rise and fall by over 100 meters, exposing and hiding several important land bridges between major land masses.” –“How Glaciers Might Have Made Us Human”

Here is a chart of the estimated average surface temperature of the Earth, starting 500 MYA and ending today. Note the logarithmic time scale!

Click image for larger version.

To appreciate the magnitude and severity of Pleistocene climatic oscillation, note the tiny dip in temperature towards the right labeled “Little Ice Age”. This minor shift froze over the Baltic Sea and the Thames River, caused Swiss villages to be destroyed by glaciers, wiped out the Greenland Norse colonies, and caused famines in Europe which killed from 10% to 33% of the population, depending on the country.

Furthermore, the climate was changing very quickly by geological standards. Let’s zoom in on the Quaternary period (2.6 MYA – present), of which the Pleistocene forms the overwhelming majority (up to 11,700 years ago):

5 million years of temperature estimates from ice cores. Cool! Click image for larger version.

Note that massive 41,000-year climatic oscillations, each far greater than the Little Ice Age, began approximately 2.7 MYA—and the first known stone tools made by hominids (the Oldowan industry) are dated to 2.6 MYA.

Coincidence? Perhaps not.

Genetic Vs. Cultural Change

The behavior of most animals (and all plants) is primarily determined by genetic factors (“instinct”, “innate behavior”)—so in order to adapt to a changing environment, selection pressure must be exerted over many generations. For a short-lived species which produces a new generation every year, or every few years, it might be possible to adapt to a 41,000-year climate cycle via natural selection.

However, for a long-lived species like humans, with generations measured in decades, genetic change is most likely too slow to fully adapt. We would have had to move in search of conditions that remained as we were adapted to…

…or we would have had to alter our behavior in cultural time, not genetic time.
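To see why generation length matters here, count the generations natural selection gets within a single 41,000-year cycle (a sketch; the generation times are rough assumptions):

```python
# Generations available to natural selection per 41,000-year climate cycle,
# for species with different (roughly assumed) generation times.
cycle_years = 41_000

for species, gen_years in [("vole", 0.5), ("chimpanzee", 20.0), ("human", 25.0)]:
    print(f"{species:11s} ~{cycle_years / gen_years:8,.0f} generations per cycle")
# vole        ~  82,000 generations per cycle
# chimpanzee  ~   2,050 generations per cycle
# human       ~   1,640 generations per cycle
# A human lineage gets orders of magnitude fewer rolls of the genetic dice
# per cycle, which is why behavioral (cultural) change was the faster path.
```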

Culture is the ability to transfer knowledge between generations, without waiting for natural selection to kill off those unable to adapt—and it requires both general-purpose intelligence and the ability to learn and teach. While space does not permit a full discussion of these issues, I recommend the PBS documentary “Ape Genius” for an entertaining look at the differences between modern human and modern chimpanzee intelligence and learning. (And I can’t resist noting that spotted hyenas outperform chimpanzees on intelligence tests that require cooperation: more information here and here, abstract of original paper here.)

You can watch the full video of “Ape Genius” here if you are a US resident. (If not, you’ll have to find a US-based proxy server.)

However, climate change is insufficient by itself to cause the required selection pressure. The overwhelming majority of known species survived these changes—including the glacial cycles of the past 740,000 years which scoured North America down to southern Illinois on eight separate occasions—because they could approximate their usual habitat by moving. Even plants can usually disperse their seeds over enough distance to keep ahead of glaciers.

Therefore, to fully explain the selection pressures that led to modern intelligence, we must look farther…to the consequences of intelligence itself.

Look for Part 2—in which we’ll explore the relevance of all this to modern diet, nutrition, and far more—next week.

Live in freedom, live in beauty.

JS

[Credit]