Was there a "magic moment" when truly human beings first appeared?

The following post (which I’ve deliberately written in “devil’s advocate” mode) is a condensed version of a post I recently put up over at The Skeptical Zone, arguing against the Christian claim that there was a “magic moment” when the first truly human beings suddenly appeared. This condensed post is specially intended for Dr. William Lane Craig, who I know has a keen interest in the question of when truly human beings first appeared, and who tentatively identifies Adam and Eve with the first couple of the species Homo heidelbergensis (Heidelberg man), which is thought by many scientists to have been ancestral to Denisovan man, Neanderthal man and Homo sapiens. However, all readers are welcome to contribute their views in the comments below. Anyway, here goes.

KEY POINTS

The famous Bald Man paradox, which was first posed by the Greek philosopher Eubulides of Miletus, goes like this:

A man with a full head of hair is obviously not bald. Now the removal of a single hair will not turn a non-bald man into a bald one. And yet it is obvious that a continuation of that process must eventually result in baldness.

The paradox arises because the word “bald,” like a host of other adjectives (“tall,” “rich,” “blue” and so on), is a vague one. Most scientists would say that the word “human” is equally vague, and, like the word “bald,” has no clear-cut boundaries. On this view, there never was a “first human” in the hominin lineage leading to modern humans, for the same reason that there never was a definite moment at which Prince William went bald. Evolutionary biologist Richard Dawkins has explained this point with admirable lucidity.

Or as Charles Darwin succinctly put it in his work, The Descent of Man, and Selection in Relation to Sex (1871, London: John Murray, Volume 1, 1st edition, Chapter VII, p. 235):

“Whether primeval man, when he possessed very few arts of the rudest kind, and when his power of language was extremely imperfect, would have deserved to be called man, must depend on the definition which we employ. In a series of forms graduating insensibly from some ape-like creature to man as he now exists it would be impossible to fix on any definite point when the term ‘man’ ought to be used.”

However, Judaism, Christianity and Islam all insist on a clear, black-and-white division between human beings and other animals. Humans are made in the image and likeness of God; beasts are not. Humans have spiritual, immortal souls that are made to enjoy eternity with their Creator, in Heaven; beasts will never go to Heaven. (Even Christians who believe in some sort of immortality for animals nevertheless acknowledge that only humans will get to behold God’s glory, face-to-face.)

There are moral and political differences between humans and other animals, as well. Humans have inalienable rights, and in particular, an inviolable right to life; beasts, on the other hand, may be killed for food in times of necessity. (Indeed, most Christians would say that animals may be killed for food at any time.) Humans, especially when they are mature and fully developed, are morally responsible for their actions; beasts are not. We don’t sue chimps, as we don’t consider them liable for their actions, even when they do nasty things like kill individuals from neighboring communities, because we presume they can’t help it: they are merely acting on innate tendencies. And for the same reason, we believe God doesn’t punish them for the destructive things they do. There is no hell for chimpanzees – even vicious ones that cannibalize infants (as some do).

Finally, human beings are believed to possess certain special powers which other animals lack. For some Christians, such as Aquinas, what distinguishes humans from other animals is the godlike faculty of reason; for others, such as John Wesley, it is not reason, but the ability to know, love and serve God that makes us special. However, all Christians agree that humans are in a league of their own, mentally and spiritually, and that they have certain powers which the beasts lack. In other words, there is a clear-cut distinction, on a metaphysical level, between man and beast.

What this means is that even Christians who believe in evolution are nonetheless mind creationists, to borrow a term from the philosopher Daniel Dennett, who used it in his book, Darwin’s Dangerous Idea (Simon & Schuster, 1995) and his more recent paper, “Darwin’s ‘strange inversion of reasoning’” (PNAS, June 2009, 106 (Supplement 1) 10061-10065) to refer to thinkers (both theistic and atheistic) who refuse to accept that the human mind is the product of a blind, algorithmic process: natural selection. Christians believe that on a spiritual level, humanity literally sprang into existence overnight, due to the creative action of God. That is, truly human beings, who were made in the image and likeness of God, came into existence at a fixed point in time – a “magic moment,” if you like. However, anthropologists find this picture wildly implausible, for two main reasons: first, although human brainpower has increased fairly rapidly over the past four million years, science has not identified any “sudden spurts” in the evolution of the brain (unless you call 300,000 years sudden); and second, there are no less than ten distinct human abilities that could be used to draw the line between man and beast, but it turns out that all of them emerged at different points in time (so which one do you pick?), and in any case, nearly all of them (including language) emerged gradually, over many thousands of years.

Let’s go back to the first reason. Science has not identified any “sudden spurts” in the evolution of the human brain over the past four million years. While there appear to have been a couple of periods of accelerated evolution, these lasted for no less than 300,000 years. As far as we can tell from the fossil record, there were no overnight improvements in human intelligence.

Of course, we need to find a good yardstick to measure the brain’s information-processing capacity. Just as a high-capacity computer needs a lot of power to stay running, so too, a human brain needs a high metabolic rate, in order to continue functioning normally. Now, for any species of animal, the brain’s metabolic rate is mainly related to the energetic cost of the activity occurring in its synapses. For that reason, metabolic rate is widely thought to be a better measure of an animal’s cognitive ability than simply measuring its brain size.

It turns out that human brains have a pretty high metabolic rate: indeed, the human brain uses no less than 20% of the body’s energy, despite the fact that it makes up only 2% of the body’s mass. If we look at other primates, we find that apart from some small primates, which are known to have a high brain mass to body mass ratio (for example, ring-tailed lemurs [average weight: 2.2 kg] and pygmy marmosets [0.1 kg]), the brain of a typical primate uses only 8 to 10% of its body’s energy, while for most other mammals, it’s just 3 to 5%. So, what about the brains of human ancestors? How much energy did they use, and what were their metabolic rates? We need no longer speculate about these questions; we have the answers. As Roger Seymour, Emeritus Professor of Physiology at the University of Adelaide, Australia, explains in an online article on Real Clear Science titled “How Smart Were Our Ancestors? Blood Flow Provides a Clue” (January 27, 2020), we now possess a handy metric for measuring the metabolic rate of the brains of human ancestors over the last several million years. In a nutshell: the arteries that carry blood to the brain pass through small holes in the base of the skull. Bigger holes mean bigger arteries and more blood to power the brain. By measuring the size of the holes in the base of the skulls of fossil humans, we can estimate the rate of blood flow to their brains, which in turn tells us how much information they were capable of processing, just as the size of its cables indicates how much information a computer is capable of processing.
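To make the logic of this metric concrete, here is a minimal sketch in Python of the kind of calculation involved. The scaling constant, the exponent and the radii below are illustrative assumptions only, not the fitted values or measurements from Seymour's study:

```python
# Sketch: estimating relative brain blood flow from carotid canal size.
# The constant k and the exponent are illustrative placeholders, NOT the
# values fitted by Seymour and colleagues.

def blood_flow_cm3_per_s(lumen_radius_mm: float, k: float = 1.2,
                         exponent: float = 3.4) -> float:
    """Blood flow rate, assuming it scales as a power of arterial lumen radius."""
    return k * lumen_radius_mm ** exponent

# Hypothetical lumen radii (mm), as if inferred from the bony canals of three taxa.
radii = {
    "Ardipithecus": 0.8,
    "Australopithecus": 1.0,
    "Homo sapiens": 1.7,
}

flows = {taxon: blood_flow_cm3_per_s(r) for taxon, r in radii.items()}
for taxon, q in flows.items():
    print(f"{taxon}: ~{q:.1f} cm^3/s")
```

Because flow grows much faster than linearly with radius under any such power law, even a modest widening of the carotid canal implies a large jump in the brain's energy budget, which is why canal size makes such a sensitive proxy.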

Professor Seymour and his team performed these measurements for Ardipithecus, various species of Australopithecus, Homo habilis, Homo erectus and his descendant, Heidelberg man, who is believed by some experts to be the ancestor of both Neanderthal man and Homo sapiens. (Others think it was Homo antecessor, who was older and somewhat smaller-brained than Heidelberg man, but whose face was more like ours. Unfortunately, we don’t yet have a complete skull of this species.)

Seymour’s 2019 study, which was conducted with colleagues at the Evolutionary Studies Institute of the University of the Witwatersrand in South Africa and reported in Proceedings of the Royal Society B (13 November 2019, https://doi.org/10.1098/rspb.2019.2208), found that for 4.4-million-year-old Ardipithecus, the internal carotid artery blood flow was less than 1 cubic centimeter per second, or about one third that of a modern chimpanzee. That suggests it wasn’t too bright. What about Australopithecus? Although Australopithecus had a brain bigger than a chimp’s, and about the size of a gorilla’s (despite having a much lighter body), it turns out that the brain of Australopithecus had only two-thirds the carotid artery blood flow of a chimp’s brain, and half that of a gorilla’s. Seymour concludes that Australopithecus was probably less intelligent than a living chimpanzee or gorilla. How about Homo habilis? Its carotid artery blood flow was about the same as a modern chimpanzee’s, but less than a gorilla’s, at just under 3 cubic centimeters per second. For early Homo erectus, which appeared only 500,000 years after Homo habilis, it was about 4.5 cubic centimeters per second (compared to about 3.5 for a gorilla), while for late Homo erectus, it was about 6. Surprisingly, it was a little less than 6 for Heidelberg man, who’s widely considered to be the next species on the lineage leading to modern man. And for Neanderthal man and Homo sapiens, it was around 8 cubic centimeters per second, suggesting that the Neanderthals’ intelligence roughly matched ours.

Additionally, in a 2016 paper titled, “From Australopithecus to Homo: the transition that wasn’t” (Philosophical Transactions of the Royal Society B vol. 371, issue 1698, https://doi.org/10.1098/rstb.2015.0248), authors William H. Kimbel and Brian Villmoare take aim at the idea that the transition from Australopithecus to Homo was a momentous one, arguing instead that “the expanded brain size, human-like wrist and hand anatomy [97,98], dietary eclecticism [99] and potential tool-making capabilities of ‘generalized’ australopiths root the Homo lineage in ancient hominin adaptive trends, suggesting that the ‘transition’ from Australopithecus to Homo may not have been that much of a transition at all.” In Figure 5 of their article, the authors graph early hominin brain sizes (endocranial volumes, or ECVs) over time, from 3.2 to 1.0 million years ago, for various specimens of Australopithecus (labeled as A), early Homo (labeled H) and Homo erectus (labeled E). From the graph, it can be readily seen that there is a considerable overlap in brain size between Homo erectus and early Homo, shattering the myth of a quantum leap between the two species. Kimbel and Villmoare add that “brain size in early Homo is highly variable—even within fairly narrow time bands—with some early H. erectus crania (e.g. D4500) falling into the Australopithecus range” and conclude that “a major ‘grade-level’ leap in brain size with the advent of H. erectus is probably illusory.”

Finally, a 2018 article titled “Pattern and process in hominin brain size evolution are scale-dependent” by Andrew Du et al. (Proceedings of the Royal Society B 285:20172738, http://doi.org/10.1098/rspb.2017.2738) provides clinching evidence against any sudden spurts in brain size. In the article, the authors make use of endocranial volume (ECV), which they refer to as “a reliable proxy for brain size in fossils.” Looking at hominins overall, they find that “the dominant signal is consistent with a gradual increase in brain size,” adding that this gradual trend “appears to have been generated primarily by processes operating within hypothesized lineages,” rather than at the times when new species emerged. Du et al. considered various possible models of ECV change over time for hominins, including random walk, gradualism, stasis, punctuated equilibrium, stasis combined with random walk, and stasis combined with gradualism. What they found was that gradualism was the best model for explaining the trends observed.

Readers who would like more details can find them here.

In the light of these findings, Christian apologists need to squarely address the question: “Where do you draw the line between true human beings and their bestial forebears?” The fact is, there isn’t a good place to draw one. If you want to say that only Neanderthals and Homo sapiens were truly human, then an awkward consequence follows: their common ancestor, Heidelberg man, wasn’t human, which means that God created two distinct races of intelligent beings – or three if you include Denisovan man, another descendant of Heidelberg man. (Currently, we don’t have any complete skulls of Denisovan man.) Two or three races of intelligent beings? That doesn’t comport with what the Bible teaches or with what the Christian Church has taught, down the ages: only one race of beings (human beings) was made in God’s image (see for instance Genesis 3:20, Malachi 2:10 and Acts 17:26). If you insist that Heidelberg man must have been human as well, then you also have to include late Homo erectus, whose brain had a metabolic rate equal to that of Heidelberg man. But if you are willing to concede that late Homo erectus was truly human, then why not early Homo erectus, who belonged to the same species, after all? However, if you include early Homo erectus within your definition of “truly human,” then you have to address the question: why are you setting the bar so low, by including a species that was not much smarter than a gorilla when it first appeared, used only pebble tools for the first 200,000 years of its existence (from 1.9 to 1.7 million years ago), and only gradually became smarter, over a period of one-and-a-half million years?

Having shown that the anthropological evidence seems to favor a gradual increase in human intelligence, I’d now like to address the second reason why scientists are highly skeptical of the hypothesis that there was a “magic moment” at which our ancestors became human, in the true sense of the word: namely, that the archaeological evidence seems to tell a different story. When scientists examine the archaeological record for signs of the emergence of creatures with uniquely human abilities, what they find is that there are no less than ten distinct abilities that could be used to draw the line between humans and their pre-human forebears. However, it turns out that these ten abilities emerged at different points in time, and what’s more, most of them emerged gradually, leaving no room for a “magic moment” when the first true human beings appeared. I refer to this as the “Ten Adams” problem. [Please note that the “ten Adams” whom I refer to below are purely hypothetical figures, intended to designate the inventors (whoever they were) of ten cultural breakthroughs that changed our lives as human beings.]

I’m going to give each of these Adams a special name: first, Acheulean Adam, the maker of Acheulean hand-axes and other Mode II tools; second, Fire-controller Adam, who was able to not only make opportunistic use of the power of fire, but also control it; third, Aesthetic Adam, who was capable of fashioning elegantly symmetrical and finely finished tools; fourth, Geometrical Adam, who carved abstract geometrical patterns on pieces of shell; fifth, Spear-maker Adam, who hunted big game with stone-tipped spears – a feat which required co-operative, strategic planning; sixth, Craftsman Adam, who was capable of fashioning a wide variety of tools, known as Mode III tools, using highly refined techniques; seventh, Modern or Symbolic Adam, who was capable of abstract thinking, long-range planning and behavioral innovation, and who decorated himself with jewelry, an indication of symbolic behavior; eighth, Linguistic Adam, the first to use human language; ninth, Ethical Adam, the first hominin to display genuine altruism; and tenth, Religious Adam, the first to worship a Reality higher than himself.

Here’s a short summary of my findings, in tabular form. [Note for the benefit of my non-scientist readers: hominins are defined as creatures such as Ardipithecus, Australopithecus, Paranthropus and Homo, which belong to the lineage of primates that broke away from the chimpanzee line and led eventually to modern humans, although many branches died out along the way, without leaving any descendants.]

The TEN ADAMS

1. Acheulean Adam, the maker of Acheulean hand-axes.
First species: Homo ergaster (Africa), Homo erectus (Eurasia). (Hand-axes were later used by Heidelberg man and even early Homo sapiens.)
When: 1.76 million years ago in Africa; over 350,000 years later in Eurasia. By 1 million years ago, the shape and size of the tools were carefully planned, with a specific goal in mind. [N.B. Recently, a study using brain-imaging techniques has shown that hominins were probably taught how to make Acheulean hand-axes by non-verbal instructions, rather than by using language.]

2. Fire-controller Adam, the first hominin to control fire.
First species: Homo ergaster (Africa), Homo erectus (Eurasia).
When: 1 million years ago (control of fire; opportunistic use of fire goes back 1.5 million years); 800,000 to 400,000 years ago (ability to control fire on a regular and habitual basis; later in Europe). The date of the ability to manufacture fire is unknown, but it is possibly less than 100,000 years old, as the Neanderthals evidently lacked this capacity.

3. Aesthetic Adam, the first to make undeniably aesthetic objects.
First species: Late Homo ergaster/erectus.
When: 750,000-800,000 years ago (first elegantly symmetric hand-axes; sporadic); 500,000 years ago (production of aesthetic hand-axes on a regular basis).

4. Geometrical Adam, the maker of the first geometrical designs.
First species: Late Homo erectus.
When: 540,000 years ago (zigzags); 350,000-400,000 years ago (parallel and radiating lines); 290,000 years ago (cupules, or circular cup marks carved in rocks); 70,000-100,000 years ago (cross-hatched designs).

5. Spear-maker Adam, the maker of stone-tipped spears used to hunt big game.
First species: Heidelberg man.
When: 500,000 years ago (first stone-tipped spears; wooden spears are at least as old, if not older); 300,000 years ago (projectile stone-tipped spears, which could be thrown); 70,000 years ago (compound adhesives used for hafting stone tips onto a wooden shaft).

6. Craftsman Adam, the maker of Mode III tools, which require highly refined techniques to manufacture.
First species: Heidelberg man (first appearance); Homo sapiens and Neanderthal man (production on a regular basis).
When: 500,000-615,000 years ago (first appearance; sporadic); 320,000 years ago (production on a regular basis).

7. Modern or Symbolic Adam, the first hominin to engage in modern human behavior (broadly defined) or, more narrowly, symbolic behavior.
First species: Homo sapiens and Neanderthal man (modern human behavior, broadly defined); Homo sapiens and Neanderthal man (symbolic behavior, in the narrow sense).
When: 300,000 years ago (modern human behavior – i.e. abstract thinking; planning depth; behavioral, economic and technological innovativeness; and possibly, symbolic cognition); 130,000 years ago (symbolic behavior, in the narrow sense). (Note: the pace of technical and cultural innovation appears to have picked up between 40,000 and 50,000 years ago, probably for demographic reasons: an increase in the population increased the flow of ideas.)

8. Linguistic Adam, the first hominin to use language, whether broadly defined as the ability to make an infinite number of sentences or, more narrowly, as hierarchical syntactical structure.
First species: Heidelberg man(?), Homo sapiens and Neanderthal man (language in the broad sense); Homo sapiens (language in the narrow sense).
When: 500,000 years ago (language in the broad sense: sounds are assigned definite meanings, but words can be combined freely to make an infinite number of possible sentences); 70,000 to 200,000 years ago (language in the narrow sense: hierarchical syntactical structure). Note: however you define language, it almost certainly did not appear overnight (see here and here).

9. Ethical Adam, the first hominin to display genuine altruism (long-term care of sick individuals) and self-sacrifice for the good of the group.
First species: Homo ergaster (altruism); late Homo ergaster/erectus or Heidelberg man (self-sacrifice).
When: Altruism: 1,500,000 years ago (long-term care of seriously ill adults); at least 500,000 years ago (care for children with serious congenital abnormalities). Self-sacrifice for the good of the group: up to 700,000 years ago.

10. Religious Adam, the first hominin to have a belief in the supernatural.
First species: Homo sapiens.
When: 90,000 to 35,000 years ago (belief in an afterlife); 35,000 to 11,000 years ago (worship of gods and goddesses). (N.B. As these ideas and beliefs are found in virtually all human societies, they must presumably go back at least 70,000 years, to when the last wave of Homo sapiens left Africa.)

Readers who would like more details are welcome to go here.

To sum up: whether we look at the evidence from prehistoric human brains, or the tools and implements made by prehistoric people, or the origin of language, or the appearance of religion and morality, one conclusion seems inescapable: there was no magic moment at which humans first appeared. And if that’s right, the Christian doctrine of the Fall of our first parents (Adam and Eve) goes out the window, because we didn’t have a set of first parents, or even a first human community of (say) 10,000 people, from whom we are all descended. And without the Fall, what becomes of the doctrine of the Redemption?

Finally, recent attempts by Christian theologians to evade the force of this problem by redefining the “image of God” not in terms of our uniquely human abilities (the substantive view of the Divine image), but in terms of our being in an active relationship with God (the relational view) or our special calling to rule over God’s creation (the vocational or functional view) completely fail, because neither our relationship with God nor our domination over other creatures defines what it means to be human: they are both consequences of being human. What’s more, humans only possess these qualities because of underlying abilities which make them possible (e.g. the ability to form a concept of God or the ability to invent technologies that give us control over nature), so we are back with the substantive view, which, as we have seen, supports gradualism. The verdict is clear: contrary to Christian teaching, no sharp line can be drawn between man and beast, which means that the Biblical story of a first human being is a myth, with no foundation in reality. And if there was no first Adam of any kind, then the Christian portrayal of Jesus as the second Adam (1 Corinthians 15:45-49) makes no sense, either.

Human origins therefore pose a potentially fatal problem to the credibility of Christianity – a problem to which Christian apologists have paid insufficient attention. They ignore it at their peril.

Postscript: Was Adam an apatheist?

In a recent panel discussion at the Human Origins Workshop 2020, hosted by Dr. Anjeanette Roberts and attended by Dr. Steve Schaffner, Dr. Nathan Lents and Dr. Fazale Rana, Christian apologist Dr. William Lane Craig cited a paper by Dr. Sally McBrearty and Dr. Alison Brooks, arguing that modern human behavior did not suddenly emerge 40,000 to 50,000 years ago, but actually goes back to the African Middle Stone Age, around 300,000 years ago. Dr. Craig urged Dr. Rana to acknowledge this fact. I was pleased to see that Dr. Craig is au fait with the literature on the subject, and I personally agree with his view that truly human beings have existed for hundreds of thousands of years. However, I would also point out that, as far as I have been able to ascertain, religion (as opposed to mere ritual behavior) goes back no more than 100,000 years. That means that if the first true humans emerged at least 300,000 years ago, then for more than two-thirds of human history, humans have been areligious – that is, devoid of any belief, disbelief or even interest in the supernatural. People like this are sometimes called apatheists.

There’s a beautiful passage in Chapter XIX of Huckleberry Finn, where Huck describes the joys of living on a raft and floating down the Mississippi, with Jim, and discusses whether the stars were made or “just happened”:

“We had the sky up there, all speckled with stars, and we used to lay on our backs and look up at them, and discuss about whether they was made or only just happened. Jim he allowed they was made, but I allowed they happened; I judged it would have took too long to make so many. Jim said the moon could a laid them; well, that looked kind of reasonable, so I didn’t say nothing against it, because I’ve seen a frog lay most as many, so of course it could be done. We used to watch the stars that fell, too, and see them streak down. Jim allowed they’d got spoiled and was hove out of the nest.”

It would appear that the earliest true humans were not capable of a conversation like that. Humans were behaviorally modern, as well as being moral, long before they were religious.
