Was there a "magic moment" when truly human beings first appeared?

The following post (which I’ve deliberately written in “devil’s advocate” mode) is a condensed version of a post I put up over at The Skeptical Zone recently, arguing against the Christian claim that there was a “magic moment” when the first truly human beings suddenly appeared. This condensed post is specially intended for Dr. William Lane Craig, who I know has a keen interest in the question of when truly human beings first appeared, and who tentatively identifies Adam and Eve with the first couple of the species Homo heidelbergensis (Heidelberg man), which many scientists think was ancestral to Denisovan man, Neanderthal man and Homo sapiens. However, all readers are welcome to contribute their views in the comments below. Anyway, here goes.

KEY POINTS

The famous Bald Man paradox, which was first posed by the Greek philosopher Eubulides of Miletus, goes like this:

A man with a full head of hair is obviously not bald. Now the removal of a single hair will not turn a non-bald man into a bald one. And yet it is obvious that a continuation of that process must eventually result in baldness.

The paradox arises because the word “bald,” like a host of other adjectives (“tall,” “rich,” “blue” and so on), is a vague one. Most scientists would say that the word “human” is equally vague and, like the word “bald,” has no clear-cut boundaries. On this view, there never was a “first human” in the hominin lineage leading to modern humans, for the same reason that there never was a definite moment at which Prince William went bald. Evolutionary biologist Richard Dawkins explains this point with admirable lucidity here.

Or as Charles Darwin succinctly put it in his work, The Descent of Man, and Selection in Relation to Sex (1871, London: John Murray, Volume 1, 1st edition, Chapter VII, p. 235):

“Whether primeval man, when he possessed very few arts of the rudest kind, and when his power of language was extremely imperfect, would have deserved to be called man, must depend on the definition which we employ. In a series of forms graduating insensibly from some ape-like creature to man as he now exists it would be impossible to fix on any definite point when the term ‘man’ ought to be used.”
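To make the sorites structure concrete, here is a toy sketch in Python (the threshold of 50,000 hairs is an arbitrary number I’ve picked purely for illustration). It shows that the moment we force a sharp definition of “bald,” the paradox’s inductive premise fails at exactly one point, and that point has nothing to recommend it over its neighbors:

```python
# Toy illustration of the sorites paradox: any sharp definition of "bald"
# falsifies the inductive premise at exactly one arbitrary hair count.

BALD_THRESHOLD = 50_000  # hypothetical cutoff; nothing privileges this number

def is_bald(hair_count: int) -> bool:
    """A sharp (non-vague) predicate: bald iff below the cutoff."""
    return hair_count < BALD_THRESHOLD

# Inductive premise: "removing a single hair never turns a non-bald man bald."
# With a sharp predicate, it fails at exactly one place:
violations = [n for n in range(1, 100_000)
              if not is_bald(n) and is_bald(n - 1)]
print(violations)  # [50000] -- a single arbitrary "magic moment"
```

The parallel with “human” should be obvious: any sharp definition forces a single, arbitrary generation at which a non-human mother gives birth to a human child.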

However, Judaism, Christianity and Islam all insist on a clear, black-and-white division between human beings and other animals. Humans are made in the image and likeness of God; beasts are not. Humans have spiritual, immortal souls that are made to enjoy eternity with their Creator, in Heaven; beasts will never go to Heaven. (Even Christians who believe in some sort of immortality for animals nevertheless acknowledge that only humans will get to behold God’s glory, face-to-face.)

There are moral and political differences between humans and other animals, as well. Humans have inalienable rights, and in particular, an inviolable right to life; beasts, on the other hand, may be killed for food in times of necessity. (Indeed, most Christians would say that animals may be killed for food at any time.) Humans, especially when they are mature and fully developed, are morally responsible for their actions; beasts are not. We don’t sue chimps, as we don’t consider them liable for their actions, even when they do nasty things like kill individuals from neighboring communities, because we presume they can’t help it: they are merely acting on innate tendencies. And for the same reason, we believe God doesn’t punish them for the destructive things they do. There is no hell for chimpanzees – even vicious ones that cannibalize infants (as some do).

Finally, human beings are believed to possess certain special powers which other animals lack. For some Christians, such as Aquinas, what distinguishes humans from other animals is the godlike faculty of reason; for others, such as John Wesley, it is not reason, but the ability to know, love and serve God that makes us special. However, all Christians agree that humans are in a league of their own, mentally and spiritually, and that they have certain powers which the beasts lack. In other words, there is a clear-cut distinction, on a metaphysical level, between man and beast.

What this means is that even Christians who believe in evolution are nonetheless mind creationists, to borrow a term from the philosopher Daniel Dennett, who used it in his book Darwin’s Dangerous Idea (Simon & Schuster, 1995) and his more recent paper, “Darwin’s ‘strange inversion of reasoning’” (PNAS, June 2009, 106 (Supplement 1), 10061-10065), to refer to thinkers (both theistic and atheistic) who refuse to accept that the human mind is the product of a blind, algorithmic process: natural selection. Christians believe that on a spiritual level, humanity literally sprang into existence overnight, due to the creative action of God. That is, truly human beings, who were made in the image and likeness of God, came into existence at a fixed point in time – a “magic moment,” if you like.

However, anthropologists find this picture wildly implausible, for two main reasons: first, although human brainpower has increased fairly rapidly over the past four million years, science has not identified any “sudden spurts” in the evolution of the brain (unless you call 300,000 years sudden); and second, there are no fewer than ten distinct human abilities that could be used to draw the line between man and beast, but it turns out that they all emerged at different points in time (so which one do you pick?), and in any case, nearly all of them (including language) emerged gradually, over many thousands of years.

Let’s go back to the first reason. Science has not identified any “sudden spurts” in the evolution of the human brain over the past four million years. While there appear to have been a couple of periods of accelerated evolution, these lasted for no less than 300,000 years. As far as we can tell from the fossil record, there were no overnight improvements in human intelligence.

Of course, we need a good yardstick for measuring the brain’s information-processing capacity. Just as a high-capacity computer needs a lot of power to stay running, so too a human brain needs a high metabolic rate in order to keep functioning normally. Now, for any species of animal, the brain’s metabolic rate is mainly determined by the energetic cost of the activity occurring in its synapses. For that reason, metabolic rate is widely thought to be a better measure of an animal’s cognitive ability than brain size alone.

It turns out that human brains have a pretty high metabolic rate: indeed, the human brain uses no less than 20% of the body’s energy, despite making up only 2% of the body’s mass. If we look at other primates, we find that apart from some small primates known to have a high brain mass to body mass ratio (for example, ring-tailed lemurs [average weight: 2.2 kg] and pygmy marmosets [0.1 kg]), the brain of a typical primate uses only 8 to 10% of its body’s energy, while for most other mammals, it’s just 3 to 5%. So, what about the brains of human ancestors? How much energy did they use, and what were their metabolic rates? We need no longer speculate about these questions; we have the answers. As Roger Seymour, Emeritus Professor of Physiology at the University of Adelaide, Australia, explains in an online article on Real Clear Science titled “How Smart Were Our Ancestors? Blood Flow Provides a Clue” (January 27, 2020), we now possess a handy metric for estimating the metabolic rate of the brains of human ancestors over the last several million years. In a nutshell: the arteries that carry blood to the brain pass through small holes in the base of the skull. Bigger holes mean bigger arteries and more blood to power the brain. By measuring the size of the holes in the base of the skulls of fossil humans, we can estimate the rate of blood flow to their brains, which in turn tells us how much information they were capable of processing – just as the size of its cables indicates how much information a computer can handle.
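Seymour’s team derived their skull-hole-to-flow calibration empirically, and I won’t attempt to reproduce it here. But the underlying logic is simple enough to sketch in a few lines of Python. In the toy version below, flow is assumed to scale with the cube of the arterial lumen radius (a Murray’s-law-style relation), and both the constant K and the lumen fraction are placeholders that I have tuned so that a modern-human-sized carotid canal lands near the roughly 8 cubic centimeters per second reported below for Homo sapiens:

```python
# Toy sketch of the foramen-size method (NOT Seymour's published calibration).
# Assumption: arterial flow scales with the cube of the lumen radius,
# a Murray's-law-style relation: Q = K * r**3.

K = 1000.0            # placeholder calibration constant (cm^3/s per cm^3)
LUMEN_FRACTION = 0.6  # assumed ratio of arterial lumen radius to canal radius

def carotid_flow_estimate(canal_diameter_cm: float) -> float:
    """Estimate internal carotid blood flow (cm^3/s) from canal diameter."""
    lumen_radius = (canal_diameter_cm / 2) * LUMEN_FRACTION
    return K * lumen_radius ** 3

# With these placeholder values, a canal ~6.5 mm across yields ~7.4 cm^3/s,
# in the ballpark of the ~8 cm^3/s quoted below for Homo sapiens.
print(round(carotid_flow_estimate(0.65), 1))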

Professor Seymour and his team performed these measurements for Ardipithecus, various species of Australopithecus, Homo habilis, Homo erectus and his descendant, Heidelberg man, who’s believed by some experts to be the ancestor of both Neanderthal man and Homo sapiens. (Others think it was Homo antecessor [see also here], who was older and somewhat smaller-brained than Heidelberg man, but whose face was more like ours. Unfortunately, we don’t yet have a complete skull of this species.)

Seymour’s 2019 study, conducted with colleagues at the Evolutionary Studies Institute of the University of the Witwatersrand in South Africa and reported in Proceedings of the Royal Society B (13 November 2019, https://doi.org/10.1098/rspb.2019.2208), found that for 4.4-million-year-old Ardipithecus, the internal carotid artery blood flow was less than 1 cubic centimeter per second, or about one third that of a modern chimpanzee. That suggests it wasn’t too bright. What about Australopithecus? Although Australopithecus had a brain bigger than a chimp’s, and about the size of a gorilla’s (despite having a much lighter body), its brain had only two-thirds the carotid artery blood flow of a chimp’s brain, and half that of a gorilla’s. Seymour concludes that Australopithecus was probably less intelligent than a living chimpanzee or gorilla. How about Homo habilis? Its carotid artery blood flow was about the same as a modern chimpanzee’s, but less than a gorilla’s, at just under 3 cubic centimeters per second. For early Homo erectus, which appeared only 500,000 years after Homo habilis, it was about 4.5 cubic centimeters per second (compared to about 3.5 for a gorilla), while for late Homo erectus, it was about 6. Surprisingly, it was a little less than 6 for Heidelberg man, who’s widely considered to be the next species on the lineage leading to modern man. And for Neanderthal man and Homo sapiens, it was around 8 cubic centimeters per second, suggesting that the Neanderthals’ intelligence roughly matched ours.
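To make the trend easier to see, here are the flow figures just cited, gathered into a short script (values in cubic centimeters per second, rounded where the text says “less than” or “a little less than”):

```python
# Internal carotid blood-flow estimates quoted above (cm^3/s), with each
# value expressed relative to a modern chimpanzee (~3 cm^3/s, per the text).

CHIMP_FLOW = 3.0

flows = [
    ("Ardipithecus (4.4 Mya)",    1.0),  # "less than 1", ~1/3 of a chimp
    ("Australopithecus",          2.0),  # two-thirds of a chimp's flow
    ("Homo habilis",              3.0),  # "just under 3", roughly chimp-level
    ("Early Homo erectus",        4.5),
    ("Late Homo erectus",         6.0),
    ("Heidelberg man",            5.9),  # "a little less than 6"
    ("Neanderthals / H. sapiens", 8.0),
]

for species, q in flows:
    print(f"{species:<28} {q:>4.1f} cm^3/s  ({q / CHIMP_FLOW:.1f}x chimp)")
```

The point to notice is the steady climb: each species sits a little above its predecessor, with no overnight jump anywhere in the sequence.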

Additionally, in a 2016 paper titled, “From Australopithecus to Homo: the transition that wasn’t” (Philosophical Transactions of the Royal Society B vol. 371, issue 1698, https://doi.org/10.1098/rstb.2015.0248), authors William H. Kimbel and Brian Villmoare take aim at the idea that the transition from Australopithecus to Homo was a momentous one, arguing instead that “the expanded brain size, human-like wrist and hand anatomy [97,98], dietary eclecticism [99] and potential tool-making capabilities of ‘generalized’ australopiths root the Homo lineage in ancient hominin adaptive trends, suggesting that the ‘transition’ from Australopithecus to Homo may not have been that much of a transition at all.” In Figure 5 of their article, the authors graph early hominin brain sizes (endocranial volumes, or ECVs) over time, from 3.2 to 1.0 million years ago, for various specimens of Australopithecus (labeled as A), early Homo (labeled H) and Homo erectus (labeled E). From the graph, it can be readily seen that there is a considerable overlap in brain size between Homo erectus and early Homo, shattering the myth of a quantum leap between the two species. Kimbel and Villmoare add that “brain size in early Homo is highly variable—even within fairly narrow time bands—with some early H. erectus crania (e.g. D4500) falling into the Australopithecus range” and conclude that “a major ‘grade-level’ leap in brain size with the advent of H. erectus is probably illusory.”

Finally, a 2018 article by Andrew Du et al. titled “Pattern and process in hominin brain size evolution are scale-dependent” (Proceedings of the Royal Society B 285:20172738, http://doi.org/10.1098/rspb.2017.2738) provides clinching evidence against any sudden spurts in brain size. In the article, the authors make use of endocranial volume (ECV), which they describe as “a reliable proxy for brain size in fossils.” Looking at hominins overall, they find that “the dominant signal is consistent with a gradual increase in brain size,” adding that this gradual trend “appears to have been generated primarily by processes operating within hypothesized lineages,” rather than at the times when new species emerged. Du et al. considered various possible models of ECV change over time for hominins, including random walk, gradualism, stasis, punctuated equilibrium, stasis combined with random walk, and stasis combined with gradualism. What they found was that gradualism best explained the observed trends.
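For readers who want to see what this kind of model comparison involves, here is a miniature version in Python. To be clear, this is not Du et al.’s analysis (they fitted formal evolutionary-mode models to real fossil ECVs); it simply illustrates the general recipe, fitting a “stasis” model and a “gradualism” model to fabricated ECV-versus-time data and scoring them with AIC:

```python
import math
import random

# Miniature evolutionary-mode model comparison (illustration only, NOT
# Du et al.'s analysis). We fit "stasis" (constant mean) and "gradualism"
# (linear trend) to ECV-vs-time data and compare AIC scores.
# The data are fabricated: a gentle upward trend plus noise.

random.seed(0)
ages = [3.5 - 0.1 * i for i in range(30)]            # millions of years ago
ecvs = [450 + 120 * (3.5 - t) + random.gauss(0, 40)  # fabricated ECVs in cc
        for t in ages]

def gaussian_aic(residuals, k_params):
    """AIC for a Gaussian model: 2k - 2*logL, with MLE variance (+1 param)."""
    n = len(residuals)
    sigma2 = sum(r * r for r in residuals) / n
    log_lik = -0.5 * n * (math.log(2 * math.pi * sigma2) + 1)
    return 2 * (k_params + 1) - 2 * log_lik

# Stasis: ECV fluctuates around a constant mean (1 parameter).
mean_ecv = sum(ecvs) / len(ecvs)
aic_stasis = gaussian_aic([v - mean_ecv for v in ecvs], k_params=1)

# Gradualism: ECV changes linearly with time (2 parameters, fitted by OLS).
n = len(ages)
mean_age = sum(ages) / n
slope = (sum((a - mean_age) * (v - mean_ecv) for a, v in zip(ages, ecvs))
         / sum((a - mean_age) ** 2 for a in ages))
intercept = mean_ecv - slope * mean_age
aic_gradual = gaussian_aic([v - (intercept + slope * a)
                            for a, v in zip(ages, ecvs)], k_params=2)

print(f"AIC, stasis:     {aic_stasis:.1f}")
print(f"AIC, gradualism: {aic_gradual:.1f}  (lower is better)")
```

On data with a genuine trend, the gradualism model wins decisively; Du et al. report the analogous result for the real hominin ECV record.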

Readers who would like more details can find out more here.

In the light of these findings, Christian apologists need to squarely address the question: “Where do you draw the line between true human beings and their bestial forebears?” The fact is, there isn’t a good place to draw one. If you want to say that only Neanderthals and Homo sapiens were truly human, then an awkward consequence follows: their common ancestor, Heidelberg man, wasn’t human, which means that God created two distinct races of intelligent beings – or three, if you include Denisovan man, another descendant of Heidelberg man. (Currently, we don’t have any complete skulls of Denisovan man.) Two or three races of intelligent beings? That doesn’t comport with what the Bible teaches or with what the Christian Church has taught down the ages: only one race of beings (human beings) was made in God’s image (see for instance Genesis 3:20, Malachi 2:10 and Acts 17:26).

If you insist that Heidelberg man must have been human as well, then you also have to include late Homo erectus, whose brain had a metabolic rate roughly equal to that of Heidelberg man. But if you are willing to concede that late Homo erectus was truly human, then why not early Homo erectus, who belonged to the same species, after all? However, if you include early Homo erectus within your definition of “truly human,” then you have to address the question: why are you setting the bar so low, by including a species that was not much smarter than a gorilla when it first appeared, used only pebble tools for the first 200,000 years of its existence (from 1.9 to 1.7 million years ago), and only gradually became smarter, over a period of one and a half million years?

Having shown that the anthropological evidence seems to favor a gradual increase in human intelligence, I’d now like to address the second reason why scientists are highly skeptical of the hypothesis that there was a “magic moment” at which our ancestors became human, in the true sense of the word: namely, that the archaeological evidence seems to tell a different story. When scientists examine the archaeological record for signs of the emergence of creatures with uniquely human abilities, what they find is that there are no fewer than ten distinct abilities that could be used to draw the line between humans and their pre-human forebears. However, it turns out that these ten abilities emerged at different points in time, and what’s more, most of them emerged gradually, leaving no room for a “magic moment” when the first true human beings appeared. I refer to this as the “Ten Adams” problem. [Please note that the “ten Adams” whom I refer to below are purely hypothetical figures, intended to designate the inventors (whoever they were) of ten cultural breakthroughs that changed our lives as human beings.]

I’m going to give each of these Adams a special name: first, Acheulean Adam, the maker of Acheulean hand-axes and other Mode II tools; second, Fire-controller Adam, who was able not only to make opportunistic use of the power of fire, but also to control it; third, Aesthetic Adam, who was capable of fashioning elegantly symmetrical and finely finished tools; fourth, Geometrical Adam, who carved abstract geometrical patterns on pieces of shell; fifth, Spear-maker Adam, who hunted big game with stone-tipped spears – a feat which required co-operative, strategic planning; sixth, Craftsman Adam, who was capable of fashioning a wide variety of tools, known as Mode III tools, using highly refined techniques; seventh, Modern or Symbolic Adam, who was capable of abstract thinking, long-range planning and behavioral innovation, and who decorated himself with jewelry, an indication of symbolic behavior; eighth, Linguistic Adam, the first to use human language; ninth, Ethical Adam, the first hominin to display genuine altruism; and tenth, Religious Adam, the first to worship a Reality higher than himself.

Here’s a short summary of my findings, in tabular form. [Note for the benefit of my non-scientist readers: hominins are defined as creatures such as Ardipithecus, Australopithecus, Paranthropus and Homo, which belong to the lineage of primates that broke away from the chimpanzee line and led eventually to modern humans, although many branches died out along the way, without leaving any descendants.]

The TEN ADAMS

| Which Adam? | Which species exhibited this ability first? | When? |
| --- | --- | --- |
| 1. Acheulean Adam, the maker of Acheulean hand-axes | Homo ergaster (Africa), Homo erectus (Eurasia). (Hand-axes were later used by Heidelberg man and even early Homo sapiens.) | 1.76 million years ago in Africa; over 350,000 years later in Eurasia. By 1 million years ago, the shape and size of the tools were carefully planned, with a specific goal in mind. [N.B. Recently, a study using brain-imaging techniques has shown that hominins were probably taught how to make Acheulean hand-axes by non-verbal instructions, rather than by using language.] |
| 2. Fire-controller Adam, the first hominin to control fire | Homo ergaster (Africa), Homo erectus (Eurasia). | 1 million years ago (control of fire; opportunistic use of fire goes back 1.5 million years); 800,000 to 400,000 years ago (ability to control fire on a regular and habitual basis; later in Europe). Date unknown for the ability to manufacture fire, but possibly less than 100,000 years ago, as the Neanderthals evidently lacked this capacity. |
| 3. Aesthetic Adam, the first to make undeniably aesthetic objects | Late Homo ergaster/erectus. | 750,000-800,000 years ago (first elegantly symmetric hand-axes; sporadic); 500,000 years ago (production of aesthetic hand-axes on a regular basis). |
| 4. Geometrical Adam, maker of the first geometrical designs | Late Homo erectus. | 540,000 years ago (zigzags); 350,000-400,000 years ago (parallel and radiating lines); 290,000 years ago (cupules, or circular cup marks carved in rocks); 70,000-100,000 years ago (cross-hatched designs). |
| 5. Spear-maker Adam, the maker of stone-tipped spears used to hunt big game | Heidelberg man. | 500,000 years ago (first stone-tipped spears; wooden spears are at least as old, if not older); 300,000 years ago (projectile stone-tipped spears, which could be thrown); 70,000 years ago (compound adhesives used for hafting stone tips onto a wooden shaft). |
| 6. Craftsman Adam, the maker of Mode III tools requiring highly refined techniques to manufacture | Heidelberg man (first appearance); Homo sapiens and Neanderthal man (production on a regular basis). | 500,000-615,000 years ago (first appearance; sporadic); 320,000 years ago (production on a regular basis). |
| 7. Modern or Symbolic Adam, the first hominin to engage in either modern human behavior (broadly defined), or more narrowly, symbolic behavior | Homo sapiens and Neanderthal man (modern human behavior, broadly defined); Homo sapiens and Neanderthal man (symbolic behavior, in the narrow sense). | 300,000 years ago (modern human behavior – i.e. abstract thinking; planning depth; behavioral, economic and technological innovativeness; and possibly, symbolic cognition); 130,000 years ago (symbolic behavior, in the narrow sense). (Note: the pace of technical and cultural innovation appears to have picked up between 40,000 and 50,000 years ago, probably for demographic reasons: an increase in the population increased the flow of ideas.) |
| 8. Linguistic Adam, the first hominin to use language, whether broadly defined as the ability to make an infinite number of sentences, or more narrowly, as hierarchical syntactical structure | Heidelberg man(?), Homo sapiens and Neanderthal man (language in the broad sense); Homo sapiens (language in the narrow sense). | 500,000 years ago (language in the broad sense: sounds are assigned definite meanings, but words can be combined freely to make an infinite number of possible sentences); 70,000 to 200,000 years ago (language in the narrow sense: hierarchical syntactical structure). Note: however you define language, it almost certainly did not appear overnight (see here and here). |
| 9. Ethical Adam, the first hominin to display genuine altruism (long-term care of sick individuals) and self-sacrifice for the good of the group | Homo ergaster (altruism); late Homo ergaster/erectus or Heidelberg man (self-sacrifice). | Altruism: 1,500,000 years ago (long-term care of seriously ill adults); at least 500,000 years ago (care for children with serious congenital abnormalities). Self-sacrifice for the good of the group: up to 700,000 years ago. |
| 10. Religious Adam, the first hominin to have a belief in the supernatural | Homo sapiens. | 90,000 to 35,000 years ago (belief in an after-life); 35,000 to 11,000 years ago (worship of gods and goddesses). (N.B. As these ideas and beliefs are found in virtually all human societies, they must presumably go back at least 70,000 years, when the last wave of Homo sapiens left Africa.) |
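One way to feel the force of the “Ten Adams” problem is simply to line up the earliest date in each row of the table. The sketch below encodes those dates (in years before present, taking the older bound wherever the table gives a range) and prints the spread:

```python
# Earliest attested date for each of the "Ten Adams" (years before present),
# taken from the table above; older bound used where the table gives a range.

ten_adams = {
    "Acheulean Adam":       1_760_000,
    "Fire-controller Adam": 1_000_000,  # control; opportunistic use is older
    "Aesthetic Adam":         800_000,
    "Geometrical Adam":       540_000,
    "Spear-maker Adam":       500_000,
    "Craftsman Adam":         615_000,
    "Symbolic Adam":          300_000,
    "Linguistic Adam":        500_000,  # language in the broad sense
    "Ethical Adam":         1_500_000,  # long-term care of the sick
    "Religious Adam":          90_000,
}

for name, date in sorted(ten_adams.items(), key=lambda kv: -kv[1]):
    print(f"{name:<22} {date:>9,} BP")

spread = max(ten_adams.values()) - min(ten_adams.values())
print(f"\nSpread between first and last 'Adam': {spread:,} years")
```

Nearly 1.7 million years separate the first Adam from the last – which is precisely why no single one of them can serve as the “magic moment.”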

Readers who would like more details are welcome to go here.

To sum up: whether we look at the evidence from prehistoric human brains, or the tools and implements made by prehistoric people, or the origin of language, or the appearance of religion and morality, one conclusion seems inescapable: there was no magic moment at which humans first appeared. And if that’s right, the Christian doctrine of the Fall of our first parents (Adam and Eve) goes out the window, because we didn’t have a set of first parents, or even a first human community of (say) 10,000 people, from whom we are all descended. And without the Fall, what becomes of the doctrine of the Redemption?

Finally, recent attempts by Christian theologians to evade the force of this problem by redefining the “image of God” not in terms of our uniquely human abilities (the substantive view of the Divine image), but in terms of our being in an active relationship with God (the relational view) or our special calling to rule over God’s creation (the vocational or functional view), completely fail, because neither our relationship with God nor our dominion over other creatures defines what it means to be human: they are both consequences of being human. What’s more, humans only possess these qualities because of underlying abilities which make them possible (e.g. the ability to form a concept of God, or the ability to invent technologies that give us control over nature), so we are back with the substantive view, which, as we have seen, supports gradualism. The verdict is clear: contrary to Christian teaching, no sharp line can be drawn between man and beast, which means that the Biblical story of a first human being is a myth, with no foundation in reality. And if there was no first Adam of any kind, then the Christian portrayal of Jesus as the second Adam (1 Corinthians 15:45-49) makes no sense, either.

Human origins therefore pose a potentially fatal problem to the credibility of Christianity – a problem to which Christian apologists have paid insufficient attention. They ignore it at their peril.

Postscript: Was Adam an apatheist?

In a recent panel discussion at the Human Origins Workshop 2020, hosted by Dr. Anjeanette Roberts and attended by Dr. William Lane Craig, Dr. Steve Schaffner, Dr. Nathan Lents and Dr. Fazale Rana, Dr. Craig cited a paper by Dr. Sally McBrearty and Dr. Alison Brooks arguing that modern human behavior did not suddenly emerge 40,000 to 50,000 years ago, but actually goes back to the African Middle Stone Age, around 300,000 years ago. Dr. Craig urged Dr. Rana to acknowledge this fact. I was pleased to see that Dr. Craig is au fait with the literature on the subject, and I personally agree with his view that truly human beings have existed for hundreds of thousands of years. But I would also point out that, as far as I have been able to ascertain, religion (as opposed to mere ritual behavior) goes back no more than 100,000 years. What that means is that if the first true humans emerged at least 300,000 years ago, then for more than two-thirds of human history, humans have been areligious – that is, devoid of any belief, disbelief or even interest in the supernatural. People like this are sometimes called apatheists.

There’s a beautiful passage in Chapter XIX of Huckleberry Finn, where Huck describes the joys of living on a raft and floating down the Mississippi, with Jim, and discusses whether the stars were made or “just happened”:

“We had the sky up there, all speckled with stars, and we used to lay on our backs and look up at them, and discuss about whether they was made or only just happened. Jim he allowed they was made, but I allowed they happened; I judged it would have took too long to make so many. Jim said the moon could a laid them; well, that looked kind of reasonable, so I didn’t say nothing against it, because I’ve seen a frog lay most as many, so of course it could be done. We used to watch the stars that fell, too, and see them streak down. Jim allowed they’d got spoiled and was hove out of the nest.”

It would appear that the earliest true humans were not capable of a conversation like that. Humans were behaviorally modern, as well as being moral, long before they were religious.


I’ll point WLC to this. Do you have any specific questions for him?

I like your table…


Thanks, @swamidass. I imagine Dr. Craig may contest my claim that religion emerged only within the last 100,000 years, but if he has any questions, he’s welcome to read this section of my longer post over at The Skeptical Zone, which covers the evidence from places like Gough’s Cave (England), Bilzingsleben (Germany) and the Bruniquel Cave (France) and concludes that none of these sites provides good evidence for the existence of religion as such. In addition, the fact that there are no ritual human burials older than 92,000 years suggests that religion is relatively new.

What I’d like to ask Dr. Craig is this: in the light of the evidence presented, does he agree that the overall picture from anthropology and from the archaeological record appears to support gradualism, and that it now looks unlikely that there was a first generation of truly human beings? I’d also like to hear his thoughts on the emergence of language and ethical behavior, which I discuss here and here in my longer post. Cheers.


@vjtorley

While being bald is not likely to have anything to do with metaphysical criteria like salvation or sanctification, if God has set these elements down as having some fixed and crucial values, then it is pretty obvious that only God is in a position to decide EXACTLY when a criterion is achieved or not.

One might interpret the fact that there are no extant “close primate cousins” of humans as helping human judgment to embrace whatever criterion we find important in understanding the creation of humanity.

@vjtorley this thread may be interesting to you, on precisely the same question: Was there a "first French speaker"?.


And rereading that thread, I don’t think that my thinking on “sharp” vs. “fuzzy” distinctions has evolved much since. The vast majority of the counterarguments on that thread missed the point. I’m not completely sure of my proposal either, and will withhold further judgment until I read up more on the controversy around essentialism.

What I would like to note, though, is similar to what Josh said: theology was formulated in a time when some form of essentialism was philosophically more influential. Thus, it easily talks about the distinction between humans and non-humans, and concludes that the origins of the former must have occurred “instantaneously”.

In contrast, as modern scientists we tend to prefer speaking in shades of gray - confidence intervals and probabilities, not binary true and false statements. We’ve also gathered a lot of new empirical data about “fuzzy” cases regarding many different objects that we thought could not exist, such as ring species and intelligent hominids. Thus, the idea of drawing clear, binary distinctions between things is less appealing. In fact, very few concepts that we use in day-to-day talk - such as “human” - can survive scrutiny once we apply a rigorous philosophical and scientific analysis to them. Unsurprisingly, scientists conclude that these concepts are just “vague”.

For example, the word “scientist” is equally vague. What defines a scientist? Was there a “first scientist”? When was the moment I transitioned from a regular person to a scientist? Similarly, what defines “Christianity”, given the diversity of beliefs and practices of groups claiming that name? Is it just reduced to a social convention and a means of self-identification?

At the same time, we have to understand why traditional theology, with its “simplistic”, “outdated” distinctions still holds sway in many communities: because it accords with a lot of common experience. Most people are not scientists who are aware of the fuzzy boundary cases. People who are serious about engaging in productive dialogue between science and theology should be aware of this “cultural” difference.


Hi @dga471 and @swamidass,

I’ve been having a quick look at the “first French speaker” thread. A few points:

  1. “French” is not a natural category, but a conventional one. Therefore, one would hardly expect to find a clear-cut boundary between French and Latin. Likewise, the term “scientist” is a conventional one: indeed, it wasn’t even used until 1834, when the philosopher of science William Whewell coined it. (Before then, “natural philosopher” was the term in vogue.)

  2. “Human,” on the other hand, is supposed to denote a natural category, like “gold,” “water,” “living thing,” “oak tree,” “puffer-fish” and “chimpanzee.” In chemistry, the boundaries between different elements are very clear-cut: gold atoms have 79 protons, while mercury atoms have 80. Different molecular compounds can be easily identified by spectroscopic analysis. Again, the boundaries are sharp. Different species of organisms are reproductively isolated from one another (using the biological concept of a species): that is, they cannot inter-breed with other species and produce fertile offspring. And the differences between humans and chimpanzees are strikingly obvious: no-one would ever confuse the two.

  3. This invites the question: if, at any given point in time, the boundaries between species (the nearest thing to biological “natural kinds”) are so stark, then why shouldn’t we expect to find stark boundaries between modern organisms and their progenitors, if essentialism is true? The reply that there might still be a hard-and-fast ontological distinction between natural kind A and natural kind B, even if we cannot detect it empirically, completely misses the point. For it is precisely because we can recognize clear-cut distinctions in the natural world that we were driven to postulate essences in the first place. If there were continua and vague boundaries everywhere, there would be no reason to postulate essences.

  4. The feature that distinguishes human beings from other animals, according to that arch-essentialist, Aristotle, is not an anatomical feature, but a very general one: the capacity to reason. This capacity is what separates humans (who can grasp universal concepts which apply to natural kinds, as well as man-made general categories) from the beasts, who can grasp only particulars. Human knowledge is of a fundamentally different kind from animal knowledge: it’s like comparing 3-D with 2-D vision. The difference is huge. So yes, one should expect to find a “vast gulf” between the technical, scientific, artistic and linguistic capabilities of the most primitive humans and those of the most advanced apes or ape-men. And we do indeed find that “vast gulf,” if we examine only currently existing species. When we look back in the past, however, we find no sharp boundary, no matter which set of human capacities we look at.

  5. While Catholic theologians have tended to follow Aristotle’s definition of man as a rational animal, there have been Christian thinkers (e.g. John Wesley) who defined our humanity in terms of our capacity to know God or to know right from wrong. However, my investigations have shown that whether we look at moral behavior (altruistic care of the weak, self-sacrifice) or religious behavior (ritual burials, worship of supernatural beings), we find no sign of an overnight transformation in the hominin line, as we would expect to find if there was some point in history at which our ancestors knew God (or knew right from wrong) for the first time. On the contrary, we find signs of a slow, patchy transformation that took tens of thousands of years. What’s more, we find that humans were moral (in terms of the kindness they displayed to others in need and the sacrifices they made for the sake of others) hundreds of thousands of years before they showed any signs of believing in the supernatural: 500,000 B.P. (at the latest) for moral behavior vs. 100,000 B.P. (at the earliest) for religious behavior. This is definitely not what someone with an essentialist understanding of human nature would have predicted, and Christians should find it perplexing, and even troubling. Certainly I am perplexed.

@gbrooks9,

You write that “only God is in a position to decide EXACTLY when a criterion is achieved or not.” I think that’s only true when the criterion is purely a matter of “deeming” on God’s part - a matter of Divine fiat, if you like. However, “humanity” cannot be such a criterion. For part of being human is being able to defy the wishes of one’s Creator. And that ability cannot be “deemed” into existence by God: it’s either there in a creature or it’s not. At 500,000 B.P., for instance, hominins either were or were not capable of sinning against their Creator. Such a capacity, if it existed, belonged to these hominins by nature, not by fiat. Cheers.

In practice, that’s far from a sharp boundary. Even if we restrict ourselves to extant populations, there exist all degrees of reproductive isolation between species or partial species, from weak selection against hybrids to failure of initial fertilization. The species category is yet another thing that lacks a sharp boundary. Of course, as you say, extinction of intermediate populations may make a boundary sharp. But even in that case, I don’t think the experiment your criterion would demand has ever been done with humans and chimps.

…to the degree that the capacity actually leaves preservable evidence behind, which leaves out most.


I think we need to distinguish between capacities, behaviour resulting from the exercise of those capacities, and the evidence that we have for that behaviour. Maybe it is possible that, at some point in the past, God endowed a group of hominids with the capacity to reason, or the capacity to have a relationship with God, and that this immediately produced a vast difference in their internal mental experience and made them full persons made in God’s image. Would this difference in internal mental life immediately have produced the kind of behaviour that would leave archaeological evidence? I’m not entirely certain that it would. For example, maybe things like language and the technological and cultural advances enabled by our capacity to reason developed gradually, even though the capacity to reason was bestowed instantaneously. Or for another example, maybe those first image bearers began to pray to God, without engaging in ritual activities that leave any trace until later, and such archaeologically-accessible ritual behaviour developed gradually even though the fundamental capacity to relate to God did not undergo any development.


There are no natural categories. There are only conventional categories and private categories.

That’s my unconventional opinion.

Hi @nwrickert,

There are no natural categories. There are only conventional categories and private categories.

Seriously? The category of “elements” is a purely conventional one? The distinction between the particles in the Standard Model is a conventional one? I can at least understand someone denying the existence of natural biological categories (but see the quote from Jerry Coyne below). However, when it comes to physics and chemistry, the denial of natural categories appears to fly in the face of overwhelming evidence that these categories are mind-independent.

Hi @John_Harshman,

In practice, that’s far from a sharp boundary.

You’re a phylogeneticist, so I’m going to have to accept your statement that the boundary between biological species is not a hard-and-fast one. However, evolutionary biologist Jerry Coyne, who is an expert on speciation, has this to say about the biological species concept (BSC) in a recent blog article:

The BSC is not really a definition, but, as I emphasize in my book Speciation written with Allen Orr—an attempt to encapsulate in words the palpable lumpiness in nature that we see before us. And nature, at least in sexually-reproducing species, really is lumpy: it’s not the continuum, or “great interconnected web”, that Taylor sees. In Chapter 1 of Speciation , I give three lines of evidence for the reality of species: they aren’t just artificial constructs, or subjective human divisions of a continuum, but real entities in nature. Yes, there is some blurring in both sexual and asexual organisms, but by and large species exist as “lumps” in the pudding of Nature. If this were not so, biologists would be wasting their time studying species, and field guides would be of no use. There is no blurring, for instance, between our species, chimpanzees, and orangutans, nor between starlings, hawks, and robins on my campus. And so it goes for most of nature. Some hybrids may be formed between species, but they are often sterile or inviable, and so don’t blur the boundaries between groups.

Would you agree with this assessment?

Finally, I accept your observation that many human capacities don’t leave preservable evidence behind, but those that do include some fairly major ones (ethics, language, art, religion), and they appear to support gradualism.

Hi @structureoftruth,

Maybe it is possible that, at some point in the past, God endowed a group of hominids with the capacity to reason, or the capacity to have a relationship with God, and that this immediately produced a vast difference in their internal mental experience and made them full persons made in God’s image. Would this difference in internal mental life immediately have produced the kind of behaviour that would leave archeological evidence? I’m not entirely certain that it would… Or for another example, maybe those first image bearers began to pray to God, without engaging in ritual activities that leave any trace until later, and such archeologically-accessible ritual behaviour developed gradually even though the fundamental capacity to relate to God did not undergo any development.

Well, I’ll certainly grant that you might be right. But the notion that our ancestors were praying (and presumably believing in an after-life as well) for some 400,000 years, without even once performing a ritual burial for dead family members, is pretty mind-boggling.

For example, maybe things like language and the technological and cultural advances enabled by our capacity to reason developed gradually, even though the capacity to reason was bestowed instantaneously.

If we look around the world today, we don’t find any primitive languages, even in technologically backward societies. All languages are roughly equal in terms of their complexity. But we have good scientific evidence that human language didn’t appear overnight: it evolved gradually. You suggest that our human capacity to reason may have been fully developed even at a time when human language was still evolving, but it is hard to see how advanced reasoning would be possible without a well-developed language. The two seem to be intertwined.

I guess the point I’m trying to make is that while the archaeological evidence can (with difficulty) be reconciled with Christianity, it certainly doesn’t point that way. The gradualistic pattern we find in the archaeological record, even when it comes to language, art, ethics and religion, is the very last thing that Christian theologians a few decades ago would have predicted. And the fact that they all emerge at different times is even stranger.

In the main. But you must realize that I work on ducks, in which hybridization between species is much more common than in Jerry’s chosen taxon, Drosophila. There are many groups in which the species, though clear enough in the main, have a lesser degree of isolation than Jerry prefers. He makes a distinction between “species” and “good species”, the latter being the ones that form no fertile hybrids. There are huge numbers of species that don’t match the “good” criterion. Haldane’s rule, for example, relies on that.

I don’t think any of those other than art leaves significant evidence behind. Language ability (though not language) might leave genetic evidence if we can match genetics to phenotype adequately.


You are echoing Quine’s metaphysical desert.

20 posts were split to a new topic: Are there “natural kinds”?

Good ol’ vjtorley and his walls of text. I must say he taught me how to skim pretty well back when he posted regularly at UD.

So, I realize that Catholic hylomorphism might be a problem here, but suppose substance dualism, and suppose that at some point in history God takes a hominid at a mostly arbitrary point in physical development (but one still meeting the minimum requirements) and endows him with a spirit. Might that not be the first man, and might that not have essentially nothing to do with your wall of text?


That is exactly what @jongarvey might say, @BenKissling.

The problem with that is figuring out what effect having a spirit would have. Apparently none that’s noticeable by other people, including those with and without.

Or a special calling. But vocation isn’t visible in anatomy, so that’s hardly a knock against it.

Is it having a spirit that produces a special calling and lacking one that prevents it? Otherwise, you’ve changed the subject.