The Semiotic Argument Against Naturalism

Honestly, I’ve no idea about the naturalistic likelihood of abiogenesis.

However, with regard to the evolution and development of increasingly complex life, I don’t see much traction in the argument. Mechanisms of genetic variation and selection can lead to ‘increases’ in genome size, functional differentiation and the acquisition of new functionality. For almost any definition/metric of ‘information’, whether specific to information science or even colloquial, examples of ‘information increase’ appear in the lab and have been found in nature. Additionally, comparative biology & genetics further suggest that organisms can be connected by relatively small or mechanistically feasible steps.

1 Like

Does Dembski actually say that DNA is “semantic information”? I know Meyer says this.

If one defines “semantic information” as ‘subjectively meaningful information that is conveyed syntactically (as a string of phonemes or characters) and that is understood by a conscious agent,’ then clearly the information in DNA does not qualify as semantic. Indeed, unlike a written or spoken natural language, DNA does not convey “meaning” to a conscious agent.

Do you agree with that?

1 Like

I’m sorry you feel that way. I’m doing my best to keep up. As I wrote…

And I invited you:

You start to do this here.

That was my main point. I’m glad we are starting to get on the same page. Perhaps you could try to restate the argument in a way that clarifies (1) precisely what type of information you are using, and (2) where you are deviating from Lennox. Some terms and distinctions that might prove helpful to make your point clear:

  1. Are you arguing against evolution or abiogenesis?
  2. At every use of “information,” which of the following do you mean: semantic, functional, syntactic, or entropic information?
  3. At every use of “compressibility,” do you mean empirical compressibility (that is measured) or theoretical compressibility (that is not knowable)?

If you restate your argument now, perhaps we can make some headway.


You did seem to argue for this based on incompressibility. Point #2:


Now, regarding Lennox, this is an important twist.

His book is unclear, but he insists he is not arguing against evolution. He is merely arguing against abiogenesis. So you are taking his argument in a way he never intended. At least that is what he told me.

If this is about abiogenesis, it might work, but we do not know what the first life looked like. How do we quantify the amount of information it had?


Please clarify which questions you think are critical for me to answer. I’ve done my best to keep up. It was not my intention to pass over something critical. Also, please clarify what you mean by these questions. If you do not clarify what type of “information” you mean, we are going to keep going in circles.

1 Like

That is exactly my point…

Where did @Jonathan_Burke get that quote from? Is there a reference?

1 Like

It’s here.

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.113.2289&rep=rep1&type=pdf

1 Like

Nor have I seen an instance like this.

Let’s try to hold off on the appeals to authority (i.e. degrees) and the ad hominems. Who is making the argument is not really relevant. The key question, rather, is the content of the argument.

This was in response to @jongarvey and is actually central to my point. Science has limits. One such limit is that we do not know how to distinguish and measure types of information (semantic, functional, and noise) when they are mixed together.

No actually, I really do. I’ve spent enough time in academia reading Master’s dissertations and PhD theses to know that not all are equal. If I have to appoint someone in my company, I won’t do it on the basis of their MS, but on the basis of their capabilities. I’m not saying yours is invalid, I’m just saying that it is not an argument.

Yeah I thought that too. But then, I’m quoting their arguments, not relying on their degrees.

My personal agenda is that I never use ID arguments. I thought that this was fun, because as Lennox puts it, if it works it is a positive proof against evolutionary theory (he tends to jump between that and abiogenesis).

I’m a bit more sympathetic in that case. Biologists use philosophy without realising it.

Ah, you don’t know me very well :slight_smile:

This is a bit uncharitable. I said your MS isn’t an argument. Focus on the arguments. Which you did (save the odd jab), so thank you for that.

Lennox does state this explicitly, actually.

Ok, I’m going to leave it at that.

Please do leave that rabbit trail alone.

The invitation still stands though…

And about your quotes, remember none of them are claiming “semantic information”, near as I can tell.

I’m going to do that as soon as I can.

1 Like

Yes you are right. He jumps between evolution, abiogenesis and naturalism. It’s not terribly clear in his text, but he does not claim this is a good argument against evolution. He thinks (it seems) that this is more about abiogenesis. Though I am loath to appeal to a private conversation like this. His text is not really clear on this point, and you would not be the only one confused by his intent in that chapter.

To some extent, if you think it is a solid argument, you are going to have to lay it out even more clearly than Lennox did. Once again, the prominence of “compressibility” here is really the heart of my critique of your blog. So if you are backing away from that, then perhaps I’ve made my point.

Remember, ultimately, we agree. I do not think naturalism is true. I just think there are so many good arguments against it that holding on to bad arguments just weakens the case.

Having read some number of proofs for or against some number of metaphysical propositions, I’m a bit skeptical about things titled along the lines of ‘The “X” argument against (Naturalism/God)’. OK perhaps as apologetics, but as definitive proofs, they’ve had a history of falling short. [Perhaps it’s another point in favor of my ‘Ironic Designer’ hypothesis :grin:]

1 Like

First penned by Pascal of course.

Ok, so let’s see if I can piece this together:

1. Biological systems are information systems
This would only be interesting if the kind of information biological systems contain is not just random noise, but meaningful in some way. You seem to say that this meaningfulness needs to be subjective, i.e. to depend on humans. However, as I’ve come across it, it has to do with the specific arrangement of the letters in the words. They call it semantic information, which you take to mean subjective. But as far as I can tell, this need not be the case. To quote the SEP on semantic conceptions of information:

How I understand this is that semantic content does not need to be human-readable to contain information, just like computer programs don’t need to be human-readable to execute instructions. The readability is for us to interact with them. But assembly or code-golfing languages still contain this kind of semantic content.

You mention that DNA does not contain semantic information. How, then, would you describe the specific ordering of the ‘letters’ in DNA strings?

How would you define the difference between information and noise? Granted, in MTC there seems to be no difference, as noise is maximally entropic (even if only aleatorically, as you say). However, consider the string ~.@q: in the language J. This is a command (it calculates the unique prime factors of a number). Does it contain more or less information (in any sense) than, say, :q@.~ (the same characters reversed, which probably produces nothing)? I’m guessing you would say ‘no’, while Lennox and others would say ‘yes’?
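To make the comparison concrete, here is a minimal sketch (my own, in Python with zlib, not anything Lennox or anyone in this thread has proposed) showing that a generic compressor assigns exactly the same size to the working J command and to its reversal. Compressed length is blind to whether a string does anything.

```python
# Hypothetical illustration: a generic compressor cannot tell the working
# J program from its reversed (presumably meaningless) counterpart.
import zlib

program = b"~.@q:"           # valid J: unique prime factors of a number
reversed_program = b":q@.~"  # the same characters, reversed

print(len(zlib.compress(program)), len(zlib.compress(reversed_program)))
# Both compressed outputs are the same length (and, for strings this short,
# longer than the originals because of the compressor's fixed overhead).
```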

2. Algorithms that produce incompressible pieces of information have to themselves be more complex, or receive a more complex input of information, than that which they produce, and therefore do not produce new information.

(I swopped the premises around, hopefully it isn’t cheating, but it makes more sense to me this way)

The information here refers to semantic content, I guess. Otherwise a whole bunch of experts missed the obvious noise-algorithm thing.

Peter Medawar:

Or Leonard Brillouin:

And then Bernd-Olaf Kuppers:

This is also where Gregory Chaitin’s work on algorithms compressing semantic information is invoked.

3. The information in the DNA molecule is algorithmically incompressible
One could compress against a reference genome, but then you have an information-rich algorithm, which contradicts 2.
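To illustrate what “using a reference” does, here is a rough sketch, assuming zlib’s preset-dictionary feature can stand in for a real reference-based compressor (the sequences are made up): a sample that differs from the reference at only a few sites compresses to almost nothing, but only because the reference itself is carrying the bulk of the information.

```python
# Hypothetical sequences; zlib's preset dictionary plays the role of a
# reference genome for a generic compressor.
import random
import zlib

random.seed(1)
reference = "".join(random.choice("ACGT") for _ in range(1000)).encode()

# A "sample" that differs from the reference at just a few positions.
sample = bytearray(reference)
for pos in (100, 400, 800):
    sample[pos] = ord("A") if sample[pos] != ord("A") else ord("C")
sample = bytes(sample)

plain = zlib.compress(sample, 9)

compressor = zlib.compressobj(level=9, zdict=reference)
against_reference = compressor.compress(sample) + compressor.flush()

print(len(plain), len(against_reference))
# Without the reference, the 1000-base sample compresses to roughly 2 bits
# per base; with the reference supplied, the output shrinks to a handful of
# bytes, because the information now sits in the reference rather than in
# the compressed output.
```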

Here I asked you what algorithm you would use to compress “This sentence contains semantic information”. I am really curious. I know that zip won’t work, but what will? What would it look like?
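For what it is worth, here is what an off-the-shelf compressor actually does with that sentence (a sketch, not a claim about what the “right” algorithm would be): zlib gains essentially nothing, because it can only exploit repeated substrings and a sentence this short contains few.

```python
# Off-the-shelf compression of the sentence in question.
import zlib

s = b"This sentence contains semantic information"
compressed = zlib.compress(s, 9)
print(len(s), len(compressed))
# The "compressed" output comes out no smaller than the original here; the
# format's header and checksum alone eat any gain on a string this short,
# which is one illustration of why generic compressors say so little about
# short strings.
```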

4. Such information-producing algorithms are not present in nature

I think this premise is redundant.

5. Therefore the algorithm of evolution by natural selection (or any other unguided process) cannot produce any new information, including that contained in the DNA molecule.

@Jonathan_Burke, I really appreciate you taking the time with your criticism, thank you. If you think I misunderstood any of it in the summary above, please point it out. I tried to straighten out my confusion of semantic and syntactic information, and how you use ‘entropy’ and ‘information’ in your field. Are there any bio-informatics textbooks that you (or anyone else) would recommend? I’m reasonably comfortable with statistical mathematics from my academic work, but more in the line of Bayes than MTC. I have some other spare-time academic reading to do for the next while, but will put it on my list.

This was my written-in-haste version of it. I wish I had more time to think about it, but I’m going away for 10 days with family, so I will read it again when I get back.

The first question I’m going to ask you before anything else is exactly what you think this has to do with evolution. What’s the conclusion we’re supposed to arrive at as a result of all this?

Thanks for giving this another shot, but this is not clear. You need to clarify which type of information and which type of compressibility you are referring to.

In reference to information, which of the following do you mean: semantic, functional, syntactic, or entropic information? Or something else?

It seems you mean “semantic information”. Is that correct?

Please specify here… Do you mean empirical compressibility (that is measured) or theoretical compressibility (that is not knowable)?


Your argument is not fully clarified, but we already know that semantic information is compressible. We also already know that neither empirical nor theoretical compressibility measures semantic information content.

I demonstrated this to you with my 3 strings. It is unknown (and unknowable without my revealing it) how much semantic information is in them, even though the empirical compression increases. Moreover, compression only gives an upper bound on information content. If you use the wrong algorithm, it will overestimate the compressed size.

To understand my critique, focus on the 3 strings. If you were right, it should be easy to compute, using compression, how much semantic information is in them. This turns out to be provably impossible.
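Since the three strings themselves are not reproduced in this post, here is the general shape of the point with made-up stand-ins: every compressor’s output length is just an upper bound on a string’s algorithmic information, different compressors give different bounds, and none of the numbers says anything about semantic content.

```python
# Made-up stand-in strings (not the three from earlier in the thread).
import bz2
import lzma
import os
import zlib

strings = {
    "repetitive": b"AB" * 500,                         # highly redundant
    "english":    b"the quick brown fox jumps " * 40,  # somewhat redundant
    "random":     os.urandom(1000),                    # essentially incompressible
}

for name, s in strings.items():
    sizes = {
        "zlib": len(zlib.compress(s, 9)),
        "bz2":  len(bz2.compress(s, 9)),
        "lzma": len(lzma.compress(s)),
    }
    print(name, len(s), sizes)

# Each number is only an upper bound on the string's algorithmic information;
# a poorly matched compressor simply gives a looser (larger) bound. Nothing
# in these figures distinguishes meaningful strings from meaningless ones.
```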

1 Like

Some assorted notes. There is some confusion with terminology and questions about system definitions.

A) ‘Information’ is a relative measure, not an absolute one. Further, it is always in reference to some context. For example, I can isolate and put in a bottle 1 kilobase stretches of DNA from E. coli, from a human, and from a random sequence. In the context of a bottle, they have the same ‘information’. We can measure their thermodynamic (e.g. entropic) equivalence, perhaps in a bomb calorimeter. Context matters, and a full description of any system must include the organism plus its environment.

B) No ‘information’ said to be contained in a particular entity can be assessed in the absence of the environment or context in which it exists or was transferred. For species, each generation retains some record of the influence of successive environments. This is information flow from the environment to the organisms.

C) We speak of DNA programs and such as ‘algorithms’. But what are the actual algorithms being considered with regard to abiogenesis or evolution? Is it the DNA sequence? Is it metabolism or the regulatory networks? Is it variation and selection in the context of a replicating system? What do we know about the capabilities of a particular algorithm? It seems to me this is one part that needs to be specified before we can assess anything.

D) From an evolutionary viewpoint and in the context of living organisms, I see how variation and selection can support the transfer of information from the environment into the genome and basic biology of organisms. When people ask “where does life get the information to adapt and even increase in complexity”, many biologists think, “the environment”. Bear in mind, the environment at the surface of a planet is not a wildly random, unordered, amorphous thing. It’s not a gaseous plasma. It is not the same as a random number generator. It’s actually quite structured, thanks to the basic laws of physics and chemistry. It has niches, gradients of many types (e.g. temporal, spatial) and transitions between local environments. For living organisms, the environment(s) provides the baseline against which variants are tested, leading to retention of information about the environment(s). This is a tremendous source of complexity. Additionally, the presence of other organisms must be considered part of the environment. Comparatively, I suspect the amount of ‘information’ or ‘complexity’ in the environment dwarfs the amount transferred and retained in organisms. This is certainly true thermodynamically, and so I would hesitate to assume it doesn’t apply in an information-theoretic sense as well. Some years ago I wrote: “If you’ve got enough spare energy to play Nintendo, you’ve got enough energy to evolve”.
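As one small illustration of point (A), here is a sketch of my own framing, using plain per-symbol Shannon entropy on made-up sequences: a perfectly periodic sequence and a random one with the same base composition both score about 2 bits per base, so the measure by itself carries none of the context that distinguishes them.

```python
# Per-symbol Shannon entropy of made-up nucleotide strings.
import math
import random
from collections import Counter

def entropy_bits_per_base(seq: str) -> float:
    """Shannon entropy, in bits per symbol, of a nucleotide string."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
random_seq = "".join(random.choice("ACGT") for _ in range(1000))
periodic_seq = "ACGT" * 250  # perfectly ordered, same base composition

print(entropy_bits_per_base(random_seq))    # close to 2.0
print(entropy_bits_per_base(periodic_seq))  # exactly 2.0

# The number cannot tell the ordered repeat from the random string, let
# alone an E. coli gene from either; the biological context has to come
# from the environment in which the sequence operates, as argued above.
```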

2 Likes

@Argon I"m in agreement with you, I think. On several points.

We cannot really deal with information in any system (let alone biological systems) without getting into all these details: providing context, discussing actual algorithms, specifying exactly what we are trying to measure and how, in addition to bounding our estimates. That is exactly how we are supposed to apply information theory.

However, keep in mind that we are responding to an argument that attempts to ignore all this, reduce a poorly defined version of “information” down to just the number of bits a compression program outputs on DNA, and claim that nature cannot produce this magical type of information. I say “magical” because it is simultaneously not found in nature and yet can be accurately measured by generic compression programs, without any knowledge of the system itself. The argument is, essentially, that we can ignore all the details of the system and still make confident statements about it.

That detail-free approach does not work. If the goal were to understand whether the proposed mechanisms of evolution or abiogenesis could generate the information we do measure in DNA, we would do things differently. We would start by modeling those proposed mechanisms, finding ways to instrument these models with predictions of what we could observe. And we would then test them.

However, the focus in this thread is this specific compression argument, not the larger, and more interesting, question you are getting at here. Perhaps we need a new thread.

1 Like

But I wasn’t using it as an argument. I wasn’t saying “I have a Masters degree, therefore you are wrong”.

You were very obviously relying on their degrees, which is why you kept citing their degrees and awards.

But you started this entire thread by using an ID argument.

No amount of philosophy can overturn scientific facts. As soon as people try to use philosophy to overturn something like evolution, it’s clear they are not doing science, and they are avoiding the science because they can’t disprove it. This is a form of intellectual dishonesty.

I didn’t see that in anything you quoted from him, but perhaps he has said it elsewhere.

Yes. That’s what I see as a ‘magic bullet’ approach, the idea that there is a critical weakness in a theory that can nullify it entirely. Perhaps such a weakness exists but it hasn’t been demonstrated yet.

There are other critical issues with some of the stated propositions about information transfer that don’t seem to have been addressed. However, if you’d like to stick with the “compression” argument in this thread, that’s no problem. Just one last comment: some of what I’m seeing seems related to the ‘no free lunch’ brouhaha of Dembski and Marks several years ago.

1 Like

Can you clarify what you are specifically pointing to, perhaps one at a time, on this thread or another as you see fit?