What is the ID Definition of Information?

There is no single definition. There are several contradictory formulations.

Here’s their taxonomy of information:

An example of incoherence is where they place semantic information in relation to DNA. That cannot be mathematically justified.


Sorry for any confusion. I am simply ignorant of what technical abilities exist for the forum software and wanted to make sure I was not missing any.

As was pointed out, it’s the math that matters. You seem to think there is some platonic concept of information that all of this math is trying to capture. But there isn’t. It’s the other way around. There are many mathematical approaches, each of which use the word ‘information’ in its own way.

If you want to see what sort of math the word ‘information’ has been applied to outside of the ID community, see the SEP article on information I linked earlier. There is also this book if you do not already have it:

The proposed Wiki will provide a summary of the math the various ID approaches use, I believe.

ETA: I think the same applies to the so-called “Law of Conservation of Information”. There are various mathematical results, like Levin’s, which have been called by that name, but they are mathematical only. There is also a well-defined concept of conservation of quantum information in fundamental physics. But as for a law of conservation of biological information: as far as I can tell, there is no such concept that has any basis in science.


@mung I have an edited version of this thread in mind for an article:

Mung's Primer on Information Theory :slight_smile:

So much WRONG in one small space. There’s an article in that for us.


You know I do, lol! :wink:

Like this?

The definition of semantic information is information that has meaning.

I mean, perhaps that is the definition of “semantic information,” but what is information that has no meaning?

Meaningless information is an oxymoron.

Well, in the spirit of Monty Python, one could argue that Shannon Information is nothing more than mathematics, and mathematics is syntax only, without semantics (unless one is a platonist, perhaps, but that kind of stuff is too abstract for me).

Or better, the semantics of a mathematical symbol is provided by the interpretation associated with its scientific usage. Until then, it is syntax only.


LOLOLOLOLOL!!! :smile: :rofl::joy::sweat_smile::star_struck::upside_down_face::frog:

You remind me of a paradox of Information Theory:

“What is the smallest number that cannot be described in less than 1000 bits?”

So when you find the number that needs at least 1000 bits to describe it, the paradox kicks in, because the sentence above is 80 characters or ~640 bits. So the smallest number that cannot be described in less than 1000 bits is already described by that 640-bit sentence.

Conveying meaningful information always implies that the sender and receiver have some prior agreement on how to interpret messages. When it comes to DNA and information, I think the only possible interpretation is the laws of chemistry and physics.


Huh? Not following this at all.

Shannon information is as much engineering as it is mathematics. Shannon did work for the telephone company.

Yes, Shannon information is often said to be “syntactic information”, and Shannon mostly avoided semantics. The basic principle: if you can encode your semantics in syntax, then Shannon’s theory can be the basis for transporting that syntax and thus supporting communication.
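To make the “syntactic” point concrete, here is a small illustrative sketch (my own example, not from anyone in the thread): Shannon entropy is computed purely from symbol statistics, so two messages with the same letter frequencies score identically no matter what either of them means.

```python
import math
from collections import Counter

def shannon_entropy(msg: str) -> float:
    """Average bits per symbol, from the empirical symbol frequencies."""
    counts = Counter(msg)
    total = len(msg)
    # H = sum p * log2(1/p); sees only symbol statistics, never meaning
    return sum((c / total) * math.log2(total / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 -- one symbol, no surprise
print(shannon_entropy("abab"))  # 1.0 -- two equiprobable symbols
```

A meaningful English sentence and a shuffled copy of it get exactly the same entropy, which is one way to see why Shannon’s measure is called syntactic rather than semantic.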

I don’t really agree that mathematics is syntax only. The idea of mathematics as syntax only would be formalism. I’m not a formalist. But I’m also not a platonist (I’m a fictionalist), and I don’t see it as syntax only.


It’s a coding/compression paradox. The description “simplest message that cannot be encoded in less than 1000 bits” is itself a description of that message in less than 1000 bits.

Looking for a reference to this paradox, I stumbled on a wonderful article about Information and Meaning:

Darnit, another great article I didn’t mean to find …


This is related to the proof that Kolmogorov complexity is uncomputable. Notice that the sentence simultaneously:

  1. Identifies a specific number in less than 1000 bits.
  2. Declares that this number cannot be described in less than 1000 bits.

Both cannot be true at the same time.
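You can play out the Berry-style search with a toy, computable stand-in for description length (this sketch and its `desc_bits` function are my own illustration; real Kolmogorov complexity is uncomputable, which is exactly the point). The loop below is itself a short program that pins down “the first number whose toy description exceeds the threshold”, so if the toy measure were true Kolmogorov complexity, the two numbered claims above would collide.

```python
import zlib

THRESHOLD = 100  # bits

def desc_bits(n: int) -> int:
    # Toy "description length": size in bits of the zlib-compressed
    # decimal string. A computable stand-in for Kolmogorov complexity.
    return 8 * len(zlib.compress(str(n).encode()))

# Berry-style search: find the first number whose toy description
# exceeds THRESHOLD bits. This short loop itself "describes" that
# number -- the contradiction that makes real K(x) uncomputable.
n = 0
while desc_bits(n) <= THRESHOLD:
    n += 1
print(f"first n with toy description > {THRESHOLD} bits: {n}")
```

No contradiction actually arises here, because zlib length is only an upper bound on description length; the paradox bites only for the true (and therefore uncomputable) minimal description.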


I would say that the math of Shannon is separate from its application in communications.

But I do agree with your points about the philosophy of math allowing one to critique the view that math is only syntax. Philosophy of math is something I only know the basics of. Nothing new there for me, I suppose.

In any event, that’s why I included the caveat about Monty Python: it’s of course an allusion to their Argument Sketch, where reasoned discussion devolves into argument for its own sake using mere contradiction.


I get it. Thanks.

No it doesn’t! :laughing:


@Dan_Eastwood, the Richard-Berry Paradox?


Yes, that’s it!

No it isn’t.

Yes it is!

No it isn’t!