What is the ID Definition of Information?

Would you go so far as to say that what Dembski calls “information” is what the rest of the mathematical world calls lack of information, i.e., entropy or uncertainty?

In all sincerity, I have no idea what Dembski was thinking with this 2005 definition. It’s not information; it’s an expected value (which is the criticism in my blog post). Dembski considers it to be a probability, but it can be GREATER THAN ONE, and Dembski uses it this way in an example.
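To see the problem concretely, here is a minimal sketch of the 2005 quantity as I read it: χ = −log2[10^120 · φ_S(T) · P(T|H)]. The inner product is an expected number of matches, not a probability, so nothing stops it from exceeding 1. The input values below are invented purely for illustration:

```python
import math

def dembski_chi(phi_s: float, p_t_given_h: float, m: float = 1e120) -> float:
    """Dembski (2005): chi = -log2(M * phi_S(T) * P(T|H)).

    The inner term M * phi_S(T) * P(T|H) is an expected count of hits,
    not a probability, so it is free to exceed 1, at which point chi
    goes negative.
    """
    inner = m * phi_s * p_t_given_h
    print(f"inner term = {inner:.3g}  (a 'probability' greater than 1?)")
    return -math.log2(inner)

# Hypothetical inputs: a modestly compressible pattern with a
# not-astronomically-small probability.
print(dembski_chi(phi_s=1e5, p_t_given_h=1e-100))  # chi ~ -83 bits
```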

Given these oddities, CSI = WTF. I couldn’t believe Dembski would write something that wrong, and I spent a year following up on references and reading up on information theory to make sure I hadn’t misunderstood him somehow. Other authors seem to have missed this problem, possibly because it is fixable. It’s very difficult to understand why Dembski never fixed this himself.

Be careful what you ask for!

Elsberry, W., & Shallit, J. (2011). Information theory, evolutionary computation, and Dembski’s “complex specified information”. Synthese, 178(2), 237-270.

Devine, S. (2014). An algorithmic information theory challenge to intelligent design. Zygon, 49(1), 42-65. https://onlinelibrary.wiley.com/doi/epdf/10.1111/zygo.12059

Rosenhouse, J. (2016). On mathematical anti-evolutionism. Science & Education, 25(1-2), 95-114.

(also search for “Evolutionblog”)

editing in progress - mostly done but I have to step away …

2 Likes

I’d be happy to help. But I suspect I am qualified only as a reviewer, not an originator, since I have no experience in these fields. These are just topics I find interesting to read about as a retirement activity.

It will be a group effort. I am an expert in this area, and we have the originators of these arguments frequenting these forums. So I expect we have one of the few corners of the internet where we can produce a well-informed and self-correcting wiki for this question, one that has a shot at being fair to ID proponents while also explaining how mainstream scientists find deficiencies in their case.

I’d like to do this, but I need to be careful about what I commit to, as I have a bad tendency to start more projects than I can finish. :wink:

I have a lot of links and resources collected, at the very least, starting with the above.

Thanks Dan. No rush – there are a lot of citations in just those few papers you provided…

The Rosenhouse paper was similar to what I had in mind, but briefer and with more math. The wiki idea sounds good.

1 Like

@BruceS and @Dan_Eastwood, can you confirm you can edit this wiki page? The ID Arguments From Information Theory

Let’s see what you can put together with a little effort. All Level 3 users should be able to edit it there.

@glipsnort and @John_Harshman would you be willing to summarize the genetic evidence pertaining to human origins in another wiki?

If we can start building strong reference material this way, I will make this far more prominent on the site. We are one of the few places that has the expertise to do this right, in a space trusted by evolution skeptics. I’d hope we could make more summaries of salient topics, neutrally explained, outside the back and forth of debate.

4 Likes

I can edit/post, but the topic is closed, so that might be my mod privileges letting me do that?

You are supposed to edit the main post, not add new posts. Perhaps start by making an outline and adding key references.

OK. I’ll move my discussion questions back here.

OK.

Do you refer to the evidence for common descent of humans and other apes?

I’d say the two key classes are:

Genetic evidence of common descent of man.

Genetic evidence and historical population size.

@BruceS I’d like to start with your summary of the types of information above, and add CSI. Dembski has more than one definition of CSI, and we need to make this clear. The Elsberry 2011 paper is a great resource.

This topic has a lot to draw on too:

1 Like

I added a line under Dan’s. I hope that is what you were looking for.

Is there a set of tags intended for wikis, based on the forum software?

It works exactly the same as the posts here. I’d just dive in and get the content there. Let’s see what you can put together and get organized. We’ll figure out the right style guide when you make some progress. If you feel the need to rebel against this approach, and want a real wiki, I might be able to set that up for you. This is still a good starting point.

My library of ID books is not inconsequential. Imagine that. :slight_smile:

So if anyone would like me to look something up…

…there are important distinctions to be made when talking about information in DNA. In the first place it’s important to distinguish information defined as “a piece of knowledge known by a person” from information defined as “a sequence of characters or arrangements of something that produces a specific effect.” Whereas the first of these two definitions of information doesn’t apply to DNA, the second does. But it is also necessary to distinguish Shannon information from information that performs a function or conveys a meaning. We must distinguish sequences that are (a) merely improbable from sequences that are (b) improbable and also specifically arranged so as to perform a function. That is, we must distinguish information carrying capacity from functional information.

  • Meyer, Stephen C. Signature in the Cell. pp. 91-92
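To make the quoted distinction concrete, here is a minimal sketch contrasting a sequence’s carrying capacity (its Shannon capacity) with functional information in the Hazen/Szostak sense, FI = −log2 of the fraction of sequences that perform the function. The 1-in-10^12 figure below is invented purely for illustration:

```python
import math

ALPHABET = 4    # DNA bases
LENGTH = 100    # sequence length

# Carrying capacity: how many bits a sequence of this length COULD hold.
capacity_bits = LENGTH * math.log2(ALPHABET)          # 200 bits

# Functional information (Hazen/Szostak): -log2 of the fraction of all
# sequences that actually perform the function. We ASSUME, purely for
# illustration, that 1 in 10^12 sequences works.
fraction_functional = 1e-12
functional_bits = -math.log2(fraction_functional)     # ~39.9 bits

print(f"carrying capacity     : {capacity_bits:.1f} bits")
print(f"functional information: {functional_bits:.1f} bits")
```

The two numbers answer different questions, which is why conflating them invites equivocation.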

Am I the only one that thinks we need to document the various other definitions of information that don’t issue from ID literature and that we need to distinguish between definitions of information and proposed measures of information?

Did Shannon provide a definition of what information is?

Yes he did. The problem, @Mung, is that there are several linguistic glosses over the math that have conflicting semantic meanings. The math, however, is constant. That indeterminacy of meaning allows grand equivocations that are very hard to untangle if all you are working with is words. The precise formulas being used are what clarifies what the heck they are talking about. And it also clarifies the errors being made.
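For reference, a minimal sketch of the two standard Shannon quantities that usually hide behind the word “information”: the surprisal of a single outcome, and the entropy of a distribution (its expected surprisal). The words shift, but the math does not:

```python
import math

def surprisal(p: float) -> float:
    """Information content of one outcome with probability p: -log2(p)."""
    return -math.log2(p)

def entropy(dist: list) -> float:
    """Shannon entropy: expected surprisal, H = -sum(p * log2(p))."""
    return sum(p * surprisal(p) for p in dist if p > 0)

print(surprisal(0.5))         # fair coin flip: 1.0 bit per outcome
print(entropy([0.5, 0.5]))    # expected information: 1.0 bit
print(surprisal(0.1))         # rare outcome: ~3.32 bits (more 'informative')
print(entropy([0.9, 0.1]))    # but the biased coin's entropy drops: ~0.47
```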

So yes, that is Meyer’s definition of information. It turns out to be incoherent and self-contradictory with how he actually applies it.

3 Likes

As one example, from another thread, an ID proponent thought that:

log(N/W) = FI = KL distance = ΔH.

1. He treats FI as if it were defined by ΔH (which is an error).

2. Several times over a year, I asked him how he interprets getting a negative number for ΔH, trying to clarify that this is an error.

3. He responded that KL is never negative and that I was uninformed about information theory, to which I pointed him to the definition (by my count, about 5 times). It was only when I wrote out the definitions and pointed to the errors that he acknowledged that ΔH can be negative.

4. Now, about 2 years into the dialogue, we can finally return to my original question: how does he interpret a negative FI? This is a totally incoherent concept, which demonstrates why ΔH is not the correct equation. It works only in a narrow case, when several (false) assumptions are made; see the sketch below this list.
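Here is a minimal sketch of why the identification fails; the distributions are invented. Against a uniform reference, D(p‖uniform) = log2 N − H(p), which equals the entropy drop exactly; that is the narrow case where the shortcut appears to work. Run the change the other way and ΔH goes negative, while KL stays non-negative:

```python
import math

def entropy(p: list) -> float:
    """Shannon entropy H(p) = -sum p_i log2 p_i."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl(p: list, q: list) -> float:
    """KL divergence D(p||q) = sum p_i log2(p_i / q_i); >= 0 by Gibbs' inequality."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

uniform = [0.25] * 4
peaked  = [0.70, 0.10, 0.10, 0.10]

# Narrow case: against a uniform reference, D(p||uniform) = log2(N) - H(p),
# which is exactly the entropy drop. Here the shortcut SEEMS to work.
print(kl(peaked, uniform))                  # ~0.643
print(entropy(uniform) - entropy(peaked))   # ~0.643, same number

# General case: run the change the other way. Delta-H goes negative,
# while KL remains non-negative, so they cannot be the same quantity.
print(entropy(peaked) - entropy(uniform))   # ~-0.643  (a 'negative FI'?)
print(kl(uniform, peaked))                  # ~0.620, still >= 0
```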

Critical here are the math, the precise equations being used, and the precise implementations of the programs used to validate claims. If you focus only on verbal definitions, it will always be confusing. If you struggle with the math, this will always be confusing.

1 Like

I don’t see Meyer saying that this or that is “the ID definition of information.” I interpret that as Meyer pointing out that we need to be sensitive to what exactly we are talking about. It could change depending on the context.

Perhaps searching for “the ID definition of information” is doomed to failure, because it may not exist.