The EricMH Information Argument and Simulation

As I said, I’ll engage with you later. Give it a couple weeks and ask me again when it seems things are slow. If you still think that’s hand waving, you are not paying attention.

You seem sincere so I will give you a few hints. If you follow up on them it will help you immensely.

First of all, it is clear, from the specific words you use, that a large amount of your exposure to information theory is through ID and Apologetics work. Start learning about the theory from other web sources, totally independent of questions of biology.

Second, look at The Role of Simulation in Science. Use some basic programming skills (I suggest Python) to test your intuition and understanding of information theory; see the sketch below for the kind of exercise I mean. Part of your difficulty is that your intuitions are not calibrated correctly, and you are relying on semantic reasoning rather than math to guide you.

Third, if you don’t know how to program well enough to test your ideas, learn how to program. There are free tutorials online, and https://repl.it/ is an easy way to start testing simple snippets.

Fourth, search the BioLogos forum for what I’ve written on this in the past. Test what I am saying with simulation if you doubt me.
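For example, here is a minimal sketch of the kind of calibration exercise I mean (the per-site mutation rate and sequence length are arbitrary choices for illustration): copy a bitstring through a noisy channel and check that the empirical per-site noise entropy matches the analytic binary entropy $H(p)$.

```python
import math
import random

def binary_entropy(p):
    """Analytic entropy H(p) of a biased coin, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Toy model of mutation: copy a random bitstring through a channel
# that flips each bit independently with probability p.
p = 0.01          # per-site mutation rate (arbitrary, for illustration)
n = 100_000       # sequence length (arbitrary)
parent = [random.randint(0, 1) for _ in range(n)]
child = [b ^ (random.random() < p) for b in parent]

# The empirical flip rate should match p, and the per-site noise
# entropy H(p_hat) should approach the analytic H(p).
p_hat = sum(a != b for a, b in zip(parent, child)) / n
print(f"empirical flip rate: {p_hat:.4f}")
print(f"empirical H(p_hat):  {binary_entropy(p_hat):.4f} bits/site")
print(f"analytic  H(p):      {binary_entropy(p):.4f} bits/site")
```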

Let’s see how far you get with that in a couple weeks. I’ll help you along from there.


Eric, please let me know what you think of Claude Shannon, the father of Information Theory. Have you studied his papers? Are you okay with the math?


Yes, I’ve read his original paper and Cover and Thomas’ book on information theory. I am a software developer by trade, and have a PhD in Computer Engineering. Part of my dissertation was proving “open ended evolution” is impossible due to Chaitin’s Incompleteness Theorem. So, not completely ignorant in these matters :)

The most important rule I referenced is the law of information non-growth. It is concisely proven in Leonid Levin’s paper, section 1.2. This refutes the possibility that naturalistic processes can increase mutual information.
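For reference, the inequality being cited, in the notation that appears later in this thread, says that for any computable function $f$ and strings $x$ and $y$,

$$I(x:y) \geq I(f(x):y).$$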

You shouldn’t make so many assumptions about those whom you speak with. Most of my exposure to information theory comes from reading information theory text books, primarily Cover and Thomas, and Li and Vitanyi, and the papers of Leonid Levin.

Great, then you’ve done that part already. There are still the rest.

You’re not the only one with a family and day job :)


Well aware of Levin’s paper. Now how does it connect to biological systems and gene flow?

So it helps that you do have some education here. Seems like you know how to program, and I’m glad to hear you have done some work here. That is great. If you’d like, even though I do not have time to engage, you could articulate what you think the key information argument against evolution is. I might not be able to answer you right away, but when I have time I will try. You can bug me about it in a couple weeks.

Ok, I may be able to keep up with you. One of my first seminars when I joined Bell Labs was given by Claude Shannon. I may be a little rusty on it, but I am up to date on quantum information theory as it relates to quantum computing, entanglement, and teleportation (not the Star Trek kind, but the information-teleportation kind). I am also familiar with Hawking’s information theory arguments related to black holes. So I think I can perhaps follow you as you apply information theory to biological entities and the propagation of information through genes, with mutations generation after generation.

I(x:y) is the algorithmic mutual information between x and y.

If X is some early simple biological organism and Y is a later complex biological organism, and evolution (E) is some combination of algorithmic processing and randomness injection, then $I(X:Y) \geq I(E(X):Y)$. So, this says evolution doesn’t provide any help in turning X into Y. If X turns into Y, it is only because Y is the natural outcome of X, and evolution has nothing to do with it. In fact, the only thing evolution can do is prevent X from turning into Y.
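As a concrete starting point for testing this claim, here is a minimal sketch of the kind of simulation suggested upthread. True algorithmic mutual information is uncomputable, so this substitutes zlib-compressed length as a crude computable upper bound on $K(\cdot)$ and uses the approximate identity $I(x:y) \approx K(x) + K(y) - K(x,y)$; the strings, mutation rate, and choice of compressor are all arbitrary illustrative assumptions, and nothing here proves or refutes the inequality:

```python
import random
import zlib

def K(s: bytes) -> int:
    """Computable stand-in for Kolmogorov complexity:
    zlib-compressed length, an upper bound on the true K(s)."""
    return len(zlib.compress(s, 9))

def mutual_info(x: bytes, y: bytes) -> int:
    """Compression-based proxy for algorithmic mutual information,
    via the approximation I(x:y) ~ K(x) + K(y) - K(x, y)."""
    return K(x) + K(y) - K(x + y)

def E(x: bytes, rate: float) -> bytes:
    """Toy 'evolution' operator: flip each bit with probability rate."""
    out = bytearray(x)
    for i in range(len(out)):
        for bit in range(8):
            if random.random() < rate:
                out[i] ^= 1 << bit
    return bytes(out)

random.seed(0)
x = b"ACGT" * 500        # a highly regular stand-in for the "simple" organism
y = x[::-1]              # a "target" that shares structure with x
ex = E(x, 0.01)          # x after one round of random mutation

print("I(x:y)    ~", mutual_info(x, y))
print("I(E(x):y) ~", mutual_info(ex, y))
```

Running variations of this (different targets, rates, and compressors) is the kind of calibration exercise suggested earlier in the thread.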


So this is a mix of well-established notation and terms with very clear meaning, and biological terms with no precise analogue in information theory. The core of your claim is the equation, which is (at this point) merely an unjustified and untested hypothesis. Several things are missing from this argument:

  1. Demonstration that this equation is correct, by proof, simulation, or both: $I(X:Y) \geq I(E(X):Y)$.

  2. Definition of “complex organism” and “simple organism” in precise information-theoretic terms. This gets to one of the common fudge terms in most ID arguments: “complexity.” You have to define this with ruthless precision, and ensure it maps precisely to the information-theoretic analogue you are claiming. Failing this, the argument is not valid.

  3. A logical explanation of why it matters if the equation is true: $I(X:Y) \geq I(E(X):Y)$. If you are using it the way it seems you are, it is a tautology followed by a non sequitur, almost like saying “because 2 > 1 by definition, evolution is false.” Though you have not even laid out your reasoning, so it is not clear.

  4. Related to the prior point, an explanation of why mutual information is the relevant quantity here.

  5. You defined evolution as “some combination of algorithmic processing and randomness injection.” This is an idiosyncratic definition that neither matches how biologists define evolution nor maps onto a precise information-theoretic object.

  6. An explanation (and demonstration) of why algorithmic mutual information is the right type of information for this question.

  7. There is a critically important distinction between theoretic algorithmic information (which is uncomputable) and computable algorithmic information (which is always greater than the former). The two are very different, and algebraic manipulations using each one are different. Which one is being referenced here? It is common in ID arguments to switch between these two definitions (without notice) in the proofs or application.

  8. If this is your own argument, fine. If you are trying to communicate one of Dembski’s, Marks’s, or Durston’s arguments (etc.), something is being lost in translation. It would be helpful to know if you think you are repeating an established argument, or putting forward one of your own.

At minimum, those questions need to be answered to even make sense of what you are putting forward. You have a lot of work to do to even lay out your case, let alone make it.

Finally, I’m fairly certain that I can produce a clear counter example that falsifies the equation, at least according to the rules of information theory.


In evolution nothing turns into anything. Species evolve by populations acquiring mutations that are naturally selected. That’s it. I am still not getting the connection between mutual information and evolution via natural selection, or even neutral evolution. I understand the math and agree with the mathematical conclusions, but I am having difficulty applying the math to real biological systems containing information: what we start with, and how we end up with any new information from previous generations.

Can you define the information flow? What is the source of the information (X), and what is the receiver of the information (Y), with I(X,Y) the mutual information between the two? I get this. But I don’t know how you are moving the information. How about taking just one cell that mutates: what is the mutual information loss of the mutation?
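One way to make this concrete is a worked per-site example under a Shannon-style model (the uniform parent distribution and symmetric substitution model are simplifying assumptions for illustration, not anything established in this thread): a site that mutates with probability $\mu$, to one of the other three bases chosen uniformly, carries $I(X;Y) = 2 - H(\mu) - \mu \log_2 3$ bits of mutual information between parent and daughter.

```python
import math

def mi_per_site(mu: float, alphabet: int = 4) -> float:
    """Mutual information (bits) between a parent base X, uniform over
    the alphabet, and a daughter base Y, where the site mutates with
    probability mu to a uniformly chosen different base."""
    h_y = math.log2(alphabet)              # Y stays uniform by symmetry
    if mu == 0.0:
        return h_y                         # perfect copy: I = H(X)
    h_y_given_x = (-(1 - mu) * math.log2(1 - mu)
                   - mu * math.log2(mu / (alphabet - 1)))
    return h_y - h_y_given_x

for mu in (0.0, 1e-3, 1e-2, 0.1):
    print(f"mu = {mu:<6} -> I(X;Y) = {mi_per_site(mu):.4f} bits/site")
```

On this model, the mutual information loss from mutation at one site is the drop from $\log_2 4 = 2$ bits to the value above.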


All that he is saying is that information is lost to the biological system by evolution. That is obvious, as mutations represent low-fidelity flow of information from one generation to the next.

Apply this to the simplest example: one gene mutation, from one cell to its daughter cell. Please tell me what you think the information exchange is for that example.


Nothing he has written is clear enough to draw that conclusion. I also do not know that it is “obvious” that information is “lost” to the biological system. That does not make a lot of sense to me. My questions still stand.

Here is my example. A single mutation changes a single gene. Information is lost, not gained.

Umm. I’m not sure what definition you are using. In information theory, as it would usually be understood (at least by me), information is gained by definition.
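A hedged way to see the “gained by definition” point in Shannon terms (the noisy-channel model here is an illustrative gloss, not anything established in this thread): if mutation flips a site of the parent $X$ with probability $p$, the daughter $Y$ satisfies

$$H(X,Y) = H(X) + H(Y \mid X) = H(X) + H(p) > H(X) \qquad (0 < p < 1),$$

where $H(p)$ is the binary entropy function. The random mutation injects entropy (“information” in the entropy sense) into the system, even as the parent–daughter mutual information $I(X;Y) = H(Y) - H(p)$ decreases.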

To some degree it depends on how you define the function. If you mutate your best friend’s phone number, and the function is defined as calling your best friend Harry, then mutation will always be a loss of information. If you define function as calling any friend, then the mutation may be neutral as Harry’s number gets substituted for Fred’s. That is unlikely, but possible. If you are looking for a girlfriend, and your function is to find the most compatible relationship possible, then a beneficial mutation to the numbers of the girls in your cell phone is possible, but indeed improbable.

Genes can work on this same principle, depending on their defined function.


Entropy is not information!

What we call information is technically known as mutual information. Entropy is necessary for mutual information to exist, but high entropy does not entail mutual information actually exists.

Evolution can only cause mutual information to decrease. Likewise with the second law of thermodynamics.
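The standard Shannon-theoretic form of this claim is the data processing inequality: if each generation depends only on the one before it, so that $X \to Y \to Z$ is a Markov chain, then

$$I(X;Z) \leq I(X;Y).$$

(The algorithmic counterpart is the Levin inequality cited earlier in the thread.)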

That is why I don’t get the controversy over ID. Dembski is just repeating these well-established information laws. The only discrepancy is that he uses non-standard terminology.

What you could prove is that there is some probability of generating mutual information from a random source. But this probability drops off exponentially for each bit added. Beyond this, nothing more can be proven.
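If I am reading this right, the bound being paraphrased is Levin’s randomness conservation result, which (in my paraphrase, up to additive constants) says that for a uniformly random string $R$,

$$\Pr_R\big[\, I((x,R):y) > I(x:y) + k \,\big] < 2^{-k+O(1)},$$

so each extra bit of mutual information gained by chance roughly halves the probability.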

In my argument I’m referring to what you call theoretic algorithmic information. My formulation may be novel due to its reliance on non-novel definitions of information, but in essence it is no different from the standard ID argument.

Regarding the definition of complexity, I define it as the Kolmogorov minimal sufficient statistic (KMSS). Only compressible bitstrings have a non-trivial KMSS, but most bitstrings are incompressible, so their KMSS is trivial. Due to the law of information non-growth ($I(X:Y) \geq I(f(X):Y)$), combining randomness and determinism doesn’t help generate a KMSS any more than flipping a coin does.
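To close the loop with the simulation theme: the “most bitstrings are incompressible” point is easy to check with a real compressor. Here zlib is only a computable stand-in for the uncomputable ideal, so the compressed length is an upper bound on the true complexity (the distinction raised in point 7 above), and the string choices are arbitrary illustrations:

```python
import os
import zlib

def compressed_len(data: bytes) -> int:
    """Computable upper bound on Kolmogorov complexity: zlib length."""
    return len(zlib.compress(data, 9))

n = 10_000
random_bytes = os.urandom(n)       # incompressible with overwhelming probability
structured = b"ACGT" * (n // 4)    # highly regular string of the same length

print("random:     ", compressed_len(random_bytes), "of", n, "bytes")
print("structured: ", compressed_len(structured), "of", n, "bytes")
```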