The EricMH Information Argument and Simulation

Exactly.

Not according to Shannon. :smile: Why not try to test it with a simulation? It will demonstrate that I am correct here. Right?

Um, no. That is not the counterexample. You have a lot of questions to work through. Looking forward to seeing your answers. Until then, you haven’t even made a clear argument.

Entropy is a measure of uncertainty, the opposite of information.

What is the counterexample?

Still waiting on answers to the questions above.

Working on it. You gave me many more questions than I gave you. We should establish some sort of question-time reciprocity: if I spend T time answering your questions, you spend the same answering mine. Otherwise, you can just throw out an arbitrary number of questions as a prerequisite to answering any of mine, and progress is never made.

If we did that, it would be far too long before I’d have to answer any of your questions. I’ve already engaged with you quite a bit answering questions. You are trying to make a case. It is not clear. I’m telling you what I need to know just to understand what you are getting at. You don’t have to answer if you don’t care about making your case.

We are both making cases, and my questions to you are basic. I’m not expecting drawn out proofs. You say something other than chance and necessity can generate information, which is also not a halting oracle. What is it?

What do you mean by “information”? Entropy is not information. I’ve provided a clear definition of information as “mutual information”.

Well, as I teach it at a leading science university, entropy is precisely information, just as Shannon explained in his seminal work, and as has been demonstrated in simulation after simulation. Mutual information is merely shared entropy. This is demonstrable with simulation and math, and it is included in every standard textbook (though entropy goes by many names). The formulas are literally identical.

So whatever you mean by information theory, it is something different from what we teach students at Washington University. You’ll have to lay out the formulas and show your derivations.
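
Since simulation keeps coming up, a minimal sketch of the kind of check being suggested is below, assuming nothing beyond NumPy; the noisy-copy setup and the variable names are illustrative only, not anyone’s actual model. It estimates H(X), H(Y), and H(X,Y) from samples and recovers the mutual information purely from entropies, i.e., as shared entropy:

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(counts):
    """Shannon entropy (in bits) of the empirical distribution given by counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Simulate two correlated binary variables: Y is a noisy copy of X.
n = 100_000
x = rng.integers(0, 2, size=n)
flip = rng.random(n) < 0.1              # flip X with probability 0.1
y = np.where(flip, 1 - x, x)

# Empirical marginal and joint entropies.
h_x = entropy(np.bincount(x))
h_y = entropy(np.bincount(y))
h_xy = entropy(np.bincount(2 * x + y))  # encode each (x, y) pair as one symbol

# Mutual information written entirely in terms of entropies.
mi = h_x + h_y - h_xy
print(f"H(X)={h_x:.3f}  H(Y)={h_y:.3f}  H(X,Y)={h_xy:.3f}  I(X;Y)={mi:.3f}")
```

Setting the flip probability to 0.5 makes X and Y independent, and the estimated I(X;Y) drops to roughly zero.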

One more thing. I’m not making a case. I am just stating a hard cold fact:

I am an expert in information theory whose scientific work applies information theory to solve problems in biology, chemistry, and medicine, and who trains PhD students in how to do this too. From that vantage point, none of the information theory arguments against evolution make any sense to me. Most seem to be in clear mathematical error.

Maybe I am wrong, but that is not an argument. That is just a brute fact. If you want to change it, you have to make your case. I’ll look at it, but I have no expectation or need to convince you that you are wrong. So I am not making a “case”.

When people talk about information in DNA, what they are referring to is DNA that is functional in some environment. There is a correlation between the DNA and environment, and that is mutual information. Entropy must be high for there to be mutual information, but high entropy alone does not entail mutual information between DNA and environment.
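
A toy version of that distinction can also be simulated (a sketch only; the “environment” and “genome” sequences below are arbitrary binary strings, not a model of real biology). Both genomes have essentially maximal per-symbol entropy, but only the one generated in response to the environment shares mutual information with it:

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(x, y):
    """Empirical mutual information (bits) between two binary sequences."""
    def H(counts):
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))
    return (H(np.bincount(x, minlength=2))
            + H(np.bincount(y, minlength=2))
            - H(np.bincount(2 * x + y, minlength=4)))

n = 100_000
env = rng.integers(0, 2, size=n)                               # the "environment"
random_genome = rng.integers(0, 2, size=n)                     # high entropy, independent of env
adapted_genome = np.where(rng.random(n) < 0.05, 1 - env, env)  # high entropy, tracks env

print("I(random genome; env)  =", round(mutual_information(random_genome, env), 3))
print("I(adapted genome; env) =", round(mutual_information(adapted_genome, env), 3))
```

The first estimate comes out near zero and the second well above it, even though both genomes have the same entropy.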

I think people are talking past each other. Just for clarification about ‘mutual information’ - From Scholarpedia:

Mutual information is one of many quantities that measures how much one random variable tells us about another. It is a dimensionless quantity with (generally) units of bits, and can be thought of as the reduction in uncertainty about one random variable given knowledge of another. High mutual information indicates a large reduction in uncertainty; low mutual information indicates a small reduction; and zero mutual information between two random variables means the variables are independent.

I’m not sure how to connect this with ‘information non-growth’ to tell us about changes in the information (functional) content of an organism, but I guess that’s part of a proof.

That turns out not to be true. Yes, that is what you mean, and what most ID proponents meant. There is, however, no settled information theory concept of “function.” Usually when we discuss information in DNA, all we mean is entropy.

Yup that’s right. I want to know what he means by all his terms, because he uses them in very different ways than do I. Your definition too is consistent with information = entropy and mutual information = shared entropy. Notice in that definition that “randomness” and “entropy” and “information” are semantically equivalent.

Well, not my definition. I’d rephrase ‘mutual information’, probably imprecisely, as similar to the correlation between numerous variables.

For sure, ‘functional’ information is a difficult thing to describe.

Regardless of whether there is a precise definition of function, it is clear that randomly generating DNA by flipping a coin is not going to create organisms. So entropy is an insufficient definition of the information in DNA. Otherwise, a randomly generated string of DNA has just as much information as a functional string of DNA, which is clearly false.

But, let’s leave aside the question of function. Let’s just talk about complex structure. This can be precisely defined as the Kolmogorov minimal sufficient statistic, and it cannot be generated by random and stochastic processes.

E(X) is some program E applied to X, so it is the same as U(E.X), where U(p) means executing the program p on the universal Turing machine U.

So the algorithmic mutual information that E can generate from X regarding Y is I(E,X:Y).

If $I(E:Y)=0$ then, since $I(X:Y) \leq I(E,X:Y) \leq I(X:Y) + I(E:Y)$, this means $I(E,X:Y) = I(X:Y)$.

If $I(E:Y)>0$, then the issue is not solved: we now need some source of $I(E:Y)$, which also cannot be algorithmic/stochastic; otherwise the problem just gets pushed back one more level.
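
To keep the thread readable, the steps in the last few posts can be collected in one display. This only restates the claims as given, including the asserted middle bound; nothing new is derived here:

$$
E(X) = U(E.X), \qquad
I(X:Y) \;\leq\; I(E,X:Y) \;\leq\; I(X:Y) + I(E:Y),
$$

so if $I(E:Y) = 0$ the two ends of the chain coincide and $I(E,X:Y) = I(X:Y)$, while if $I(E:Y) > 0$ the question becomes where that quantity came from.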

I define complexity by the Kolmogorov minimal sufficient statistic. If it is high, then the organism is complex. If it is low, then the organism is simple.

The equation means evolution cannot turn X into Y, since in that case the mutual information between X and Y would be maximal.

There are two possibilities: information is entropy or information is mutual information.

If we use entropy, then a random bitstring contains just as much information as a highly ordered bitstring with the same symbol distribution, which is clearly incorrect.

If we use mutual information, then a random bitstring will contain minimal mutual information regarding any other bitstring but itself. However, an orderly bitstring can contain mutual information regarding structured bitstrings.
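
The first point can be made concrete with a small simulation (a sketch only: zlib-compressed length is used here as a crude, computable stand-in for Kolmogorov complexity, which is not itself computable, and the particular strings are arbitrary). Two bitstrings with identical 0/1 frequencies have the same empirical entropy, yet the ordered one compresses to almost nothing:

```python
import zlib

import numpy as np

rng = np.random.default_rng(2)

def entropy_per_symbol(bits):
    """Per-symbol Shannon entropy of the 0/1 frequencies in a bitstring."""
    counts = np.bincount(bits, minlength=2)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log2(p)))

n = 100_000
random_bits = rng.integers(0, 2, size=n)   # fair-coin bitstring
ordered_bits = np.tile([0, 1], n // 2)     # same 50/50 frequencies, highly ordered

for name, bits in [("random", random_bits), ("ordered", ordered_bits)]:
    compressed = len(zlib.compress(np.packbits(bits).tobytes(), 9))
    print(f"{name:8s} entropy/symbol = {entropy_per_symbol(bits):.3f}, "
          f"compressed size = {compressed} bytes")
```

Both strings report the same per-symbol entropy; only the compressed sizes tell them apart.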

“Evolution” is a vague word. It might mean Darwinian evolution. It might mean the evolution posited by the early Greek philosophers. It could mean Wallace’s guided evolution. Some forms of evolution are less problematic, such as Lamarckism, animal breeding, or genetic engineering, since they involve the interaction of a directing agent. Others are problematic, because they claim complex organisms come about without any kind of directedness. I am arguing that the latter sort of evolution is making a mathematical claim that is false.

This latter sort of evolution is generally characterized as being some combination of algorithmic and stochastic processing. It is proven that no amount of algorithmic and stochastic processing can increase mutual information. Thus, the latter sort of evolution is incapable of generating complex structure.
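
The Shannon analogue of that claim, the data-processing inequality, is easy to probe numerically. The sketch below illustrates only that analogue, not the algorithmic-information version of the theorem, and the two processing steps are arbitrary examples of algorithmic and stochastic processing applied to X without access to Y:

```python
import numpy as np

rng = np.random.default_rng(3)

def mutual_information(x, y, k=4):
    """Empirical mutual information (bits) between sequences over {0, ..., k-1}."""
    def H(counts):
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log2(p))
    return (H(np.bincount(x, minlength=k))
            + H(np.bincount(y, minlength=k))
            - H(np.bincount(k * x + y, minlength=k * k)))

n = 200_000
y = rng.integers(0, 4, size=n)                                    # the "target" variable
x = np.where(rng.random(n) < 0.2, rng.integers(0, 4, size=n), y)  # noisy observation of Y

# Post-process X algorithmically/stochastically, without looking at Y.
z1 = (x + 1) % 4                                                   # deterministic relabelling
z2 = np.where(rng.random(n) < 0.3, rng.integers(0, 4, size=n), x)  # inject extra noise

print("I(X;Y)  =", round(mutual_information(x, y), 3))
print("I(Z1;Y) =", round(mutual_information(z1, y), 3))  # relabelling: essentially unchanged
print("I(Z2;Y) =", round(mutual_information(z2, y), 3))  # extra noise: strictly smaller
```

No choice of post-processing here pushes the estimate of I(Z;Y) above I(X;Y).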

It is asymptotically equivalent to Shannon’s variety of mutual information, and can be reasoned about more precisely.

I am using theoretical algorithmic information. The fact that it is not computable is the problem that evolution faces.

This is my own argument.

If maximum entropy = maximum information, then the heat death of the universe is the most informative state of all. Time to find a new theory of information (but it will be too late by then).

A question I’ve asked before at the Hump (and maybe even here). Everything we do here, and nearly everything every one of us does in our daily physical round, is based on information in some functional, rather than Shannon, sense.

Yet it evades scientific definition (as it does in the equally self-evident information-rich area of living beings).

Why would that be? There’s nothing wrong with the information, but it seems invisible to the science. Curious.

Can you describe what this means?

Great. That helps immensely. I see the error. So what simulations have you done to check your argument? Can you design one? I certainly can.

What’s wrong with mutual information?

Evolution contributes to which Y is generated by E(X).