The EricMH Information Argument and Simulation

@EricMH your simulation definitely demonstrates this is totally false. I’ll explain later. Keep in mind you had a hypothesis about what you’d see, and that hypothesis was disproven, and you do not even know why. The quicker you stop digging the better.

DNA is information about something, so it is mutual information between the code and the environment. If maximizing entropy were the only goal, then generating DNA with a random process would be enough to make evolution happen. This is why entropy is not the sort of information we are talking about here. It is only half of the equation:

H(X) = I(X;Y) + H(X|Y)
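A quick numerical check of that identity, with a made-up joint distribution (the sketch and the numbers are mine, not part of the simulation under discussion):

```python
from collections import Counter
from math import log2

# A made-up joint distribution of two correlated binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist) -> float:
    """Shannon entropy of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions of X and Y.
px, py = Counter(), Counter()
for (x, y), p in joint.items():
    px[x] += p
    py[y] += p

H_X = H(px)                        # total entropy of X
H_X_given_Y = H(joint) - H(py)     # H(X|Y) = H(X,Y) - H(Y)
I_XY = H_X + H(py) - H(joint)      # I(X;Y) = H(X) + H(Y) - H(X,Y)

# The decomposition H(X) = I(X;Y) + H(X|Y) holds exactly.
print(H_X, I_XY + H_X_given_Y)
```

The point of the decomposition: a random generator can max out H(X), but that says nothing about the I(X;Y) term.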

If you are really hung up on the word “information,” then we’ll just call the quantity “blarb”, which is the mutual information between DNA and environment.

My claim is evolution cannot generate blarb.

I stated this was due to stochasticity.

If I’m right, and I_{LZ}(X:Y) = I_{LZ}(E(X):Y) = 0 in general, then on average I_{LZ}(X:Y) > I_{LZ}(E(X):Y) should fail about half the time.

The other control is when X=Y. In this case, I_{LZ}(X:Y) > I_{LZ}(E(X):Y) should fail less than half the time.
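Here is a minimal sketch of what such a test could look like, assuming zlib’s deflate as a stand-in for the LZ compressor and random byte substitution as the stochastic map E(X). The actual simulation code was not posted in this thread, so every name and parameter here is illustrative:

```python
import random
import zlib

def C(s: bytes) -> int:
    """Compressed length: a crude stand-in for LZ complexity."""
    return len(zlib.compress(s, 9))

def I_lz(x: bytes, y: bytes) -> int:
    """Compression-based mutual-information estimate between x and y."""
    return C(x) + C(y) - C(x + y)

def E(x: bytes, rate: float = 0.05) -> bytes:
    """Hypothetical stochastic 'evolution' step: random byte substitution."""
    out = bytearray(x)
    for i in range(len(out)):
        if random.random() < rate:
            out[i] = random.randrange(256)
    return bytes(out)

def failure_rate(make_pair, trials: int = 200) -> float:
    """Fraction of trials where I_lz(X:Y) > I_lz(E(X):Y) fails."""
    fails = 0
    for _ in range(trials):
        x, y = make_pair()
        if not (I_lz(x, y) > I_lz(E(x), y)):
            fails += 1
    return fails / trials

random.seed(0)
LENGTH = 1000

def independent_pair():
    return random.randbytes(LENGTH), random.randbytes(LENGTH)

def identical_pair():
    x = random.randbytes(LENGTH)
    return x, x

indep = failure_rate(independent_pair)   # prediction: fails about half the time
ident = failure_rate(identical_pair)     # prediction: fails well under half the time
print(indep, ident)
```

With X = Y, compressing the concatenation of two identical strings is far cheaper than compressing the concatenation of a mutated copy with the original, so the inequality should hold in nearly every trial of that control.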

I’ll test this and get back to you.

Go ahead and test that.

Turns out you are wrong. It’s because of the error I found in your proof. Your key equation is just false.

The test performs as expected. What is the error?

28 posts were split to a new topic: Computing Information Content in Biology

Wow. I’ll explain later.

Do you really not see what is wrong? I’m genuinely surprised. Perhaps it would have helped to have actually followed instructions: carefully explain what you expected from each experiment and why, and what you expected from the control, and to have had multiple well-chosen controls.

You have a PhD right? You’ve done this before, right?

Can you state what is wrong?

Fine, I’m stupid, I don’t deserve the PhD, I’m sloppy, whatever. Just please state the error and this whole thing is over.

@EricMH I’m confused about this. I looked at the website you identify as your work. It is from an academic group that does not include anyone with your name as a current or past member. The page clarifies it has nothing to do with biological evolution. I cannot identify any peer-reviewed paper for the study.

Can you clarify? Where is the study? Was this your work? Are you using a pseudonym here?

This is not my proof. The proof that stochastic/algorithmic processes cannot generate algorithmic mutual information is due to Leonid Levin, a student of Kolmogorov’s. You can see it in section 1.2 in his paper “Randomness Conservation Inequalities; Information and Independence in Mathematical Theories.”

The only thing I’m doing is stating it applies to evolution, and throwing together some experiments for Dr. Swamidass to demonstrate it.

The website is not my work, it is the first link I found that defines “open ended evolution”.

Okay great. Please point us to your published work on this. It was part of your dissertation, right?

How is this relevant? What I am arguing here is not related to my dissertation. My only point in that paragraph is to state I’m not a complete novice when it comes to information theory.

Just by you stating it applies to evolution does not make it so. You have powerful mathematical equations, but applying a mathematical equation to a real system like a biological system is no small task. Start with the very basics: how much information is in a single gene of a single genome? Make some assumptions; use standard codes.

You don’t seem like an expert. You seem like you have an answer in mind, that a relationship in information theory applies to evolution, but you can’t get from A to B with it. How is the information theory math related to biological systems? Explain it in a few words or sentences for all of us here. We are certainly interested if you have truly made a breakthrough in applying information theory to evolution.

The open ended evolution work at the Uni of Wyoming has nothing to do with biological systems or genomes. It is about the evolution of AI and programming. Where is the connection with biological evolution or real genomes?

Of course it applies to evolution. Is evolution somehow existing in a math free universe?

And it is a trivial task. Regardless of the biological system’s complexity, it is finite and discrete, and that is all that is necessary to say algorithmic information theory applies to the system.

To argue otherwise entails that biological systems must be capable of super-Turing computation, equivalent to being a halting oracle. If you are right, then I’m not the only one making a breakthrough here :slight_smile:

How? I don’t see how the information content of a genome can be increased by mutations, which really are just accumulated transmission errors going from one generation to the next.

It depends how we define information. In information theory, which is what we are talking about, information has a precise definition. By that definition, that is an increase in information. If that is not true, then none of the equations we are discussing have any relevance here.
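A toy illustration of the definitional point (the alphabet and mutation rate here are my own assumptions): random mutations readily increase plain Shannon entropy H(X), the “half of the equation” distinguished earlier, even though raising H(X) need not raise mutual information with anything.

```python
import random
from collections import Counter
from math import log2

def seq_entropy(s: bytes) -> float:
    """Empirical per-symbol Shannon entropy of a sequence."""
    counts = Counter(s)
    n = len(s)
    return -sum(c / n * log2(c / n) for c in counts.values())

random.seed(0)
genome = b"A" * 1000            # maximally ordered toy "genome": H = 0
mutated = bytearray(genome)
# Substitute 100 randomly chosen positions (10%) with other symbols.
for i in random.sample(range(len(mutated)), 100):
    mutated[i] = random.choice(b"CGT")

print(seq_entropy(genome), seq_entropy(bytes(mutated)))
```

The mutated sequence has strictly higher entropy, which is exactly why entropy alone cannot be the measure at issue in this thread.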