Contradictory Points in ID and Information Theory Arguments?

On exactly this issue, I have a question about Eric’s use of Levin’s result on Kolmogorov mutual information. I believe he continues to see this result as a key part of his argument.

Early in the discussion, Eric said that the two things of interest were the DNA code and the environment. It seems to me that these are indeed the two entities of interest if we want to talk about evolution and not just mutation of genomes.

I recognize that mutual information with the environment is difficult to define. That difficulty can lead to introducing proxies such as functional information in the genome, and then arguing that this proxy is still mutual information to which Levin’s result applies. But that approach has been hashed out from many viewpoints in the discussions, and it is not what I want to ask about.

Instead, my question is this. As I understand it, Levin’s result says that if I(X, Y) is the Kolmogorov mutual information between X and Y, and E(X) is a transformation of X built only from “algorithmic processing and randomness,” then I(E(X), Y) ≤ I(X, Y).
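As a toy illustration of the flavor of this inequality, here is a sketch using empirical Shannon mutual information as a computable stand-in for Kolmogorov mutual information (Levin’s actual result is about algorithmic information, so this is only an analogy). The `mutual_information` helper and the specific noise levels are my own choices for the example:

```python
# Toy data-processing illustration with Shannon mutual information as a
# stand-in for Kolmogorov mutual information: a transformation E that sees
# only X cannot increase the information X carries about Y.
from collections import Counter
from math import log2
import random

def mutual_information(pairs):
    """Empirical Shannon mutual information I(A; B) in bits from (a, b) pairs."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum(c / n * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

random.seed(0)
# Correlated X and Y: Y copies X with probability 0.9.
xs = [random.randint(0, 1) for _ in range(100000)]
ys = [x if random.random() < 0.9 else 1 - x for x in xs]

# E(X): "algorithmic processing and randomness" applied to X alone
# (here, a noisy copy of X that never looks at Y).
exs = [x if random.random() < 0.8 else 1 - x for x in xs]

print(mutual_information(list(zip(xs, ys))))   # I(X, Y)
print(mutual_information(list(zip(exs, ys))))  # I(E(X), Y): no larger
```

The second number comes out smaller than the first: extra processing and noise on the X side alone only degrades what X says about Y.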

Now Eric has said that X is the genetic code and Y is the environment. E(X) is then taken to refer to the process of evolution.

My question is why E is restricted to be a function of X alone. It seems that any evolutionary process must involve the environment Y as well, since selection acts through the environment. So we should be considering functions E(X, Y). Does Levin’s result still apply if we do so?
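To make concrete why this restriction matters, here is a sketch (again using Shannon mutual information as a computable analogue, with my own toy choices for X, Y, and E) showing that once the transformation is allowed to read Y, the data-processing bound can fail badly: even when X starts with essentially no information about Y, an E(X, Y) that copies Y attains full mutual information with it.

```python
# Toy illustration: a transformation of X alone cannot beat I(X, Y),
# but a transformation E(X, Y) that also reads Y easily can.
from collections import Counter
from math import log2
import random

def mutual_information(pairs):
    """Empirical Shannon mutual information I(A; B) in bits from (a, b) pairs."""
    n = len(pairs)
    pab = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum(c / n * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
               for (a, b), c in pab.items())

random.seed(0)
# X and Y generated independently, so I(X, Y) is essentially 0.
xs = [random.randint(0, 1) for _ in range(100000)]
ys = [random.randint(0, 1) for _ in range(100000)]

# E(X): any function of X alone stays near 0 bits about Y.
exs = [1 - x for x in xs]
# E(X, Y): a function that reads Y too; here it simply copies Y.
exys = list(ys)

print(mutual_information(list(zip(xs, ys))))    # near 0
print(mutual_information(list(zip(exs, ys))))   # still near 0
print(mutual_information(list(zip(exys, ys))))  # near 1 bit: bound broken
```

Of course, copying Y outright is a cartoon of selection, but it shows the conservation bound is a statement about maps that never see Y, and the question is whether evolution is such a map.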