Computing the Functional Information in Cancer

I’ll turn your nice pictures into some ugly symbols.

That little green crescent is MI(C1:C2)-MI(C1:C2:G). You are saying that possibly,
MI(F1,C1 : F2,C2) - MI(F1,C1 : F2,C2 : G) \geq MI(F1(C1) : F2(C2)) - MI(F1(C1) : F2(C2) : G)

This is true. I do not disagree here.

I don’t understand your notation, but it appears to be referring to the wrong segment.

Typo alert:

11 x 31.5 bits = 346.5 bits of functional information in colon cancer
6 x 31.5 bits = 189 bits of functional information in colon cancer

Looks like one of those ought to be lung cancer.
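For what it’s worth, the arithmetic behind the two quoted totals checks out; only the cancer labels are swapped. A quick sketch, taking the ~31.5 bits per required driver mutation and the 11 vs. 6 mutation counts from the figures quoted above (which total belongs to which cancer is exactly the typo being flagged):

```python
# Sketch of the quoted arithmetic: assuming ~31.5 bits of functional
# information per required driver mutation, as in the figures above.
BITS_PER_MUTATION = 31.5

fi_11_drivers = 11 * BITS_PER_MUTATION  # the 11-driver cancer
fi_6_drivers = 6 * BITS_PER_MUTATION    # the 6-driver cancer

print(fi_11_drivers)  # 346.5
print(fi_6_drivers)   # 189.0
```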

This looks very familiar, I think if I change just a few words the resemblance will be clear …

So now we can visually see how the paradox is resolved: entropy of one sort increases, while entropy of another sort decreases. Overall entropy increases, but local order can still increase.

Which is how we describe energy entering an open system and increasing local order, at the cost of overall increased entropy (energy expended).

It seems that EricMH has been making the information-theory equivalent of the “The 2nd Law of Thermodynamics disproves evolution” argument. I suspected this was the case, but didn’t have enough pieces put together to say so. Well done, SJS!


@EricMH, I hope you are following this. Do you see that the entire area covered by the red circle is to be excluded? If not, I’ll try to make a clearer diagram for you. The key point I’m hoping you can see is this:

This ends up not being the correct way to compute FI.

Do you see why?

I’m not seeing how this disproves what I’ve been saying. Your green crescent is the difference between two mutual information quantities. The law of information non-growth does not apply to a difference between two MI quantities. If Durston thinks it does, then he is mistaken. But, it is unclear how someone making a mistake is relevant to my argument. Apologies for my denseness.

No, that is not true. You appear to have misread it. I’ll have to think through how to make that clear in the figure.

Spell it out symbolically as I have done. That is unambiguous.

We already did, with set notation.

Ok, then perhaps you can clarify how that definition is different from mine. As far as I can tell, they are the same thing. Both definitions subtract the germline from the cancers’ overlap.

In what way is it the difference between two mutual information quantities? This does not appear to be the case.

From the top post, you define mutual information as the overlap.

And in the key part where you define functional information:

If we look at the overlaps that define that crescent, it is the overlap between the cancer circles minus the overlap between all the circles. Since the overlaps are MIs, the crescent is the difference between two MIs, which is equivalent to your conditional notation.

I(C1;C2|G) = I(C1;C2) - I(C1;C2;G)
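This identity is a linear relation among joint entropies, so it can be checked numerically on any toy distribution. A minimal sketch of my own (not from the thread), treating C1, C2, G as three binary variables with an arbitrary random joint distribution, computing each quantity independently from entropies:

```python
import itertools
import math
import random

def entropy(joint, idx):
    """Shannon entropy (bits) of the marginal over the given variable indices."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marg.values() if p > 0)

# Arbitrary joint distribution over three binary variables (C1, C2, G)
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
joint = {bits: w / total
         for bits, w in zip(itertools.product((0, 1), repeat=3), weights)}

H = lambda *idx: entropy(joint, idx)

I_C1C2 = H(0) + H(1) - H(0, 1)                    # I(C1;C2)
I_C1C2_given_G = H(0, 2) + H(1, 2) - H(2) - H(0, 1, 2)  # I(C1;C2|G)
# Co-information I(C1;C2;G) by inclusion-exclusion over entropies
I_C1C2G = (H(0) + H(1) + H(2)
           - H(0, 1) - H(0, 2) - H(1, 2) + H(0, 1, 2))

# The identity holds for every distribution, not just this one
assert abs(I_C1C2_given_G - (I_C1C2 - I_C1C2G)) < 1e-12
```

The assert passes for any joint distribution, because both sides reduce to the same signed sum of joint entropies.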


Great. We are on the same page.

So it appears that we agree that this term always decreases:


However, you agree that this term can increase:

I(C1;C2 | G)

Notice that this is where all the functional information is, because we have selected genomes based on a new function. I also computed how much FI is necessary. It is a lot of FI.

This appears to demonstrate from evidence that FI can increase without input of intelligence, unless you intend to argue that cancer is gaining this FI through intelligent input from God.

Sure, as I said, I do not disagree with you on that term increasing. It is totally mathematically plausible. And if you want to call that term FI, then you are correct by definition. And if I want to call one of the non-conditional MI terms FI, then I am correct by definition. The definitions don’t seem to be the interesting part of the debate here.

I’m not calling it FI by definition. I’m calling it FI by demonstration. See above.

If that is the case, then FI can increase by natural processes.

So, best effort to read between the lines: I think what you are saying is that a high entropy source creates the germline. Then, the divergence from mutations results in two different things that now have a mutual information. Furthermore, we subtract out some commonality, and now we have an increase in conditional MI.

The upshot is all we need is an initial large entropy source and we can generate as much conditional MI as we want through purely random + deterministic processes.
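That mechanism can be simulated directly. A toy sketch of my own (not part of the thread), per genome site: the germline G is a fair coin, and both lineages are driven to a shared, randomly chosen selective target T that is independent of G. Estimating the information quantities from samples shows conditional MI appearing between the lineages even where neither shares any MI with the germline:

```python
import math
import random
from collections import Counter

def entropy(counts, n):
    """Shannon entropy (bits) from empirical counts over n samples."""
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(42)
N = 200_000
samples = []
for _ in range(N):
    g = random.getrandbits(1)   # high-entropy germline site
    t = random.getrandbits(1)   # random shared selective target, independent of g
    c1 = c2 = t                 # both lineages converge on the same target
    samples.append((c1, c2, g))

def H(*idx):
    cnt = Counter(tuple(s[i] for i in idx) for s in samples)
    return entropy(cnt, N)

# I(C1;C2|G): conditional MI between the lineages given the germline
I_C1C2_given_G = H(0, 2) + H(1, 2) - H(2) - H(0, 1, 2)  # ~1 bit
# I(C1;G): ordinary MI between a lineage and the germline
I_C1G = H(0) + H(2) - H(0, 2)                            # ~0 bits

print(round(I_C1C2_given_G, 3), round(I_C1G, 3))
```

Here a random source (the target T) plus a deterministic step (both lineages fixing on it) produces roughly one bit of I(C1;C2|G) per site with essentially zero I(C1;G), which is the pattern described above.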

However, what this says to me is not that R+D = MI, but that you want to say conditional MI = MI, which is false.

You also want to say we cannot ever know whether we are looking at CMI or MI. However, as the math shows, CMI requires MI, so still false.


I stated that there are many types of MI. What you call CMI is not what you call MI, and I explicitly make that clear. They are different types of MI. We can show that FI is found in CMI, and we can compute how much FI is found in cancer. We find quite a bit, and it arises by natural processes.

Therefore, this is a direct demonstration that the FI argument against evolution fails, in several ways: (1) Durston computes it incorrectly, (2) FI is not the MI to which you refer, and can increase by natural processes, and (3) we compute the FI in cancer to be very high.

CMI requires MI.

Apparently not. Look at cancer. MI reduces. CMI increases. FI increases. Whatever you are arguing about MI, FI easily increases.

Hence, initial MI is needed for CMI to exist.