This looks very familiar; I think if I change just a few words, the resemblance will be clear …
So now we can see visually how the paradox is resolved. Entropy of one sort increases, while entropy of another sort decreases. Overall entropy increases, but local order can still increase.
This is how we describe energy entering an open system and increasing local order, at the cost of overall increased entropy (energy expended).
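To make the analogy concrete, this is just the standard second-law bookkeeping for an open system:

$$
\Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0
$$

so $\Delta S_{\text{system}} < 0$ (increased local order) is perfectly allowed, as long as the surroundings pick up at least that much entropy.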
It seems that EricMH has been making the Information Theory equivalent of the “The 2nd Law of Thermodynamics disproves evolution” argument. I suspected this was the case, but didn’t have enough pieces put together to say so. Well done, SJS!
@EricMH, I hope you are following this. Do you see that the entire area covered by the red circle is to be excluded? If not, I’ll try to make a clearer diagram for you. The key point I’m hoping you can see is this:
This ends up not being the correct way to compute FI.
I’m not seeing how this disproves what I’ve been saying. Your green crescent is the difference between two mutual information quantities. The law of information non-growth does not apply to a difference between two MI quantities. If Durston thinks it does, then he is mistaken. But it is unclear how someone making a mistake is relevant to my argument. Apologies for my denseness.
From the top post, you define mutual information as the overlap.
And in the key part where you define functional information:
If we look at the overlaps that define that crescent, it is the overlap between the cancer circles minus the overlap between all the circles. Since the overlaps are MIs, the crescent is the difference between two MIs, which is equivalent to your conditional notation.
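To write that out (with my own labels, say $X$ and $Y$ for the two cancer genomes and $Z$ for the germline, and the triple overlap written as the interaction information $I(X;Y;Z)$):

$$
I(X;Y \mid Z) = I(X;Y) - I(X;Y;Z)
$$

That is the crescent: the overlap between the cancer circles, $I(X;Y)$, minus the part of that overlap shared with the third circle, $I(X;Y;Z)$. It is a difference of two MI quantities, and it matches the conditional notation.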
Sure, as I said, I do not disagree with you on that term increasing. Totally mathematically plausible. And if you want to call that term FI, then you are correct by definition. And if I want to call one of the non-conditional MI terms FI, then I am correct by definition. The definitions don’t seem to be the interesting part of the debate here.
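For concreteness, the two candidate definitions (again with my hypothetical labels) are:

$$
\text{FI} \stackrel{?}{=} I(X;Y) \qquad \text{vs.} \qquad \text{FI} = I(X;Y \mid Z)
$$

The law of information non-growth constrains a plain MI like the first under random + deterministic processing; it says nothing directly about the conditional quantity, which is where the actual disagreement lies.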
So, best effort to read between the lines: I think what you are saying is that a high entropy source creates the germline. Then, the divergence from mutations results in two different things that now have a mutual information. Furthermore, we subtract out some commonality, and now we have an increase in conditional MI.
The upshot is that all we need is a large initial entropy source, and we can generate as much conditional MI as we want through purely random + deterministic processes.
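Here is a minimal toy sketch of that reading, assuming a binary-genome model with made-up parameters (the genome length `L`, mutation rate `mu`, and a small set of shared “driver” sites standing in for selection are all hypothetical). Everything downstream of the random germline `Z` is random copying plus a deterministic rule, yet the conditional MI comes out positive:

```python
import numpy as np

rng = np.random.default_rng(0)

L = 200_000   # number of sites in the toy genome (hypothetical)
mu = 0.05     # per-site mutation probability (hypothetical)

# High-entropy source: a random binary "germline" Z.
Z = rng.integers(0, 2, size=L)

# Shared deterministic rule standing in for selection:
# a small set of "driver" sites that flip in both lineages.
drivers = rng.random(L) < 0.01

# Two lineages diverge from Z by independent random mutations
# plus the shared deterministic driver flips.
X = Z ^ ((rng.random(L) < mu) | drivers)
Y = Z ^ ((rng.random(L) < mu) | drivers)

def H(*cols):
    """Empirical joint entropy (bits/site) of per-site symbols."""
    _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def mi(a, b):
    return H(a) + H(b) - H(a, b)

def cmi(a, b, c):
    # I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C)
    return H(a, c) + H(b, c) - H(a, b, c) - H(c)

print(f"I(X;Y)   = {mi(X, Y):.4f} bits/site")     # plain MI: large (shared germline)
print(f"I(X;Y|Z) = {cmi(X, Y, Z):.4f} bits/site")  # conditional MI: small but positive
```

The total conditional MI is the per-site figure times `L`, so a larger initial entropy source buys proportionally more conditional MI, which is the “as much as we want” part.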
However, what this says to me is not that R+D = MI, but that you want to say conditional MI = MI, which is false.
You also want to say we cannot ever know whether we are looking at CMI or MI. However, as the math shows, CMI requires MI, so that is also false.
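One way to make “CMI requires MI” precise is the chain rule for mutual information:

$$
I(X; Y, Z) = I(X; Z) + I(X; Y \mid Z)
$$

so $I(X;Y \mid Z) \le I(X; Y, Z)$, and any positive conditional MI forces positive plain mutual information between $X$ and the pair $(Y, Z)$.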
I stated that there are many types of MI. What you call CMI is not what you call MI, and I explicitly make that clear. They are different types of MI. We can show that FI is found in CMI, and we can compute how much FI is found in cancer. We find quite a bit, and it arises by natural processes.
Therefore, this is a direct demonstration that the FI argument against evolution fails, in several ways: (1) Durston computes it incorrectly, (2) FI is not the MI to which you refer, and it can increase by natural processes, and (3) we compute the FI of cancer to be very high.