Durston: Functional Information

That is all correct.

This is a very idiosyncratic use of the terms information gain and loss, and I think it is the source of your confusion. However, as I agreed with you before, Delta H can be negative or positive.
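To make the sign point concrete, here is a minimal sketch. The two distributions are made up purely for illustration; the only point is that the entropy difference Delta H = H(after) - H(before) can come out with either sign depending on which direction you compute it:

```python
import math

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical before/after distributions, for illustration only.
p_before = [0.25, 0.25, 0.25, 0.25]   # H = 2.0 bits
p_after  = [0.7, 0.1, 0.1, 0.1]       # H ≈ 1.36 bits

print(entropy(p_after) - entropy(p_before))   # negative Delta H
print(entropy(p_before) - entropy(p_after))   # positive Delta H
```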

The wikipedia page you reference does NOT define KL as the difference in entropy between two probability distributions. Why do you keep saying that when you can just look at the page and see that it is defined differently? It is defined just as I explained before.

You can learn more about this by reading the page on KL divergence, which uses the exact same definition as the wikipedia page you reference: Kullback–Leibler divergence - Wikipedia.
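For reference, a minimal sketch of that definition, D(P || Q) = Σ p_i log2(p_i / q_i), in bits. The example distributions are made up for illustration:

```python
import math

def kl_divergence(p, q):
    """D(P || Q) = sum_i p_i * log2(p_i / q_i), in bits.
    Terms with p_i == 0 contribute 0 by the usual convention."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]

print(kl_divergence(p, q))   # → 0.25
```

Note the asymmetry: D(P || Q) is in general not equal to D(Q || P), which is one more way it differs from a simple difference of entropies.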

That calculation is in error.

KL is always non-negative, and strictly positive whenever the two distributions differ. Information was required to move from a non-cancerous state to a cancerous state. You are making a very profound math error here.

The right way to think about this is different. The normal P53 state is well defined. We need to know how to change it into a carcinogenic state, and a certain number of bits is required to specify that change. That information is all functional information, and it will always be positive because KL is always non-negative. If you used the formula I just gave you, it would come up with a different number, a positive one.
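The non-negativity claim (Gibbs' inequality) is easy to check numerically. The sketch below draws many random distribution pairs, for illustration only, and confirms the KL divergence never comes out negative, even though an entropy difference between the same pairs would swing both ways:

```python
import math
import random

def kl(p, q):
    """D(P || Q) in bits; zero-probability terms of P contribute 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def random_dist(n, rng):
    """A random probability distribution over n outcomes."""
    w = [rng.random() + 1e-12 for _ in range(n)]
    s = sum(w)
    return [x / s for x in w]

rng = random.Random(42)
smallest = min(kl(random_dist(5, rng), random_dist(5, rng))
               for _ in range(10_000))
print(smallest >= 0)   # Gibbs' inequality: KL is never negative
```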

Computing the precise amount of information in this case is a separate question. It turns out the mathematical framework you have cannot compute KL when the ground state is not IID MaxEnt (i.e., the maximum-entropy, uniform distribution). In this case the ground state is normal P53, which is most certainly not IID MaxEnt, and that is why you are getting an aberrant result. You thought KL = Delta H, but it does not hold. It never has, except in the boundary case you stumbled into, which has essentially no relevance to biology.