Is Mutual Information Functional Information?

Here is an application where string calculation would be interesting.


[quote] Proteins are actors in a complex system, with their functions to a large extent defined by their interactions with other proteins. It is the size, shape and chemical properties of the residues on the surface of the protein that dictate the capacity of a protein to interact with other proteins. The ability to predict the residues involved in these interactions would help to identify specific functionality, structural constraints and even disease-causing mutations.

In this paper we examine the capacity for mutual information (MI) methods to predict these contact residues between proteins by assessing their ability to predict contacts between two domains of a protein. To date, MI based methods have been extensively used to predict contacts within a protein (intra-protein)[/quote]

Equivocation alert. Mutual information is not functional information, nor is it a sign of intelligence. As one example, there are several processes (like common descent and neutral evolution) that produce mutual information without requiring intelligence.


In communication, mutual information is the information shared between a sender and a receiver. It says nothing about the content of the message; it is only a measure of how much information is shared.

In statistics, mutual information is analogous to covariance, which is used to calculate the Pearson correlation coefficient and describes the strength of a linear relationship between two variables.
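For concreteness, here is a minimal sketch of the standard plug-in estimate of mutual information from paired samples, I(X;Y) = Σ p(x,y) log2[p(x,y) / (p(x)p(y))]. The function name and toy data are just for illustration:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    n = len(xs)
    px = Counter(xs)           # marginal counts of X
    py = Counter(ys)           # marginal counts of Y
    pxy = Counter(zip(xs, ys)) # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * log2(p_joint / ((px[x] / n) * (py[y] / n)))
    return mi

# Perfectly dependent binary variables share 1 bit:
print(mutual_information([0, 1, 0, 1], [1, 0, 1, 0]))  # 1.0
# Independent variables share 0 bits:
print(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0
```

Note that the measure is symmetric and blind to content: it only quantifies statistical dependence between the two variables.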

We could do with a primer on different types of information for those not inclined to dive into the theory. I keep meaning to write something like that anyway …


Please do!

How do neutral mutations produce mutual information?


If we are talking about proteins, then mutual information would be shared amino acid residues at the same position. This is rather easy to produce through neutral changes in amino acid sequence followed by speciation. The two species would share the same residue that was produced by a neutral mutation. In the case of biology, mutual information is strongly influenced by history, at least according to my understanding of the papers I have read.
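A toy simulation can illustrate the point, assuming a uniform substitution model and ignoring selection entirely (the sequence length, substitution counts, and function names here are arbitrary): two lineages inherit the same ancestral sequence, drift independently, and still share most residues purely through common descent.

```python
import random

AAS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def diverge(seq, n_subs, rng):
    """Apply n_subs random neutral substitutions at random positions
    (positions may repeat, and a draw may re-pick the same letter)."""
    s = list(seq)
    for _ in range(n_subs):
        i = rng.randrange(len(s))
        s[i] = rng.choice(AAS)
    return "".join(s)

rng = random.Random(42)
ancestor = "".join(rng.choice(AAS) for _ in range(200))

# Speciation: two lineages inherit the ancestor and drift independently.
sp1 = diverge(ancestor, 50, rng)
sp2 = diverge(ancestor, 50, rng)

identical = sum(a == b for a, b in zip(sp1, sp2))
print(f"shared residues: {identical}/200")
```

Most positions remain identical between the two species simply because neither lineage has touched them yet, so the alignment shows substantial mutual information that was produced without any selection at all.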


For mutual information to be gained rather than lost, the probability of a gain must exceed that of a loss. How does neutral theory allow for a higher probability of gaining mutual information?


The fixation of some neutral mutations is nearly assured because new neutral mutations arise faster than any single one is fixed or lost, so there is a constant supply of them drifting toward fixation.


Mutations being fixed can also cause a loss of mutual information. Just as you described a mutation producing a similar amino acid at the same position in two proteins, there can also be mutations causing divergence at a previously shared position.

The problem is that the odds of the latter appear to be greater than the odds of the former, so genetic drift would cause a net loss of mutual information. I would guess the equilibrium would settle at around 5% of the amino acids sharing a common position.

Yes, neutral mutations can cause both. You asked how neutral mutations could increase mutual information, and I described how they could.

I think we would all agree that negative selection of deleterious mutations is much better at preserving mutual information. Positive selection of new beneficial mutations would also decrease mutual information, so I’m not sure if mutual information is a good measure of functional information.


I am not sure either how valuable mutual information is in this discussion. Eric seems to think so, based on his latest post at UD. It has a lot of traction on PubMed as a tool for analyzing disease. I need to think about this more. Thanks for the very straightforward exchange.


Mutual information is most likely a valuable tool in looking for the causes of disease. Mutual information can really be boiled down to sequence conservation, and the best candidates for disease-causing mutations are deleterious mutations in conserved sequence. However, I don't see how this crosses over into determining the evolvability of proteins or into calculating functional information.

For functional information you need to know all possible combinations of amino acids that will produce a specific function. I don't see how mutual information can provide that. Mutual information is heavily dependent on history. Once a protein with a specific function is found, that sequence is frozen in place and passed down through heredity. Evolution doesn't start looking elsewhere for the same function, but instead modifies the existing function and optimizes the solution it has. Evolution also changes other proteins that are already in the genome, so any new functions will be heavily contingent on the proteins that are already in the genome.
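To make that contrast concrete, here is a toy calculation of functional information in the sense of Hazen et al. (2007), FI(Ex) = −log2(M(Ex)/N), where M(Ex) is the number of sequences meeting an activity threshold Ex and N is the total number of possible sequences. It requires exhaustively enumerating the sequence space, which is exactly what mutual information does not give you. The tiny 4-letter alphabet and the `toy_activity` function are invented purely for illustration:

```python
from itertools import product
from math import log2

AAS = "ACDG"  # toy 4-letter alphabet so the full space is enumerable

def toy_activity(seq):
    """Hypothetical 'function': number of positions matching a target motif."""
    target = "ACGA"
    return sum(a == b for a, b in zip(seq, target))

threshold = 3  # Ex: require at least 3 of 4 positions to match
N = 0          # total sequences in the space
M = 0          # sequences with activity >= threshold
for letters in product(AAS, repeat=4):
    N += 1
    if toy_activity("".join(letters)) >= threshold:
        M += 1

print(f"FI = {-log2(M / N):.2f} bits  (M={M}, N={N})")  # FI = 4.30 bits  (M=13, N=256)
```

For a real protein the space is astronomically large (20^L sequences) and M is unknown, which is why mutual information, being a record of what history happened to conserve, cannot stand in for this calculation.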
