Functional information is a measure of the functional complexity of a system. From the Hazen and Szostak abstract:
Complex emergent systems of many interacting components, including complex biological systems, have the potential to perform quantifiable functions. Accordingly, we define “functional information,” I(Ex), as a measure of system complexity. For a given system and function, x (e.g., a folded RNA sequence that binds to GTP), and degree of function, Ex (e.g., the RNA–GTP binding energy), I(Ex) = −log2[F(Ex)], where F(Ex) is the fraction of all possible configurations of the system that possess a degree of function ≥ Ex.
We never answered that. To know how much functional information there is, you’d need (1) a scale of “function” and a measurement of how much of that function the sequence has, and (2) similar measurements for all other sequences of that length, or at least a large sample of them. Then one could compute the fraction P of all sequences that have that much function or more, and the functional information would be -log2(P).
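That recipe is straightforward to express in code. Below is a minimal sketch, assuming we somehow had a function measurement for every sequence of a given length (or a large random sample of them); the sample data are made-up toy numbers, not real measurements.

```python
import math

def functional_information(activities, threshold):
    """FI = -log2(F), where F is the fraction of sequences whose
    measured degree of function is >= the threshold Ex."""
    n_functional = sum(1 for a in activities if a >= threshold)
    if n_functional == 0:
        raise ValueError("no sequence meets the threshold; FI is undefined")
    fraction = n_functional / len(activities)
    return -math.log2(fraction)

# Toy sample: 16 hypothetical sequences with measured binding scores.
sample = [0.1, 0.0, 0.3, 0.2, 0.9, 0.1, 0.0, 0.4,
          0.2, 0.1, 0.8, 0.0, 0.3, 0.1, 0.2, 0.0]

# 2 of 16 sequences score >= 0.8, so F = 1/8 and FI = 3 bits.
print(functional_information(sample, threshold=0.8))  # -> 3.0
```

The hard part in practice is of course step (2): getting function measurements across sequence space, which is why the published estimates rely on in vitro selection experiments rather than exhaustive assays.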
I sense that T_aquaticus meant that comment as an instant refutation of the very concept, but with that context, it is meaningful and you can compute FI.
Complex emergent systems of many interacting components, including complex biological systems, have the potential to perform quantifiable functions. Accordingly, we define “functional information,” I(Ex), as a measure of system complexity. For a given system and function, x (e.g., a folded RNA sequence that binds to GTP), and degree of function, Ex (e.g., the RNA–GTP binding energy), I(Ex) = −log2[F(Ex)], where F(Ex) is the fraction of all possible configurations of the system that possess a degree of function ≥ Ex. Functional information, which we illustrate with letter sequences, artificial life, and biopolymers, thus represents the probability that an arbitrary configuration of a system will achieve a specific function to a specified degree. In each case we observe evidence for several distinct solutions with different maximum degrees of function, features that lead to steps in plots of information versus degree of function.
The equation Joe cites above, -log2(p), where p is the fraction of functioning sequences in the system, converts that fraction into information in bits. In Szostak’s first paper on the subject he claimed the functional information for a 70-nucleotide RNA strand to bind ATP corresponded to a fraction of about 1 in 10^11, or roughly 37 bits.
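As a quick sanity check on that unit conversion, a fraction of 1 in 10^11 works out to about 36.5 bits, which rounds to the ~37 bits cited:

```python
import math

# Converting a fraction of functional sequences into bits of
# functional information: FI = -log2(fraction).
fraction = 1e-11
bits = -math.log2(fraction)
print(round(bits, 1))  # -> 36.5
```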
By measuring the amount of information before and after some change, and noting an increase.
But that requires a way to measure information. You never have and never will show us how you measure the amount of information in any organism, so it’s impossible to determine whether information is increasing, decreasing or constant.
I will assume you mean functional information when you say information.
When you lose a limb for walking/running and gain a limb for swimming, you have simply traded one function for another. As far as functional information goes, that would depend on the genes and regulatory elements required to build each function.
If, for instance, it took 100 more genes to build a fin for swimming than a pair of legs for walking and running, I think you could safely say you gained functional information in the transition. 100 genes would conservatively translate to around 50,000 bits of functional information, given Szostak’s (2003) figure of 140 bits for a 70-mer nucleotide element with a specific function, assuming each gene comprises several such functional elements.
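A hedged arithmetic check on that estimate (the numbers here are illustrative, not from the thread): 140 bits is the 2-bits-per-nucleotide ceiling for a single 70-mer element, so the 50,000-bit figure only follows if each gene contains more than one such element.

```python
# One 70-nt functional element at the 2-bits-per-nucleotide maximum:
bits_per_element = 2 * 70          # 140 bits
genes = 100

# If each gene were just one 70-mer element:
print(genes * bits_per_element)    # -> 14000 bits

# Reaching ~50,000 bits therefore implies several elements per gene
# (regulatory regions plus coding sequence), roughly:
print(round(50_000 / (genes * bits_per_element), 1))  # -> 3.6 elements/gene
```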
What I see missing from all this is the environment. It is surely a piece of information that you are now in an element dominated by buoyancy and hydrodynamics. ID seems to start with a notion that as found animals are designed to be platonically optimized for their environment and all they have to do is inhabit it. The environment imparts no information of its own and plays no role in shaping them. This ignores a pretty big piece of the picture.
You need to read the paper more carefully. Functional information is not just about a single function such as an enzyme; it includes functional systems.
From the paper:
Systems may be complex in terms of information content, physical structure, and/or behavior. Consider three stages in the life cycle of a multicellular organism: a fertilized egg, a live adult, and a postmortem adult. All three states are complex, but they are complex in different respects. All three states possess the sequence information (a genome) necessary to grow a living organism
Why? Why can’t they merely provide a little more support or enhancement of the function, why must they be critical to it? There’s nothing about the genes having to be “critical” to the specified function in Hazen and Szostak’s definition of FI, so why did you make that up?
If the duplicated gene does not affect the function, then it is not adding functional information. This is what I meant by being critical to the function. If you knock the duplicate out and function decreases, then it is critical to the function.