Define "information"? Creationists aren't even willing to define it

Lenski’s LTEE shows that the beneficial mutations observed are overwhelmingly degradative ones. They increase fitness in a particular environment mostly by breaking genes; IOW, by destroying information. As for your concept of environmental information, it seems a bit far-fetched to me.

@Paul King: as far as I can tell from his pre-2005 and post-2005 writings, Dembski defines “complex” specified information as specified information whose probability P is less than 10^(-150). Since the specified information is -log2(P), this corresponds to more than about 498 bits.
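As a quick sanity check on that arithmetic (a minimal sketch of the -log2(P) conversion, not anything from Dembski’s own calculations):

```python
import math

# Dembski's CSI threshold: an event with probability P below 10^-150.
# Specified information is defined as -log2(P), so the threshold in bits is:
p_threshold = 1e-150
bits = -math.log2(p_threshold)
print(round(bits, 1))  # 498.3
```

which is where the “about 498 bits” (often rounded up to 500 bits) figure comes from.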

This equating of “improbable” with “complex” is what Leslie Orgel did. I wish he hadn’t called it that: it has nothing to do with how complicated an organism is. (Of course, Dembski these days computes P differently from Orgel’s original method, or even, for that matter, from Dembski’s own original method.)

Agree. I meant that I should be more careful in my usage. For Szostak’s FI, a change in information has an entirely different interpretation than it does for Shannon information.

Does Sanford ever say he is using Szostak’s FI? I don’t think so. :smiley:
It’s up to Sanford to define what he means, and he hasn’t; that is the criticism.

It’s not clear how FI could be applied to Sanford’s GE. I think you would need some reference genome for comparison. Then you might be able to say something about the extant genome versus the reference.
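To make the “reference genome” point concrete, here is a hypothetical sketch: any claim of information gained or lost needs a comparison point. The sequences and the per-site identity measure below are invented purely for illustration; they are not Sanford’s or Szostak’s.

```python
# Hypothetical sketch: comparing an extant sequence against a chosen
# reference. Without a reference, "loss of information" has no meaning.

def per_site_identity(reference: str, extant: str) -> float:
    """Fraction of aligned sites where the extant sequence matches the reference."""
    if len(reference) != len(extant):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(r == e for r, e in zip(reference, extant))
    return matches / len(reference)

reference = "ATGGCGTACCTA"
extant    = "ATGGCGTTCCTA"  # one substitution relative to the reference
print(round(per_site_identity(reference, extant), 3))  # 0.917
```

Swap in a different reference and the same extant genome scores differently, which is exactly the problem with leaving the reference unspecified.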

I hadn’t realised that Orgel defined complexity that way. It might be defensible with the right probability computation, but not with Dembski’s computations, which don’t seem to be practical anyway.

Environmental selective pressures are always changing. Herd immunity, salinity, invasive species, antibiotics, pesticides, agricultural practices: all of these dynamics can be parameterized. If that is not information, what is it?

Let’s take something simple, the size of animals. Historically, there have been small horses and big horses, small camels and big camels, small lemurs and big lemurs, and on and on throughout the animal kingdom. Creationists generally claim that the basal form was created with all the information which would be expressed in their descendants. That is how it would have to be, if information cannot arise by mutation. To me, specifying big and small in one breath seems incoherent. It makes more sense that there is variation, which by definition is not specified, which is subject to environmental selection, along with some drift and happenstance.

Unless the function is fitness, then FI is gained.

Of course the experiment doesn’t really show that in general beneficial mutations are mostly “degradative ones”, because you can’t just generalize to all environments. What the experiment shows is that when you take an organism from its much more complex natural environment and put it into an extremely simple synthetic environment, many of the adaptations that are beneficial (increase fitness) in the natural environment are no longer necessary, and hence stop being maintained by natural selection. And since bacterial population sizes are vast, and the selection scheme employed in the LTEE centers primarily on reproductive rate (rather than prolonged survival of environmental stressors, predation, etc.), on which genome size is among the strongest negative influences, deletion of unnecessary genetic material becomes selectively advantageous in that situation.

But, you can’t just extrapolate that to evolution in general. I have explained this to you and other ID-proponents before on numerous occasions, yet none of you seem to have much capacity to take in this message. Can I get some sort of acknowledgement that you at least understand what I am saying?


Ok, but that doesn’t mean this game has any correlation with biological reality.

In this scenario, information is back to its previous value. The biological system (thanks @AllenWitmerMiller) gained no information in a historical sense. It’s as if one printing of a book has a typo and it was fixed in the next printing.

It’s interesting that many people attempted to answer my question, yet no one actually gave an example of an observed mutation.

Let’s suppose that Lenski’s experiment had shown evolution occurs without (to use your terms) “breaking genes” and by creating “information.”

What would you expect to have been observed? Please be as specific as possible.

It is rather disappointing, given the effort I made to write out a layman’s level explanation for you, that you lack the imagination to take it just one small step further. Imagine a new mutation that steps function up to a level that did not exist before.

For example, imagine the first printing of the book read …

“Let’s eat Grandma!”

and the later printing corrected the typo to properly read …

“Let’s eat, Grandma!”

One small change, two very different functions.


Information CHANGE is always measured with respect to some point of reference. THIS is exactly why it is important to define what is meant by information, rather than prevaricating about what is meant by “loss of information”.

  1. Sanford seems to intend that a loss in fitness is a loss of information.
  2. Lenski showed a gain in fitness relative to digesting citrate. That should be a gain in information.
  3. The bacteria that gained fitness WRT citrate lost other information. That lost information can’t be Functional Information WRT citrate. IT MIGHT be information for something else.
  4. EXCEPT that information isn’t really lost: it still exists in a separate population, on another petri dish in Lenski’s freezer. Relative to Lenski’s freezer, the total information has increased because of the new citrate function.

Further, fitness is always relative to the environment. We could say the bacteria lost fitness WRT the original substrate (?) the moment they were introduced to a citrate environment, BEFORE any mutations occurred.

So you tell me: in your opinion, does Sanford intend Functional Information as defined by Szostak? What is the reference point by which change in information should be measured? Let’s put an end to the prevarication.
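For what it’s worth, Szostak’s functional information has a concrete form: I(Ex) = -log2 of the fraction of all sequences whose activity meets threshold Ex. Here is a toy sketch with an invented “function” (counting A’s in a short 4-letter string), chosen only because the fraction can be enumerated exhaustively; it is an illustration of the formula, not a biological model.

```python
import math
from itertools import product

def functional_information(threshold: int, length: int = 4) -> float:
    """I(Ex) = -log2(fraction of sequences whose function meets the threshold)."""
    alphabet = "ACGT"
    total = 0
    functional = 0
    for seq in product(alphabet, repeat=length):
        total += 1
        if seq.count("A") >= threshold:  # invented stand-in for "function"
            functional += 1
    return -math.log2(functional / total)

print(round(functional_information(1), 3))  # weak requirement: ~0.549 bits
print(round(functional_information(4), 3))  # only AAAA qualifies: 8.0 bits
```

Note how the number of bits depends entirely on which function and which threshold you pick, which is why the reference point matters so much.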


Many people succeeded with their attempts; you just gave silly excuses for why they didn’t count, such as saying a reversal mutation is in a “historical sense” not a gain of information. That sounds to me like a gain of information: if the information was lost at one point and later regained, then it decreased and then increased again. Obviously.

If you demand an example of a mutation that increases information without explicitly defining information in a quantifiable way, then you can just keep coming up with these excuses after every example we offer.

So, how about the evolutionary origin of T-URF13? The gain of the VPU1 gene in HIV? How about this insertion mutation, which creates a novel fusion protein that retains the function of the original protein without the insertion, while the insertion also gives it a new beneficial phenotypic effect?

So here’s an example of a mutation, specifically an insertion of a piece of DNA encoding 23 amino acids, originating from the phage genome, into the reading frame of a bacterial protein-coding gene. The result is a novel fusion protein that maintains the original function of the pre-mutation bacterial gene while gaining the biochemical function of being transported to the outer membrane (one biophysical gain of function) and producing membrane vesicle formation (another gain of function), which confers increased temperature tolerance (a beneficial phenotypic effect).

One insertion mutation to a gene, making it larger, giving it two new biophysical functions, and conferring a novel beneficial phenotype on the bacterium. Is that good enough?


Antifreeze genes have been discussed in many threads in this forum. For codfish…

De Novo Gene Evolution of Antifreeze Glycoproteins in Codfishes Revealed by Whole Genome Sequence Data

Gene gain:

Most likely, codfish afgps arose from entirely non-coding DNA making them type I de novo genes

and gene loss:

Moreover, afgp has been subsequently lost in one lineage of codfishes, analogous to the loss of afgp in non-Antarctic notothenioids.

See also: Molecular mechanism and history of non-sense to sense evolution of antifreeze glycoprotein gene in northern gadids

Evolution of an antifreeze protein by neofunctionalization under escape from adaptive conflict

The development of blood antifreeze opened a new niche for these fish, and that in turn altered the landscape of selection on other features. Sean Carroll is featured in this YouTube:

The Icefish Has Clear Blood

Note that YEC Shaun Doyle at CMI, while arguing against the evolutionary significance of antifreeze genes, does not dispute that they are indeed an observed mutation.

Strong selective pressure in the Antarctic waters would make tandem duplications of AFPIII a likely response to the conditions in order to increase the amount of AFP manufactured, and would kill off any eelpout that didn’t have the gene. Therefore, in short, the answer is yes, random mutation and natural selection is a likely mechanism for how this AFP and many others were produced.

Antifreeze protein evolution

Understood. Given the broader context of ambiguity about what constitutes an increase of information, I thought it might be fruitful to establish a concrete case where we agreed there was an increase of information and then work out from there. Clearly I was mistaken.

The biological example I think is most straightforward is antibody generation. Every day, people get vaccines and a few weeks later, after a process of mutation and selection, their B cells can make proteins they couldn’t make before which have specific functions. In other words, the immune system stores more information about the environment and the antigens that exist in it.


But that wouldn’t be the sort of “information” the creationists are asking about, I suspect, since it does not involve any long-term changes to the genome of the organism itself.

The more general response, of course, is that whatever example of information you give, the creationists will immediately change the definition so that it no longer fits. This is why they refuse to commit to a fixed definition. They know the game they’re playing.


Exactly; there’s simply no sane way to conclude that the immune system of an immunized (by vaccine or infection) person has not gained information, extremely useful information. Also, that new information (the sequences of immunoglobulin and T-cell receptor genes) is different in different people, even in identical twins.

If God designed our bodies in an active process and doesn’t want us to accept evolution, why did He design our immune systems to use evolution in real time? Is God some sort of prankster?


Let’s take Sanford’s paper on the genetic entropy of the flu virus. In this example, the reference point might be the flu genome at the end of WW1.

OK, now how do you want to measure Information?


Why? He grossly misrepresents the data and does not appear to know the difference between strains and subtypes, a very important thing when one is looking at the genetics of a virus with a segmented genome.


The immediate problem is that at any time, including the end of WW1, there are thousands of variants of influenza, mutating and recombining. Upon what basis is any given one a reference? If that basis is human susceptibility, how is the information content of the ancestral reservoir variants leading up to that accounted for?

A salient shortcoming that ID and YEC have in common is the framing of biological information as intrinsic and idealized. How is it far-fetched to recognize that the functionality of information for an organism relates to ascertaining the environment and functioning in it? The value of biological information is contextual. Whether a host population is immunologically naive or possesses herd immunity defines the value of particular variations of antigenic sites. There is no intrinsic way to characterize that. The intracellular biases of a new host species define the value of synonymous mutations in the virus. There is no intrinsic way to characterize that. Affinity also cannot be intrinsically defined.


I think that’s OK for my purpose. As a point of reference it must include the H1N1 genome at that time, which is the sort of specifics I’m looking for. I still don’t know what definition of Information Gil thinks is correct (or thinks Sanford thinks is correct).