Intelligent design and "design detection"

You are excluding babies, I presume?

1 Like

He doesn’t need any evidence:

That which can be asserted without evidence can be dismissed without evidence.

Therefore he needs no evidence of his own to reject your evidence-free claim that:

https://www.pnas.org/doi/10.1073/pnas.0701744104

Is this paper relevant to calculating functional information?

How about you “intelligently design” your way to knowing that password first?

I would suspect that the answer here is “yes” – but I do not see how this relates to my earlier post.

This thread has been going on for 165 posts. Somehow you have managed to avoid absorbing a single piece of information that has been spoon-fed to you by experts in this field.

That’s exactly what I’m talking about.

2 Likes

Let us also point out that nobody has managed to measure FI, so there’s no way to know if 500 bits of FI even exists. A BLAST search does not, by any stretch of the imagination, measure FI. This is a claim based on fantasy.

3 Likes

And this is far from the first time someone has explained this to @Giltil. Yet, here he is, going on as if he has never heard it before.

2 Likes

From what I can tell, all you are pointing to is sequence divergence over time. We see this happening in all populations, even human populations. It is happening naturally. I don’t see what would stop this process over longer time periods.

1 Like

The challenge is idiotic, frankly. You can always just refuse to accept any inference to the best explanation because we can’t empirically demonstrate the ontological basis for historical mutations. Since you have as an axiom that “purely naturalistic processes” can’t produce 500 bits of FI, you will just deny a historical inference like a phylogeny, say, as not being good enough evidence.

Technically no one can really show that any event that happens is “purely naturalistic.” How would one go about doing that? What could one do to show that fairies, demons, ghosts, mages, warlocks, shamans, or divine beings are not invisibly and secretly pushing molecules around in a test tube? Or did so in the past? Now add that many events occurred in ancient history and we can’t, like, travel back in time and re-run history anew while putting it under a microscope.

The only remaining thing that needs to be said is that you have no basis for this axiom in the first place. It doesn’t make physical or biochemical sense. There is nothing about how proteins or other biological polymers work that in any way indicates it should be impossible for molecules exhibiting 500 bits of FI or more to evolve. All you really have is denial, rooted in an axiomatic rejection of evolution that is not grounded in any empirical evidence.

2 Likes

Indeed. Such challenges assume all 500 bits are randomly assigned in a single trial: no multiple trials, no selection, no populations larger than 1. If we were doing this with coins (for bits), flipping one at a time (randomizing) until it came up “heads” (for functional), setting that coin down (selecting) and flipping the next, then the statistical expectation is that we will flip 1,000 times (trials), not 2^{500} times.
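A minimal sketch of that coin version in Python, assuming exactly the rules described above (500 fair coins, flip one until it comes up heads, lock it in, move to the next); the numbers are just the analogy’s, not anyone’s model of evolution:

```python
import random

def flips_to_fix_all(n_bits=500, p_heads=0.5):
    """Flip one coin at a time until it lands heads, lock it in,
    then move to the next coin. Return the total number of flips."""
    total_flips = 0
    for _ in range(n_bits):
        while True:
            total_flips += 1
            if random.random() < p_heads:  # this coin is now "functional"
                break                      # lock it in (selection) and move on
    return total_flips

# Expected total is n_bits / p_heads = 1,000 flips, nowhere near 2**500 trials.
runs = [flips_to_fix_all() for _ in range(100)]
print(sum(runs) / len(runs))  # typically close to 1000
```

The average lands near 1,000 because each coin takes 2 flips on average, which is the whole point: selection plus retention turns an astronomically improbable single trial into a short cumulative search.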

2 Likes

The processes of biological evolution, including natural selection, are sufficient to produce 1 bit of FI, rather easily, by a sequence of higher fitness replacing one of lower fitness. And this can go on repeatedly, and it does. There is simply no argument that higher levels of FI cannot in principle accumulate. Giltil’s argument is nonexistent; there is no reason to assert that essentially all cases where 500 bits of FI exist are cases where the FI must have come into existence in one jump.

Note that to use the 500-bit criterion as a reliable indication of Design Intervention, it has to be established that reaching 500 bits of FI by natural processes is always impossible. Not just difficult, and not just impossible in some cases. Giltil has no argument that establishes that. None.

4 Likes

We could certainly run a simulation where a single sequence is allowed to diverge in two lineages through accumulated random changes. I suspect that it wouldn’t take long before random changes produced the 500 bits that Gpuccio et al. are looking for. For that matter, the divergence of introns in orthologous genes is much greater than in the exons. Would this mean introns have more FI than exons?
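A toy version of that simulation (the 1,000-base sequence, 2 substitutions per lineage per generation, and 500 generations are arbitrary illustration choices, not anything from Gpuccio’s procedure):

```python
import random

BASES = "ACGT"

def mutate(seq, n_subs):
    """Apply n_subs random substitutions, each changing the base at a random position."""
    seq = list(seq)
    for _ in range(n_subs):
        pos = random.randrange(len(seq))
        seq[pos] = random.choice(BASES.replace(seq[pos], ""))
    return "".join(seq)

def identity(a, b):
    """Fraction of positions where the two sequences still match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# One ancestral sequence splits into two lineages that drift independently.
ancestor = "".join(random.choice(BASES) for _ in range(1000))
lineage1 = lineage2 = ancestor
for generation in range(1, 501):
    lineage1 = mutate(lineage1, 2)   # arbitrary per-generation substitution count
    lineage2 = mutate(lineage2, 2)
    if generation % 100 == 0:
        print(generation, round(identity(lineage1, lineage2), 3))
```

All it shows is that identity between the two lineages decays through purely random change, i.e. divergence accumulates with no design and no new function required.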

3 Likes

This is a pretty dramatic burden shift. How do you justify this? How many bits of functional information gain have been empirically demonstrated? If this is orders of magnitude less than 500 bits, why would this level not be a provisionally acceptable limit for a design inference?

Functional information means information that has functional coherence. Exons contain genetic code; most introns do not.

Obviously because it isn’t a burden shift at all. Accumulating 500 bits of functional information over 100 million years would require a rate of 1 bit per 200,000 years. Do you really think that is unreasonable? How would you justify such a claim?

6 Likes

But the measure of FI that’s being alleged for those 500 bits is divergence, period. If that’s a bogus measure of FI, as you allege, then the 500 bits needs no explanation. It’s just divergence, not FI.

1 Like

Nowhere in Gpuccio’s equations is functional coherence measured. That’s something you invented out of whole cloth.

It’s also interesting to see ID/creationists admit that the bulk of intron sequence is junk DNA.

Then the definition of FI seems to have a problem because it would indicate that introns contain way more FI than exons do.

4 Likes

If the function is fitness, and 1 in 1000 mutations is beneficial, then it takes 51 consecutive beneficial mutations to reach >500 bits of FI:

0.001^{51} = 10^{-153}

-log2(0.001^{51}) ≈ 508.3 bits

If only 1 in 1 million mutations is beneficial, it takes just 26 consecutive beneficial mutations to get >500 bits. That means it has already happened in the LTEE.
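The two calculations above, transcribed directly into Python (nothing added beyond taking -log2 of the joint probability of the beneficial steps):

```python
import math

def bits_from_beneficial_steps(p_beneficial, n_steps):
    """FI-style bit count if each of n_steps fixed changes had probability
    p_beneficial of being beneficial: -log2(p_beneficial ** n_steps)."""
    return -n_steps * math.log2(p_beneficial)

print(bits_from_beneficial_steps(1e-3, 51))  # ~508.3 bits
print(bits_from_beneficial_steps(1e-6, 26))  # ~518.2 bits
```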

3 Likes

Indeed. “Functional coherence” is nowhere mentioned in Hazen, Griffin, Carothers, and Szostak’s definition of FI, nor in Szostak’s earlier definition. So, bogus.
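For reference, the definition in the PNAS paper linked upthread reduces to one expression, FI = -log2(fraction of sequences meeting the functional threshold). A sketch, with a made-up one-in-a-million fraction purely as an illustration:

```python
import math

def functional_information(n_functional, n_total):
    """FI per Hazen et al. (2007): -log2 of the fraction of sequences that
    meet or exceed the functional threshold."""
    if n_functional == 0:
        raise ValueError("FI is undefined when no sequence meets the threshold")
    return -math.log2(n_functional / n_total)

# Made-up numbers: if 1 in 10^6 random sequences works, FI is about 20 bits.
print(round(functional_information(1, 10**6), 1))  # ~19.9 bits
```

Note that using it requires actually knowing what fraction of sequences performs the function, which is exactly the quantity a divergence or BLAST comparison between lineages does not supply.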

5 Likes

No it isn’t. Gpuccio & Giltil make the 500-bit claim, so the “burden” was always on them to support it.

Falsely claiming a “burden shift” is one of @colewd’s handful of patented moves to try and distract from a “not properly filled out or developed” (i.e. “vacuous”) ID claim. Should we start a game of Bill Cole Bingo?

Bill the Performing Sealion strikes again.

2 Likes