Comments on Gpuccio: Functional Information Methodology

Yes, I guess that "the chance that a tornado sweeping through a junkyard might assemble a Boeing 747" is not null. But we would rather not consider this type of hypothesis if we are to remain in the domain of science!

There are only 3 causes that can explain an object, i.e., chance, necessity, or design. If you can rule out chance, necessity, and any combination of the two, then design is the explanation, and you don’t need to compute the probability of design to draw a design inference. According to ID theory, objects exhibiting a high level of FI can’t be produced by chance, by necessity, or by a combination of both. Therefore, they are designed. This is ID in a nutshell.

As far as biological objects are concerned, I think this is indeed the only real issue here. ID theorists think that, as a rule of thumb, most functional proteins are very rare in sequence space. If true, ID wins. If wrong, ID may be in trouble. My take is that most functional proteins are very rare in sequence space. But I will let @gpuccio elaborate on this point if he wants.

You are committing the Texas sharpshooter fallacy here.

LOL. @gpuccio is guilty of precisely this sort of equivocation here.

3 Likes

Why are you using labeling as an argument technique? This is what we hear from politicians. You are asserting that both arguments are wrong; why do you think so?

Evolution does not fall under either category. So if this is really the foundation of ID, then ID is a 20-year endeavour whose abject failure is explained by the fact that it is founded on a fallacy.

So do most (actually all) evolutionary theorists. So this is not a point on which ID creationism can be supported.

That is exactly the point.

3 Likes

I lost this train of thought - Bill asked something about sequences from biological systems (I think), and I was trying to say that cells and Turing machines are performing equivalent computing functions, and cells are Turing complete (capable of performing any computation a Turing machine can).

1 Like

Well for starters, I can use the same argument to prove that it’s impossible (…beyond probabilistic resources…) to flip a coin 500 times. There might be just a wee flaw with that methodology.
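To make that concrete, here is the arithmetic behind the objection (a minimal sketch; the 10^-150 figure is the "universal probability bound" commonly cited in ID writing, used here only for illustration):

```python
# Probability of any one specific sequence of 500 fair coin flips
p_specific = 0.5 ** 500
print(p_specific)        # ~3.05e-151

# A commonly cited ID "universal probability bound" of 10^-150 (i.e., 500 bits)
upb = 1e-150
print(p_specific < upb)  # True: every particular 500-flip outcome is "beyond
                         # probabilistic resources", yet one of them always occurs
```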

2 Likes

Examples? The only cases of molecular convergence I am aware of occur in proteins that are shared through common ancestry, such as in the case of the prestin gene in certain mammalian clades. It isn’t surprising that evolution would find the same mutations in very similar proteins in separate clades. Again, this is a case of finding function with a common starting point. This can’t tell us how many starting points there are to begin with.

1 Like

Biological reproduction is not analogous to a tornado in a junkyard.

Which of those would you use to describe the observed process of descent with modification? We can directly observe new organisms being born, and they carry new mutations.

What I would be interested in is an ID supporter finding mutations that separate humans and chimps (or any ape, for that matter) and determining which of those mutations evolution could not produce, and why. I have yet to see any ID supporter do this.

3 Likes

I can explain the graph, but I am more interested in whether any of the ID skeptics can. If no one does in a day or so, I will.

We have to consider all hypotheses and evaluate their relative likelihoods; you can’t just dismiss things out of hand.

Yeah, if you can rule them out. But you haven’t done that. You’ve merely said the probability is low. That’s not ruling it out; that is merely to state that it has a low frequency of occurrence over some defined interval of time.
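As a toy illustration of what "evaluate their relative likelihoods" means (invented numbers, not estimates for any real biological case): a small probability under one hypothesis rules nothing out on its own; what matters is the comparison.

```python
# Toy Bayes-factor comparison with made-up likelihoods, purely to illustrate the logic
p_data_given_evolution = 1e-20   # hypothetical likelihood under an evolutionary model
p_data_given_design    = 1e-24   # hypothetical; the design argument never supplies this

bayes_factor = p_data_given_evolution / p_data_given_design
print(f"Bayes factor (evolution vs design): {bayes_factor:.0e}")
# Both likelihoods are tiny, yet the data would still favor evolution by a factor of 10^4.
```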

According to ID theory, objects exhibiting high level of FI can’t be produced by chance or by necessity or a combination of both. Therefore, they are designed. This is ID in a nutshell.

But we just agreed the probability, even if we assume the most implausible of all the chance hypotheses, the tornado in a junkyard, isn’t zero. So we simply can’t say it can’t be produced “by chance”.

And we’ve seen no work done to rule out necessity at all, or any combination of chance and necessity. For example, we’ve seen no attempt at estimating the probability of anything using evolution, which is NOT like a tornado in a junkyard. Evolution is like what I described with my descending through ten generations of ancestors; that’s how we get from some ancestral sequence to one that looks very unlikely after the fact.

So we still have to see a probability for the design hypothesis, you’re not getting away from this.

What I find strange about the ID position on this point is that it is based on nothing at all, just a hunch. A speculation. There’s no evidence for it, and only evidence against it.

That’s exactly what is wrong with the ID argument, and I’m glad you’ve finally realized why. Take one of those functional proteins in humans that Gpuccio has attempted to show has grown larger in the lineage leading to Homo sapiens. Over millions of consecutive generations, this protein has incrementally grown larger by adding hundreds of amino acids.

In the same way, my genome has accumulated mutations through generations. So now, after the fact, the protein looks like a very unlikely combination of amino acids, and my genome looks like a very unlikely combination of nucleotides. So now you come along and say: hold on, you’re drawing a target around your genome, that’s the Texas sharpshooter fallacy.

Correct! So exactly the same thing is wrong with Gpuccio focusing on a particular protein having grown incrementally larger over 500 million years, and then declaring after the fact that the outcome of this cumulative process of mutation adding amino acids looks unlikely in hindsight. Yes it does, but so would any such long process of mutations, whether it accumulated amino acids, substituted some for others, or deleted some. However such a 500-million-year-long process of incremental change happens, it will look unlikely after those 500 million years. What does that tell us? Nothing, it tells us nothing. It’s fallacious thinking, and it’s a good thing you are able to see why.
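A toy simulation makes the hindsight problem explicit: any specific outcome of a long random walk of substitutions looks astronomically improbable after the fact, even though some outcome was guaranteed. (A minimal sketch with made-up parameters; it is not a model of any real protein, and it ignores selection entirely.)

```python
import random

random.seed(1)
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def drift(seq, generations, mu=0.01):
    """Accumulate random substitutions, one generation at a time."""
    seq = list(seq)
    for _ in range(generations):
        for i in range(len(seq)):
            if random.random() < mu:
                seq[i] = random.choice(AMINO_ACIDS)
    return "".join(seq)

start = "M" * 100                      # arbitrary 100-residue starting sequence
end = drift(start, generations=1000)   # whatever it ends up as after 1000 generations

# Probability of hitting this *exact* end sequence in a single blind draw:
p_exact = (1 / 20) ** 100
print(p_exact)   # ~7.9e-131 -- absurdly small in hindsight, for any end point whatsoever
```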

3 Likes

Okay, thank you Bill. I’ll be waiting.

1 Like

I’m sorry but this reads to me like it’s been written more to sound like it’s technical and sophisticated than to actually convey a substantive point of any real value.

“A protein is the product of an encoded description; the positions of the stars in the night sky are not. Neither are the locations of islands on the open sea. Neither are weather patterns and tornadoes.”

I’m reading it and it just raises the question: So what? What does that have to do with how protein coding genes evolve?

3 Likes

Try reading any of David Abel’s publications if you find yourself wanting more of that.

Taking a look at this thread, I am surprised at the definition of FI that everyone seems to be using. In a long thread at The Skeptical Zone last December (see here), gpuccio and a number of other people debated whether there was any sound argument for gpuccio’s assertion that 500 bits of FI could not be produced by ordinary evolutionary forces such as natural selection. You could argue that if the definition of FI were that there is a function, and it exists only in a small subset of the sequences, one comprising a fraction less than 10^(-150) of all sequences, then natural selection cannot reward changes that get close to, but not into, that set.

Actually it finally became clear in that discussion that gpuccio was not assuming that there was no function outside of the set. So I would have said that in such cases there were possible paths for natural selection and mutation to take to get into the set. But for such cases gpuccio was asserting that natural selection would just not be strong enough to do the job. I disagree with his reasoning for saying that, but I was at least happy to have that matter cleared up. However in this PS thread you all seem to be using the definition that function is zero outside of the target set.

The definition of FI used by Jack Szostak in 2003, and corrected a little by Hazen, Griffin, Carothers, and Szostak in 2007, is the one gpuccio said he was using. However, that definition has FI computed for each possible sequence: it is minus the log (to the base 2) of the fraction of all sequences that have a value of function greater than or equal to the value in that particular sequence. There is in that definition no assumption that function is only nonzero in one subset of the sequences, and that function is zero elsewhere. I do not know why gpuccio has changed his definition to have zero function in most sequences. You are all accepting this modified definition of FI. Maybe you should not.
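For reference, the Hazen et al. (2007) definition described above can be written down directly: for a degree of function E_x, FI(E_x) = -log2(F(E_x)), where F(E_x) is the fraction of all sequences with function greater than or equal to E_x. A minimal sketch with toy numbers (not real assay data):

```python
import math

def functional_information(function_values, e_x):
    """FI(E_x) = -log2( fraction of sequences whose function value is >= E_x )."""
    fraction = sum(v >= e_x for v in function_values) / len(function_values)
    return -math.log2(fraction)

# Toy example: function values for 8 hypothetical sequences
values = [0.0, 0.0, 0.1, 0.1, 0.2, 0.5, 0.9, 1.0]
print(functional_information(values, 0.5))   # -log2(3/8) ≈ 1.42 bits
```

Nothing in that formula requires function to be exactly zero outside a single target set, which is the point being made above.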

8 Likes

A very important point, because obviously if a single mutation in a larger non-coding sequence acts to change the sequence to recruit a transcription factor and produce a translatable open reading frame with a biological function, then pretty much the entire sequence existed before the mutation that made it functional.
And that sequence in turn has evolved approximately neutrally, potentially for millions of generations, until the mutation that made it functional occurred. That means the sequence didn’t just pop into existence as if by a tornado in a junkyard; rather, it also evolved iteratively by mutations. It could even have been drifting from some previous functional gene, like an ancient transposon or another type of pseudogene. That means the “initial condition” that led to the origination of this de novo gene isn’t random either, and was itself largely the product of natural selection.
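As a toy illustration of a single mutation exposing a sequence that already existed (a deliberately simplified sketch; real de novo gene birth also involves transcription, regulation, and much else):

```python
# One substitution can remove a premature stop codon and leave a much longer
# open reading frame, even though almost all of the sequence was already there.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def orf_length(dna):
    """Number of codons from the first ATG up to the first in-frame stop codon."""
    start = dna.find("ATG")
    if start < 0:
        return 0
    n = 0
    for i in range(start, len(dna) - 2, 3):
        if dna[i:i+3] in STOP_CODONS:
            break
        n += 1
    return n

before = "ATGGCTTAAGCTGCTGCTGCTGCTGCTTGA"   # premature TAA truncates the frame
after  = "ATGGCTCAAGCTGCTGCTGCTGCTGCTTGA"   # a single T->C change removes it
print(orf_length(before), orf_length(after))   # 2 vs 9 codons
```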

At what point does ANYTHING happen at the genetic level that could really be accurately compared to the tornado in a junkyard? It seems to me there’s never any such thing occurring.

Well, I agree with you there, @Joe_Felsenstein. There is no reason to think 500 bits of FI is inaccessible, and I have said so several times.

2 Likes

Have you ever considered the chance that vibrating air molecules would assemble a tornado? The probability is really small. Yet it happens often.

Should we conclude the tornados are intelligently designed? Why aren’t the ID proponents pointing to tornados as examples of intelligent design?

Or maybe when you have an energy flow, and a homeostatic process is part of that energy flow, what appear to be highly improbable arrangements can occur more often than we would otherwise expect.

2 Likes

I’m not sure how many non-creationists are accepting this definition, except to the extent that they are doing so provisionally with the idea of providing @gpuccio the rope he needs to hang himself.

2 Likes

Last comment I’m posting from UD. Maybe Upright Biped will join the conversation here.

[Upright BiPed, comment 478](https://uncommondescent.com/intelligent-design/controlling-the-waves-of-dynamic-far-from-equilibrium-states-the-nf-kb-system-of-transcription-regulation/#comment-682966)

Faizal Ali,

“The problem is that the inclusion of the latter under the category ‘semantic’ information is the point in question. The large majority of experts who do not accept the creationist argument also do not accept the creationist claim that the physical and chemical interactions of the molecules involved in biological processes are directly analogous to the sort of semantic information involved in, say, a written novel or a computer program.”

No, sorry, it is not in question. The physics of symbol systems remain the same regardless of the medium, or its origin, and your attempt to dismiss the issue as a “creationist claim” displays an embarrassing lack of knowledge regarding the recorded history of the issue. You are likely unaware of this because you have not educated yourself on the literature. Additionally, empirical science is not established by consensus; it is established by what can be demonstrated and repeated. I would think you might have been aware of this, but I could be mistaken. In any case, if you find someone who has shown that the genetic material is not rate-independent, or that the process is not irreversible, or that no distinctions need be made between laws and initial conditions, or perhaps if you find someone who has solved the measurement problem, or overturned any of the other physical observations recorded in the physics literature regarding symbol systems over the past half century, then be sure to let us know.

Until then, I plan to stick with the science. You are free to continue otherwise.

Ahh yes, thanks for bringing that up. I now recall that one of Gpuccio’s arguments in response to various criticisms was that, if the protein in question had evolved from some other functional protein, then it simply doesn’t exhibit 500 bits of FI.

So we pointed out that his case then rests on the idea that there’s some protein that is an isolated target in sequence space, and we asked him to show that such a sequence exists. He then insisted on reversing the burden of proof and claimed it’s our job to show that protein X could have evolved from protein Y, and until we do that we should automatically assume the function is isolated and that evolution is impossible until proven otherwise.

3 Likes