Methinks it is sort-of like two weasels

Viewing the event before the fact, yes. No matter when we pick a specification, the probability, viewed before the fact, will be the same.

Which, you could say, Behe does. After deriving a rate for chloroquine resistance, Behe applies it to a hypothesis concerning protein-protein binding sites, and turns to the malaria data set (both with humans and with the parasite), and to HIV, to see how the hypothesis is borne out.

Perhaps you would like to explain where Dawkins does so. The weasel program - which is covered in only a few pages in The Blind Watchmaker - is intended only to show the advantage of what Dawkins calls “cumulative selection” over “single-step selection” (which is essentially random guessing). It is not billed as or intended to be a simulation of biological evolution and Dawkins says that it is not accurate as such.

(It is interesting that criticisms focus on the relatively unimportant weasel program and not on the biomorph program that occupies the bulk of that chapter - and is intended to be closer to biological evolution, though still not a simulation).
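
For readers who haven't seen it, cumulative selection in the weasel style fits in a few lines of Python. This is a minimal sketch, not Dawkins's original code, and the mutation rate and population size below are my illustrative choices:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "   # 26 letters plus space
MUTATION_RATE = 0.05   # per-character mutation chance (illustrative choice)
OFFSPRING = 100        # copies bred per generation (illustrative choice)

def score(s):
    """Count characters that already match the target."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s):
    """Copy the string, randomizing each character with small probability."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in s
    )

parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while parent != TARGET:
    generation += 1
    # Cumulative selection: breed many copies, keep the closest match.
    parent = max((mutate(parent) for _ in range(OFFSPRING)), key=score)

print(f"reached target in {generation} generations")
```

With settings like these it typically reaches the target in well under a thousand generations, whereas single-step selection (random guessing) would need on the order of 27^28 attempts.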

4 Likes

That is certainly not a definition of design. Nor can it be considered evidence of design. After all, unselected steps will occur (quite frequently), and evolution is hardly going to care whether it incorporates unselected steps or not.

Any argument to the contrary makes the mistake of confusing probabilities before and after the event. A particular unselected step may be unlikely before the event, but that does not mean that unselected steps do not occur nor does it mean that evolution cannot incorporate those that do occur.

2 Likes

Unsupported claim.

2 Likes

Picking a specification with knowledge of the event cannot be done before the fact. As I point out, the a priori probability of such a specification being met is irrelevant to choosing design. The probability of the sequence having such a specification is the relevant probability. The probability of a sequence of 500 coin tosses having such a specification is clearly much higher than 2^-500.
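
To put numbers on that (my notation, not the original post's): suppose there are N admissible specifications S_1, ..., S_N, each picking out a single, distinct 500-toss sequence, so the events are disjoint. Then

$$P\left(\bigcup_{i=1}^{N} S_i\right) = \sum_{i=1}^{N} P(S_i) = N \cdot 2^{-500}$$

which grows linearly in N and dwarfs 2^-500 as soon as many specifications would be accepted after the fact.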

2 Likes

The sample mean is not relevant; you are not using a likelihood for estimating a population parameter. The interpretation you are using for the likelihood is not correct.

Routes available to evolution are not part of your calculation, even to name the particular pathway you claim is improbable.
You further presume that all steps are “not selectable” with no justification. You do not consider the probability of “scaffolding” that is known to produce IC structures. You do not consider sexual mixing, which can combine sequences in ways that bypass “non-selectable” steps.

Most of these calculations are not possible outside of special circumstances such as phylogenetic analysis. This is why we don’t see biologists using this approach at all. ID researchers have no special knowledge of mathematics that could make this possible. I am not the first to make these criticisms.

5 Likes

Vision can be of existential importance, and so selection is highly operative. There is plenty of intermediate functionality, so while the camera eye is complex it is not irreducibly so. Form follows function, and function responds to environmental pressure. What would be the barrier to convergent evolution of the camera eye?

4 Likes

First off, cephalopods don’t have the vertebrate camera eye; they have a distinct cephalopod camera eye. Second off, Nilsson D., Pelger S. A pessimistic estimate of the time required for an eye to evolve. Proceedings of the Royal Society of London, Series B 1994; 256:53-58.

Oh, I’d like to point out that some arthropods (and members of several other phyla too) also have a form of camera eye, and some cephalopods have camera eyes that lack some of the components of the canonical camera eye.

6 Likes

ID methods are unique in that the probability of an event-type is considered to be smaller the more often it is observed. Normally we think that events which occur more often are more likely, not less.

6 Likes

Let’s pick 500 heads in a row before the fact.

The probability of a sequence meeting a specification is what is relevant.

But who talks about the probability of having a specification? Probability is about events.

But I was responding to people here defending it.

This is definitely a definition of design; you may say it doesn't work, but it's still a definition. But unselected steps are indeed more difficult to get through, and the more such steps there are in a row, the exponentially more difficult it is for evolution to get through them. Say each step has probability .1 of getting through; then two such steps have probability .01, three such steps have probability .001, and so on.
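
A minimal sketch of that compounding, using the post's illustrative .1 figure:

```python
# Probability of getting through k consecutive unselected steps,
# with the post's illustrative per-step probability of .1.
p = 0.1
for k in range(1, 4):
    print(f"{k} step(s): {p ** k:.0e}")   # 1e-01, 1e-02, 1e-03
```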

You are painting the bullseye around the bullet hole. If we are talking about evolution, you would be calculating the probability of drawing a better hand than the other people at the table.

2 Likes

That’s the Sharpshooter fallacy. What you are ignoring is all of the beneficial interactions that didn’t happen. Using the same approach, you would be amazed that there are winners in so many lotteries because the chances of winning are 1 in 100’s of millions. What you would be ignoring is all the people that lost.

In the same way, there are many, many neutral mutations that have accumulated in any given genome. There may be many, many chances for a new mutation to interact with one of those accumulated neutral mutations and produce a beneficial phenotype. It is incorrect to single out the one beneficial phenotype that did evolve and calculate just the probability of that single phenotype.
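
A sketch of the lottery arithmetic with made-up round numbers (p and n below are illustrative, not data from any actual lottery):

```python
# Each ticket wins with probability p; n independent tickets are in play.
p = 1e-8            # 1 in 100 million per ticket
n = 200_000_000     # tickets sold
print(1 - (1 - p) ** n)   # ~0.86: somebody winning is likely
```

Any individual ticket is a near-certain loser, yet a winner somewhere is the expected outcome.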

3 Likes

Yes, I am: I am using the maximum likelihood value for estimating the mean, which is the sample mean.

No routes were excluded; two routes have been identified, and it's reasonable to calculate the probability of resistance from the rate at which it arises: 1 in 10^20.

The justification is that the chloroquine resistance rate is about the square of the atovaquone resistance rate. Evolution was not restricted in the interval of interest; scaffolding and any other path were available during this time.
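
Presumably the "square" reasoning is the independence rule. A sketch, with the per-mutation rate back-solved from the 1-in-10^20 figure rather than taken from any source: if each of two required mutations arises independently at rate r per parasite, then

$$P(\text{both}) = r \times r = r^2, \qquad r = 10^{-10} \;\Rightarrow\; r^2 = 10^{-20}$$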

Chloroquine resistance is not representative of evolution in general. Only very rarely are specific adaptations limited to just 1 or 2 mutations in the entire genome. Just looking at life in general will tell you there are many, many potential pathways for evolution to take, not just 1 or 2.

3 Likes

The improbability of the eye evolving in the first place.

Several comments: if the steps are independent, it is indeed vanishingly unlikely to develop even once! 0.99^2000 is 1.86 x 10^-9.
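
A quick check of that arithmetic (just the figure quoted above):

```python
# 0.99 raised to 2000 independent steps, as in the post above
print(0.99 ** 2000)   # ~1.86e-09
```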

At step 6 a lens appears! By magic?

Then we have:

Source: Claremont.edu

The response obtained in each generation would then be R = 0.00005m, which means that the small variation and weak selection cause a change of only 0.005 % per generation. The number of generations, n, for the whole sequence is then given by 1.00005^n = 80 129 540…

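Solving the quoted equation for n (nothing beyond the arithmetic already in the quote):

```python
import math

# 1.00005**n = 80_129_540, from the quoted passage
n = math.log(80_129_540) / math.log(1.00005)
print(round(n))   # ~364,000 generations
```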

But isn’t this assuming that the selectable variation per generation becomes fixed in each generation? But with a generation time of only 1 year, I don’t think there is enough time for this to happen. I would propose then the following:

1.0000 + .00005 * (1 + .00005 * (1 + .00005 * (…))) [nested n times] = 80,129,540

So that each time, the variation for the next step occurs within the fraction of the population that varied in the current step.

Finally, as they admit, the neural processing required for the more complex eyes was completely left out! This is a crucial factor, and would substantially reduce the probability.

Which makes it more surprising if evolution did it…

The probability of evolution as a cause gets smaller the more we see convergent structures. Let’s say the probability of an eye evolving is .1. Then the probability of it evolving twice is .1^2, and so on. It gets smaller, exponentially smaller.

No, you can indeed compute the probability of drawing a full house, before the fact! Before the bullet is fired.
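
That before-the-fact calculation is indeed standard; for instance, a quick combinatorial check (standard poker arithmetic, nothing from this thread):

```python
from math import comb

# Full house: pick the rank of the triple, 3 of its 4 suits,
# then the rank of the pair, 2 of its 4 suits.
full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)   # 3744
print(full_houses / comb(52, 5))                  # 3744/2598960 ~ 0.00144
```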

That’s one probability. Another would be the probability of evolution generating a given structure.

Are you saying we can’t observe what evolution does, and compute a rate from that?

Now you are confusing the probability of an event viewed before the fact (1 in hundreds of millions) with the probability of an event after the fact (some people won). I'm not ignoring all the people who lost; that is why the probability (viewed before the fact) is 1 in hundreds of millions.

But Behe looks at what evolution did, including neutral mutations and existing variation.

But Behe is interested in the limits of what evolution can do, so he starts with a rare event (chloroquine resistance), and calculates a rate from there. This is a valid observation, regardless of what evolution can do elsewhere (Behe even has a chapter entitled “What Evolution Can Do”).

I don’t think you understand the terminology. The sample mean (a statistic) is an estimator of some population parameter. For a Normal distribution this estimates \mu, the population mean (and you also need to estimate the variance). You haven’t stated what it is you are trying to estimate or what the underlying distribution should be.

Example: To estimate \mu \text{ and } \sigma^2 for a normal distribution, you calculate the joint probability (likelihood) of all the observations and choose \mu \text{ and } \sigma^2 to maximize that likelihood. It turns out that the sample mean and variance ARE the maximum likelihood estimates of \mu \text{ and } \sigma^2, but MLE is a more flexible method of estimating parameters than the Method of Moments (which is what people learn in an intro stats class). The sample mean is a Maximum Likelihood Estimator; it is not a likelihood.
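
For concreteness, the textbook derivation (standard material, not specific to this thread): given observations x_1, ..., x_n from a Normal distribution, the likelihood is

$$L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_i - \mu)^2}{2\sigma^2}\right)$$

and maximizing \log L over \mu \text{ and } \sigma^2 yields

$$\hat{\mu} = \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n} \left(x_i - \bar{x}\right)^2$$

i.e. the sample mean is the maximizer of the likelihood, not the likelihood itself.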

But I digress. I'm not sure what you are trying to do, but you probably want a proportion or a rate, not the mean.

More, but too sleepy now.

1 Like