Stairways to Understanding (tentative book project)

Some evidence exists that zinc fingers may be multipurpose, binding to proteins and perhaps RNA. The information I’ve seen so far on this is murky.

It seems to me selection would prevent co-evolution, not facilitate it.

Would you not start with a much less ambitious fork of the tree? Creationism generally does not allow for fish-to-amphibian transitions, but that would be a better starting point for a common-ancestry “proof of concept” not requiring miracles. [Disclaimer: “proof” is used in the sense of evidence here.] It seems to me that you have to disqualify even minor common ancestries (ultimately, any populations which simply no longer interbreed) to make the case that miracles are being appealed to. If this is not so, where would you draw the miracle-versus-nature line?


Yes. The immune cells are multiplying and clonally expanding in response to antigen. They are also undergoing somatic hypermutation, in which mutations are added to the genome to increase diversity in response to the antigen. Cells with productive mutations survive and expand; cells with detrimental mutations die off. This process is called affinity maturation and is a normal part of acquired or adaptive immunity. Figure 5 in that paper showed sequence analysis of B cells cut out of a histological section of tissue from an animal that was infected with Salmonella.
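The mutate-select-expand loop described above can be caricatured in a few lines of code. This is a toy sketch, not the actual biology: the "antigen" target string, the match-counting affinity score, the mutation rate, and the population sizes are all invented for illustration.

```python
import random

random.seed(1)

TARGET = "GATTACA"  # hypothetical antigen "shape" the receptor must match

def affinity(receptor):
    """Fraction of positions matching the target: a stand-in for binding strength."""
    return sum(a == b for a, b in zip(receptor, TARGET)) / len(TARGET)

def mutate(receptor, rate=0.1):
    """Somatic-hypermutation caricature: each position may randomly change."""
    return "".join(random.choice("ACGT") if random.random() < rate else c
                   for c in receptor)

# Start with a clone of naive B cells carrying a weakly binding receptor.
population = ["AAAAAAA"] * 20

for generation in range(30):
    # Mutate every cell, then let higher-affinity cells survive (selection).
    mutated = sorted((mutate(r) for r in population), key=affinity, reverse=True)
    survivors = mutated[:10]        # low-affinity cells die off
    population = survivors * 2      # survivors clonally expand

best = max(population, key=affinity)
print(best, affinity(best))
```

Even this crude loop shows the qualitative point: random mutation plus selective expansion raises average binding strength over generations.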


That would make the adaptive immune system impossible.


How is the premise of common ancestry reasonably well established? This is a serious question. Given that almost all the details and evidence from millennia ago have disappeared, how do we know that ancestry as we understand it today was happening then?

Is it reasonable to infer that the same kinds of genetics and genealogy that we observe today have explanatory power for historical periods, even though we have so little evidence about what happened then?

And if so, wouldn’t it be reasonable to infer that the same kinds of genetics we observe today (nested hierarchies in supra-species clades, speciation, genetic drift, mutations in gene regulatory networks, etc.) have explanatory power in natural history?

Chris Falter


Selection would, in some cases, act against fixation of a single change. Then again, in other cases it would act for fixation. And once there was a single change, selection would likely favor fixation of the compensating change in the other sequence. You should recognize that binding isn’t all or nothing. A change in one site would only slightly reduce the binding efficiency of a protein with multiple zinc fingers, and in fact there are wobbles already.


Thanks for the suggestion, but what goes in the book will be mostly that which evolutionary biologists have themselves said is problematic. Of all the major groups of organisms, the eukaryotes are clearly the most contested and problematic for evolution.

Abiogenesis will be included since it is part of the stairway though not part of biological evolution.

Abiogenesis and eukaryotic evolution are at the base of the tree of natural emergence.

The stairway to a multicell Eukaryotic animal is another stairway worth focusing on:

The problem with the promotion of total evolution (abiogenesis followed by biological evolution) is that it focuses on the smallest steps and represents this as evidence that the big steps are bridgeable.

Focusing only on skeletal structure and fossils also misses the soft-tissue transition problems, such as those I pointed out for the eye with the lentis retractor muscle in fish versus the ciliary muscles in humans.

Though evolutionary biologists will point to the cloacal stage of embryonic development, this backward ordering of the anus and vagina in fish relative to humans is still not an easy transition:

@Sal, what information do you expect this sort of figure/analysis to give you? What specific hypotheses does this figure test? What insights does the result give you about this particular zinc finger protein?


Thank you for your question.

[As an aside, many if not all aspects of the Zinc Finger discussion will likely NOT be in the abridged High School book. The stairways and steps highlighted will most likely be Origin of Life and Origin of Eukaryotes and isolated transitions.]

But to your question: the answer is the infeasibility of making a protein like ZNF136 (but not specifically ZNF136) by random mutation. This is an extension of Behe’s protein-to-protein binding probability argument, but it focuses instead on protein-to-DNA binding. ZNF136 was chosen because the zinc finger array is very apparent even in the FASTA file, but the considerations put forward apply to large zinc finger arrays in general (roughly 4 fingers or more). I saw one protein that had 30 zinc fingers, btw.

Whichever evolutionary route is taken to a KRAB zinc finger, random segment duplication, insertion, deletion, and point mutation will cause the zinc finger to bind (if at all) to different locations with different affinities as it evolves.

Suppose we take ANY arbitrary tree that we might postulate through phylogenetic methods. I provided one possible unrooted phylogenetic tree automatically generated by MEGA, but a phylogeneticist can make any tree he thinks is more accurate than the one I made. I put forward a tree as a hypothetical claim in an argument by contradiction, not because I think it was true to begin with…

The arrows show the actual physical location of a zinc finger in relation to the hypothetical phylogenetic tree. The diagram was to help estimate feasibility of a particular segmental duplication/insertion/deletion followed by point mutation. But those problems possibly pale in comparison to the problem of making segmental duplications/insertions/deletions that make coherent specificity and affinity of the protein toward a target DNA sequence (like a new transposon or viral insertion or whatever).

The evolving protein will bind (if at all) to arbitrary DNA segments with varying levels of affinity as the tandem zinc finger array is evolved by segment duplication/insertion/deletion and point mutation.

A Google search will point to a few papers on co-evolution of zinc finger proteins with transposons. It’s not a given that a process of random mutation would enable a zinc finger protein to target the first rogue transposon, nor any other DNA targets like the CTCF binding motif or 5S DNA, etc.

It is estimated that 1–3% of the proteins encoded by the human genome are zinc finger proteins.

Further, because work in synthetic biology has produced man-made zinc finger proteins, we have better models of zinc finger construction than we do for many other domains. Thus we are beginning to appreciate the difficulty of making zinc fingers that hit the right target and not others. Even our best models of how to make zinc fingers are still pretty faulty, even though the domain is pretty small.

The question of the mechanistic feasibility of the evolution of large zinc finger arrays has NOT been resolved, but rather accepted on FAITH. The claim of co-evolution of KRAB zinc fingers with changing transposons (or any DNA target) is a faith statement, not one based on mechanistic feasibility.

It’s been asked before. You still haven’t provided an answer. Apparently you think it tests all possible models of evolution of the protein, but nobody can see why you think that, even after the “explanation” you have just given. Now, my suggestion, if you really wanted to test the evolution of the protein, would be to put it into a comparative analysis. What does this protein look like in related species? What does it look like in paralogs? Put enough sequences from orthologs and paralogs into a well-designed phylogenetic analysis (hint: not “automatic in MEGA”) and you might be able to study how it evolved, in fair detail. A tree of different zinc fingers in the same protein is not a good choice for this exercise; I could explain why if you can’t see it.
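For readers following along, distance-based tree building (one automated flavor of which underlies quick MEGA runs) can be illustrated with a minimal UPGMA sketch. The pairwise distances below are hypothetical, and real phylogenetic analyses use explicit substitution models and better algorithms; this only shows the mechanics of average-linkage clustering.

```python
def upgma(labels, dmat):
    """Minimal UPGMA (average-linkage) clustering.
    labels: list of taxon names; dmat: symmetric matrix of pairwise distances.
    Returns the tree as nested 2-tuples of labels."""
    clusters = {i: (labels[i], 1) for i in range(len(labels))}  # id -> (subtree, size)
    dist = {frozenset((i, j)): dmat[i][j]
            for i in range(len(labels)) for j in range(i + 1, len(labels))}
    next_id = len(labels)
    while len(clusters) > 1:
        pair = min(dist, key=dist.get)          # merge the closest pair of clusters
        i, j = tuple(pair)
        (ti, ni), (tj, nj) = clusters.pop(i), clusters.pop(j)
        del dist[pair]
        # size-weighted average distance from the merged cluster to the rest
        for k in clusters:
            dist[frozenset((next_id, k))] = (
                ni * dist.pop(frozenset((i, k)))
                + nj * dist.pop(frozenset((j, k)))
            ) / (ni + nj)
        clusters[next_id] = ((ti, tj), ni + nj)
        next_id += 1
    (tree, _), = clusters.values()
    return tree

# Hypothetical pairwise distances among four sequences
labels = ["A", "B", "C", "D"]
dmat = [[0, 2, 6, 6],
        [2, 0, 6, 6],
        [6, 6, 0, 4],
        [6, 6, 4, 0]]
print(upgma(labels, dmat))
```

With these distances, A groups with B and C groups with D, which is the kind of nested structure a comparative analysis of orthologs and paralogs would be probing.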


What specifically are you proposing this means? Do you mean there’s like a nonfunctional protein being expressed, and this incrementally mutates into the Zinc finger, or an ancestral zinc finger that is similar to this one, from which this one evolved incrementally by point mutations? Or something else?

A few comments: The “tree” you draw has absolutely nothing to do with the origins of ZNF136. How you go from an alignment of the different zinc fingers (that is all the tree conveys) to evolution or origins escapes me (and I suspect most others in this discussion). This is in addition to the fact that Behe has been proven to be wrong, which makes all of this even more irrelevant.

Worse, @stcordova, you don’t even know if ZNF136 binds DNA, which zinc fingers may be involved, which may be linkers, etc., etc. You don’t know what DNA sequences it may prefer, the differences in affinities for different motifs (if there are any), the relationships between ZNF136 and other zinc finger proteins, etc., etc. This reduces all of your grand pronouncements to bluster, I am afraid.

Of course, while most cannot fathom why you would present the alignment of the different zinc fingers as you do, the “tree” in fact may hold answers, or at least clues, to some of the biochemical issues you are so blithely ignoring. In my opinion, your hatred of biology and evolution has blinded you to some opportunity here. And I would submit that your focus on anti-evolution strawmen will continue to blind you.


You don’t either, but you believe it evolved naturally. The sword cuts both ways.

Evolutionary biologists routinely say life is poorly designed. (Can we say Ayala and Avise?)…

But that is not the view of Princeton biophysicist William Bialek.

At the top of the Stairway to Understanding are biological systems that exceed the collective capability of our best scientists.

Birds like the Arctic tern apparently leverage quantum mechanical spin-chemistry systems that serve as nano-sized magnetic compasses (far better than most man-made magnetic compasses!). World-renowned chemist Marcos Eberlin, in his book Foresight, points out that the best chemists in the world cannot synthesize such a magnetic compass with spin chemistry!

Bialek argues that life is more perfect than we imagined:

Evolutionary biologists like Avise and Ayala are shown to be wrong again.

The theme of my book project is that life is designed to show it is designed by a creator, and also to teach us about its designs and technology – like quantum spin chemistry.

Since we’re operating at the level of slogans here, I’d like to make my contribution:

Fallacy of the excluded middle.

Strawman portrayal of mainstream biology.

Par for the course.

Chris Falter

P.S. I share your belief that nature does manifest God’s power and marvelous creativity.

Quote them. I’m pretty sure they’re saying particular entities (like parts of some specific developmental pathway, or system of gene expression, or piece of limb morphology in some specific organism) in living organisms, as opposed to just “life” (as in every living thing, or the very concept), are sub-optimally designed for some of their functions.

There are of course innumerable adaptations that are quite far removed from their absolute physical limit of performance. The vast majority of enzymes for example are far from the highest possible rate of catalysis. See:
Bar-Even A, Noor E, Savir Y, et al. The moderately efficient enzyme: evolutionary and physicochemical trends shaping enzyme parameters. Biochemistry . 2011;50(21):4402‐4410. doi:10.1021/bi2002289


The kinetic parameters of enzymes are key to understanding the rate and specificity of most biological processes. Although specific trends are frequently studied for individual enzymes, global trends are rarely addressed. We performed an analysis of k(cat) and K(M) values of several thousand enzymes collected from the literature. We found that the “average enzyme” exhibits a k(cat) of ~10 s(-1) and a k(cat)/K(M) of ~10(5) s(-1) M(-1), much below the diffusion limit and the characteristic textbook portrayal of kinetically superior enzymes. Why do most enzymes exhibit moderate catalytic efficiencies? Maximal rates may not evolve in cases where weaker selection pressures are expected. We find, for example, that enzymes operating in secondary metabolism are, on average, ~30-fold slower than those of central metabolism. We also find indications that the physicochemical properties of substrates affect the kinetic parameters. Specifically, low molecular mass and hydrophobicity appear to limit K(M) optimization. In accordance, substitution with phosphate, CoA, or other large modifiers considerably lowers the K(M) values of enzymes utilizing the substituted substrates. It therefore appears that both evolutionary selection pressures and physicochemical constraints shape the kinetic parameters of enzymes. It also seems likely that the catalytic efficiency of some enzymes toward their natural substrates could be increased in many cases by natural or laboratory evolution.
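To put the abstract’s numbers in perspective, here is a quick back-of-envelope comparison against the diffusion limit. The ~1e9 M⁻¹ s⁻¹ ceiling used here is a commonly cited rough upper bound for diffusion-limited catalysis, not a figure from the paper itself.

```python
# Back-of-envelope: how far is the "average enzyme" from the diffusion limit?
avg_kcat_over_km = 1e5   # s^-1 M^-1, median efficiency from the abstract above
diffusion_limit = 1e9    # s^-1 M^-1, rough bound for a diffusion-limited enzyme

shortfall = diffusion_limit / avg_kcat_over_km
print(f"The average enzyme is ~{shortfall:,.0f}x below the diffusion limit")
```

So the typical enzyme sits roughly four orders of magnitude below the physical ceiling, which is the "quite far removed from their absolute physical limit" point being made above.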


The word “perfection” carries metaphysical, aesthetic, and philosophical baggage. It is not really a mathematical or physical quantity that can be measured, since “perfection” is in the eye of the beholder. So when Bialek uses the idea of perfection, it is in a figurative, subjective sense. What I think he means is that there exist structures and systems in biology at the extreme limit of what is possible from physics, and this seems to be his informal definition of perfection.

I’ve gone on record as saying Bill Dembski’s notion of Specified Complexity is too clumsy and cumbersome to analyze designs. I am mildly critical (albeit indirectly) of Dembski’s (and Marks’, Ewert’s, Bartlett’s, and Holloway’s) approaches to evaluating designs, although I consider myself still on friendly and cordial terms with the aforementioned individuals.

What we can say outside of the philosophical notions of “perfection” and “design” is that when a system of objects is in a state that is at an extreme deviation from the expected mean of random processes, we can describe these deviations from the mean expectation quantitatively and objectively.

For example, here is something that has a modest correspondence to some features of biology, subject to the binomial distribution:

This configuration of fair coins is a maximally extreme violation of the law of large numbers. The coins are 100% heads, are all (or nearly all) in the same orientation, with the tilt of George Washington’s head the same in terms of degrees, and they are arranged more or less in rows and columns. All of these are violations of the law of large numbers, and at least as far as heads and tails go, the set of fair coins is at the maximum number of standard deviations physically possible from the mean of the binomial distribution (approximated as a normal distribution).

Whether the configuration of the coins is “perfect” in the philosophical or aesthetic sense is in the eye of the beholder, but it is a “wow” configuration in the empirical and statistical sense, in that it is the most extreme deviation from ordinary expectation possible. The coins are arranged to achieve the highest possible “wow” that one might be able to achieve from simple configurations of fair coins, whereby “wow” is the deviation from the natural expectation of random processes. It was this sort of simple analysis that the notion of specified complexity could not elegantly resolve, and hence that is my main reason for dispensing with the notion of specified complexity altogether.
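The “maximum number of standard deviations” claim can be made concrete. For n fair coins, the mean number of heads is n/2 and the binomial standard deviation is sqrt(n)/2, so the all-heads configuration sits exactly sqrt(n) standard deviations above the mean:

```python
import math

def z_score_all_heads(n):
    """Standard deviations between all-heads and the binomial mean for n fair coins."""
    mean = n / 2              # expected number of heads
    sd = math.sqrt(n) / 2     # sqrt(n * 0.5 * 0.5)
    return (n - mean) / sd    # simplifies to sqrt(n)

for n in (100, 500, 10_000):
    print(n, z_score_all_heads(n))
```

For 100 coins that is 10 standard deviations; for 10,000 coins, 100 standard deviations, a deviation no realistic random toss will ever produce.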

At the biological level, the law of large numbers CAN, with appropriate caveats, apply to molecules like amino acids, nucleotides, lipids, proteins, DNAs, RNAs, carbohydrates. There are exceptions where the law of large numbers is superseded by machinery and systems that impose organization and prevent the propagation of disorganizing effects of what might colloquially be called “Brownian motion” (a metaphor for random molecular motions and collisions).

As with the coins above, we can describe life’s deviation from the mean expectation of amino acids being 50% left-handed and 50% right-handed after being subjected to a natural racemization process (such as letting a 100% left-handed set of amino acids sit in water for a time).
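As an illustration of that relaxation toward 50/50, simple first-order kinetics for the L ⇌ D interconversion (with an arbitrary, hypothetical rate constant k, not a measured one) show the L-fraction decaying from 100% toward the racemic mix:

```python
import math

def fraction_L(t, k=0.01):
    """First-order racemization: L <-> D with equal rate constant k each way.
    Starting from 100% L, the L-fraction relaxes toward the 50/50 racemic mix:
    L(t) = 0.5 + 0.5 * exp(-2*k*t)."""
    return 0.5 + 0.5 * math.exp(-2 * k * t)

for t in (0, 50, 200, 1000):
    print(t, round(fraction_L(t), 3))
```

The closed form follows from dL/dt = -kL + k(1 - L); the 50/50 state is the equilibrium, and homochirality is the maximal deviation from it.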

We can describe homochirality in terms of violations of expectation from the mean, but we can also express it in terms of (gasp) minimization of entropy.

From Stereochemistry, Vol. 3, A. Neuberger, L. L. M. van Deenen, Ch. Tamm, page 60, Elsevier Biomedical Press, Amsterdam/New York/Oxford, 1982.

In an isothermal, isenthalpic process, assuming no net catalysis present that favors one isomer over the other, the entropy change from the homochiral to the racemic state is:

\Delta S = R \ln 2

where R is the gas constant and \Delta S is the change in entropy from the homochiral state to the racemic state.

Thus, there is a driving force for racemization of about -400 cal/mol at 25^\circ C. The entropy term for racemization is positive – the system becomes more random.

I presume “driving force” relates to the Gibbs free energy, where a negative change in Gibbs free energy indicates that a reaction will naturally proceed a certain way – in this case toward the racemic state – where

\Delta G = -RT \ln 2

assuming an isenthalpic process in which \Delta H = 0.
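The quoted figure of roughly -400 cal/mol at 25 °C can be checked directly from \Delta G = -RT \ln 2 using the gas constant in calorie units:

```python
import math

R = 1.987      # gas constant, cal/(mol*K)
T = 298.15     # 25 degrees C in kelvin

delta_S = R * math.log(2)   # entropy of racemization, cal/(mol*K)
delta_G = -T * delta_S      # isenthalpic: dG = -T*dS = -R*T*ln 2

print(f"dS = {delta_S:.3f} cal/(mol*K), dG = {delta_G:.0f} cal/mol")
```

This gives about -411 cal/mol, consistent with the cited ~-400 cal/mol driving force toward the racemic state.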

Of course this entropy increase from racemization can be prevented in a dissipative system in a non-equilibrium process such as is found in organisms. The point, however, is that the homochiral configuration, with minimal entropy, contrasts with the racemic configuration, with maximal entropy, which would emerge without a dissipative system like life to arrest this increase in entropy.

The point is NOT that the configuration is impossible, but that the configuration is at the extreme limit possible by physics and is quantifiable by contrasting it in terms of the entropy change that is overcome by a living organism. It should be added that a non-equilibrium system is not a sufficient condition to arrest the emergence of entropy from racemization, any more than a non-equilibrium system of dynamite exploding a pile of rocks can build a building.

Thus, in contrast to prior attempts by the ID and creationist community to frame design in terms of general applications of the 2nd Law, or information, or specified complexity, I claim there are only a few (and I emphasize FEW) limited examples where a structure can be characterized in terms of entropy and information. The above sort of analysis is the exception and NOT the rule.

Again, the point is that some notions of “perfection” can be quantified, if by “perfection” one means the highest possible “wow” that might be derived through the organization of physical objects (like molecules). This level of organization can be quantified in relation to what would happen under randomizing processes (such as racemization) in certain contexts.

Hi Sal,

You seem to be assuming that biological life can be the only cause of an imbalance. You seem to have missed the research published just this week that points to cosmic rays as the source of homochirality.

For your edification, I am posting the link which appears elsewhere on this forum.

I am not sure whether the published paper is behind a paywall. The pre-pub from last October is on ArXiv here:

Grace and peace,