Paul Nelson: Testing for Common Descent (Signal + Noise?)

I disagree. I think a perfectly universal genetic code would be better evidence of UCD than a near-universal genetic code. Just because X is evidence for hypothesis Y, that doesn’t mean that not-X is evidence against hypothesis Y. Mutually exclusive observations can support a hypothesis to different degrees. The prediction would of course be modified by our understanding of the genetic code and how flexible it might be. If we know that the genetic code is in fact flexible, then it might be more surprising if it were perfectly universal, given UCD and evolution.

UCD makes predictions in combination with other evidence, not in isolation. UCD + flexible genetic code makes different predictions to UCD + inflexible genetic code.
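Here’s a toy way to see both points at once (the numbers are invented purely for illustration, not measured likelihoods): compare how probable each observation would be under UCD plus an auxiliary assumption about code flexibility, versus under separate ancestries.

```python
# Illustrative likelihoods only; invented numbers, not data.
#   "U"  = a perfectly universal genetic code is observed
#   "NU" = a nearly universal code with minor variants is observed
likelihoods = {
    "UCD + flexible code": {"U": 0.10, "NU": 0.90},
    "UCD + frozen code":   {"U": 0.99, "NU": 0.01},
    "separate ancestries": {"U": 1e-9, "NU": 1e-6},  # independent origins converging on (nearly) one code
}

for obs in ("U", "NU"):
    for hyp in ("UCD + flexible code", "UCD + frozen code"):
        ratio = likelihoods[hyp][obs] / likelihoods["separate ancestries"][obs]
        print(f"Observing {obs}: {hyp} vs separate ancestries, likelihood ratio ~ {ratio:.1e}")
```

Either observation favours UCD over separate ancestries here; they just do so to different degrees, and the auxiliary assumption about flexibility determines which observation UCD leads you to expect.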

So phylogenetic bracketing is no better than hypotheses based on pure chance? Just because something is “logically congruent” doesn’t mean it should be considered just as likely as every other possibility. Sure, in that sense CD doesn’t make any absolute predictions about character state distributions, but it makes statistical predictions, as @swamidass said, especially in the case of nucleotide characters. This is the basis of phylogenetics. It’s certainly more informative than tossing a coin or rolling a die.
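As a crude illustration of what a statistical (rather than absolute) prediction looks like here, consider a minimal simulation; the branch lengths and the mutation model are invented purely for this sketch:

```python
import random

random.seed(1)
BASES = "ACGT"
L = 10_000  # number of nucleotide sites

def mutate(seq, p):
    """Copy seq, changing each site to a different random base with probability p."""
    return [random.choice(BASES.replace(b, "")) if random.random() < p else b for b in seq]

def identity(a, b):
    """Fraction of sites at which two sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Evolve sequences down the tree ((A,B),(C,D)).
root = [random.choice(BASES) for _ in range(L)]
ab_anc, cd_anc = mutate(root, 0.20), mutate(root, 0.20)
A, B = mutate(ab_anc, 0.05), mutate(ab_anc, 0.05)
C, D = mutate(cd_anc, 0.05), mutate(cd_anc, 0.05)

print("Common descent on ((A,B),(C,D)):")
print(f"  within-clade identity  A-B: {identity(A, B):.3f}")
print(f"  between-clade identity A-C: {identity(A, C):.3f}")

# "Pure chance" control: unrelated random sequences agree at ~25% of sites.
X = [random.choice(BASES) for _ in range(L)]
Y = [random.choice(BASES) for _ in range(L)]
print(f"Unrelated random sequences:   {identity(X, Y):.3f} (expected 0.25)")
```

No particular character state at any one site is guaranteed, but the overall distribution is strongly constrained: sister taxa come out far more similar than non-sisters, and everything beats the coin-toss baseline.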

2 Likes

8 posts were merged into an existing topic: Side Comments on Nelson’s Signal + Noise

You can read a bit about the seminars here:

At least one of the regular readers of this site, whom I will not name (to protect his reputation), attended the seminar, and fits the description of “student” in my dialogue very closely. As Ann Gauger just explained at ENV, ID does not exclude UCD. If it did, I couldn’t have worked fruitfully alongside Mike Behe for 25 years.

ID is a minimal commitment to the detectability of design in nature. At the 2018 natural science seminar, we had agnostic and even atheistic students who nonetheless thought design detection was worth pursuing, for a variety of reasons (and said they would sort out the theological questions later).

4 Likes

Wow. So much about what people say–even to the point of doing an experiment in which the results are literally what people say–and virtually nothing about what we find when we actually study nature.

OK…

You are sorely mistaken. Who are you alleging made such a claim and when?

Don’t suppressor tRNAs modify codon assignments while simultaneously maintaining cell viability, by literally suppressing lethal mutations?

Wasn’t the amber suppressor discovered in 1960, and the hypothesis that suppressors could be mutations in the anticodon proposed (but not confirmed) in 1962?

Sorry, Paul, but we’ve known for decades that bacteria can tolerate modifications to codon assignments that cause lots of proteins to “run on”. Clearly there’s wiggle room.

This is yet another installment of “Learning biology is better done by learning biology instead of discussing what people actually or allegedly say.”

3 Likes

I think it’s fair to say that, from the beginning (the early 60s), the difficulties facing any modification of the genetic code were used to justify the expectation that the code would be mostly universal. The authors at the time certainly seem to have left the door open for variations on the standard genetic code, though. This was before any such variants were found.

In which case, perhaps it’s a strawman to say “UCD predicted perfect universality of the genetic code”, and more accurate to say “UCD predicted near-perfect universality of the genetic code”, speaking in the context of the 1960s and beyond.

Given what we know today, I think it’s very plausible that all of the variations to the genetic code we know about really are just modifications to the standard code, not indicative of some kind of separate ancestry.
For example, all known variants occur in single-celled organisms (including mitochondria), and I think we could all agree that however modifications to the genetic code might happen, this process would be easier in single-celled organisms. If we had families of mammals walking around with different genetic codes, that would be somewhat harder to explain.
There are also clear biases towards certain variants in different clades, indicating that some of these specific modifications might be selectively advantageous or the result of some underlying susceptibility to change; in either case the evolvability of those variants would be increased. If all the variants of the genetic code seemed to be completely random or arbitrary, they might be harder to evolve.

3 Likes

To everyone: this thread has been moved to the Office Hours category, which changes the rules of participation. Sorry for the switch, but we can make this work.

Please direct questions to the Side Comments thread unless you have relevant expertise. Paul has been very good about addressing most questions so far, so I’m pretty sure he will get to the side comments too.

2 Likes

There seems to be a common equivocation between axiom and hypothesis in ID literature, especially that which has been directed at me about common descent.

1 Like

Yes, but since Crick already allowed for it, the discovery of reverse transcription should hardly be considered a “modification to the axiom”, should it?

Not the kind of information described in the “axiom”, though, so again I don’t think it would count as a modification. An addition, perhaps.

Since I never cited Crick, yes.

Again, I didn’t cite Crick as the source of the axiom. Infectivity clearly is information whether Crick thought of it or not.

So what is your source for the “axiom”, if not Crick and the central dogma?

I agree, although current theories of codon reassignment are rather messy, and experimental modifications of the code always involve the investigators “helping” the cells across the codon meaning shift (e.g., by supplying the wild-type tRNA on a plasmid during the transition).

What I’m trying to elicit is the theory-based rationale for saying “that could have evolved from LUCA” (majority opinion about extant variant codes) versus “that couldn’t have evolved from LUCA” (counterfactual variant codes that are sufficiently different to indicate separate ancestry). I worry that the threshold is (a) entirely conventional, and follows the data around, rather than being (b) a bona fide prediction grounded in theory.

If the former, (a), UCD isn’t really telling us anything about the world.

Probably Watson’s version. Does one really need a specific source to cite dogma?

As for the axiom that a prion couldn’t possibly catalyze the refolding of its normal counterpart to impart infectivity information, I was training as a virologist when Prusiner first offered the hypothesis. Everyone around me, except for one student considering doing a postdoc with him, thought Prusiner was wrong.

Let’s not lose sight of the subject here: Paul Nelson is arguing that we can’t overcome axioms or dogma, when the reality is that we do so very often. Our resistance is easily broken down by empirical results, not rhetoric.

1 Like

Paul, why do you think I included “when” in my challenge?

In what way does something from 1963 contradict what I pointed out?

In 1963, we only knew of nonsense suppression, not that its mechanism was a partial codon reassignment. Some people said such a mechanism was possible, others said it wasn’t. So what?

Once we had the mechanism of nonsense suppression in hand, which IIRC started to become clear in 1971, it was easy to accept missense codon reassignments.

I disagree vehemently. Here’s a good summary:


Comp Biochem Physiol B. 1993 Nov;106(3):489-94.
Evolutionary changes in the genetic code.
Jukes TH, Osawa S

  1. The genetic code was thought to be identical (“universal”) in all biological systems until 1981, when it was discovered that the coding system in mammalian mitochondria differed from the universal code in the use of codons AUA, UGA, AGA and AGG.

  2. Many other differences have since been discovered, some in mitochondria of various phyla, others in bacteria, ciliated protozoa, algae and yeasts.

  3. The original thesis that the code was universal and “frozen” depended on the precept that any mutational change in the code would be lethal, because it would produce widespread alterations in the amino acid sequences of proteins. Such changes would destroy protein function, and hence would be intolerable.

  4. The objection was “by-passed” by nature. It is possible for a codon to disappear from mRNA molecules, often as a result of directional mutation pressure in DNA: thus all UGA stop codons can be replaced by UAA.

  5. The missing UGA codon can then reappear when some UGG tryptophan codons mutate to UGA. The new UGA codons will be translated as tryptophan, as is the case in non-plant mitochondria and Mycoplasma. Therefore, no changes have taken place in the amino acid sequences of proteins.

  6. Variations of this procedure have occurred, affecting various codons, and discoveries are still being made. The findings illustrate the evolutionary interplay between tRNA, release factors and codon-anticodon pairing.


What aspect of this is so unacceptably messy to you that it renders it impossible or implausible? Keep in mind that there are multiple copies of tRNA genes, so there is ample opportunity for transitional partial reassignments.
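To make the codon-capture steps in that abstract concrete, here’s a deliberately cartoonish sketch; the gene counts and mutation probabilities are invented, so treat it as an illustration of the logic rather than the Osawa-Jukes model itself:

```python
import random

random.seed(0)

# Toy genome: the stop codons of 120 genes plus 400 tryptophan (UGG) codons.
stops = ["UGA" if random.random() < 0.5 else "UAA" for _ in range(120)]
trp = ["UGG"] * 400

def count(codon, pool):
    return sum(c == codon for c in pool)

print("Start:", count("UGA", stops), "UGA stops,", count("UAA", stops), "UAA stops")

# Stage 1: directional AT mutation pressure converts UGA stops into UAA stops.
# Both are still read as "stop", so no protein sequence changes.
while count("UGA", stops) > 0:
    stops = ["UAA" if (c == "UGA" and random.random() < 0.1) else c for c in stops]
print("After AT pressure:", count("UGA", stops), "UGA stops remain")

# Stage 2: with UGA now unused anywhere, a tRNA-Trp variant that also pairs with
# UGA can spread without mistranslating a single existing codon.
print("UGA is unassigned; a UGA-reading tRNA-Trp is harmless")

# Stage 3: UGG -> UGA mutations in Trp codons are now silent, because the new UGA
# codons are read as tryptophan. The code has changed; the proteins have not.
trp = ["UGA" if random.random() < 0.05 else c for c in trp]
print("Trp codons now spelled UGA:", count("UGA", trp), "of", len(trp))
```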

3 Likes

Given that there are evidently different “versions”, I think so.

Right, I just don’t think it modifies the central dogma. Maybe I’m confusing your claim that “DNA->RNA->protein->catalysis” was an “axiom” in the 1970s with the central dogma. Rather than saying “the central dogma was modified”, are you making the more obvious claim that “what people thought was the case in the 1970s was modified”? If the latter, then I certainly agree.

1 Like

I don’t see how it’s a problem that the predictions that flow from UCD depend on other data. Given the angle of a cannon, you can’t predict the trajectory of a cannonball; you also have to know the velocity at which the cannonball will be fired. The angle of the cannon tells you what trajectory to expect given the velocity of the cannonball. In the same sort of way, in this example, UCD tells you what pattern of genetic code variants could exist given the evolvability of genetic code variants.
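To put a number on the analogy (an idealized, drag-free projectile; the specific speeds are arbitrary):

```python
import math

def cannonball_range(angle_deg: float, speed_m_s: float, g: float = 9.81) -> float:
    """Range of an ideal (drag-free) projectile fired over flat ground."""
    return speed_m_s ** 2 * math.sin(2 * math.radians(angle_deg)) / g

# The same barrel angle yields very different trajectories depending on muzzle speed,
# just as UCD yields different expected patterns depending on how evolvable the code is.
for v in (20.0, 50.0, 100.0):
    print(f"angle = 45 deg, speed = {v:5.1f} m/s -> range ~ {cannonball_range(45, v):7.1f} m")
```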

4 Likes

Yes, what people thought was the case was the dogma. I thought it was an obvious claim all along. :smiley:

This is an excellent analogy.

Paul’s concern:

  “I worry that the threshold is (a) entirely conventional, and follows the data around, rather than being (b) a bona fide prediction grounded in theory.”

strikes me as entirely inconsistent with how we do science.

It’s not that the threshold “follows the data around”; it is set based on the data we have at the time. That’s why it isn’t surprising that the code was thought to be invariant in 1963; but once we learned the mechanism of nonsense suppression roughly ten years later, a very plausible path to reassigning codons became easy to see, and the threshold was no longer where it had appeared to be in 1963.

This is pointed out in the abstract I quoted above:
The original thesis that the code was universal and “frozen” depended on the precept that any mutational change in the code would be lethal…

Nonsense suppressors are changes in the code. They are not lethal. Therefore the precept (aka dogma or axiom) was wrong.

1 Like

Which is another way of saying the UCD/~UCD threshold follows the data around.

I don’t have time today to dig into the details of the Jukes-Osawa scenario, but it’s missing a few steps. For instance, if one considers the variant code present in Tetrahymena (a single stop codon, UGA, with UAA and UAG assigned to glutamine), one finds this:

“The model outlined by Jukes and Osawa…would lead to a potentially awkward intermediate stage where some genes end in [UAA and UAG], but neither [UAA or UAG] recognizing tRNAs nor…release factors exist in the cell. The outcome of this state in eukaryotes is not known, but in eubacteria the cognate tRNA of the penultimate codon remains covalently attached to the carboxyl-terminus of the protein.”

From here (P.J. Keeling exchange with Jukes & Osawa): UAR Codons for Glutamine | Journal of Molecular Evolution

“…during the appearance of code deviation, ancient termination codons are acquiring a new sense and new UAA and UAG codons are accumulating in the reading frames. This will generate ambiguity in the length of termination products.”

From here: Why does the genetic code deviate so easily in ciliates? - PubMed

Of course there’s much more recent work on code evolution that we could dig into, but that’s really a separate thread. The scenarios are messy and assumption-laden. That’s OK – science is messy and assumption-laden. One’s opinion of the plausibility of the existing code evolution stories will depend heavily on the priors one holds. Maybe after I finish with parts 2 and 3 of my reply in this thread, we could start a new discussion of genetic code evolution. Gotta run now, sorry.

@T_aquaticus

THIS is perfectly stated!