This sounds fair enough…
One way to check would be to take a random sample of biomolecules in an organism and classify how many are:
a) Two proteins which are bound together neutrally, one not contributing in a significant way (positively or negatively) to the function of the main protein, which does most of the work.
b) Two proteins in which one of the proteins does most of the work, but needs the second protein to function.
The frequency of such molecules in such a sample would provide good evidence of how “regularly” all this happens…
I am not even sure people can definitively say whether a given pair of proteins is type a or type b…
For my own understanding: is the hypothesis of CNE that neutral mutations accumulate the genetic code required to create the parts (proteins and RNA) that will eventually allow for the construction of the spliceosome, with those parts doing nothing until all the pieces are in place and the machine is ready for useful action within the cell?
No, generally the idea is that there is a mechanistic tendency for complexity to increase neutrally (the system doesn’t necessarily end up better, just more complex). This is thought to be facilitated by certain inherent propensities toward specific types of mutations, such as deleterious point mutations and compensatory gene-duplications.
Generally speaking, since the rates of deleterious mutations and gene duplications are both high, genes that slowly accumulate deleterious mutations can be compensated for by increasing gene numbers, effectively masking the effect of lower-performance expressed genes through increased gene-dosage effects. In this way genomic and functional complexity can increase while the overall system retains a similar level of fitness, or a related measure of system performance.
This can happen both to enzymatic pathways and to molecular machines. With respect to enzymes, as deleterious mutations of small effect accumulate in the genes encoding them, duplications of these genes can compensate for the lower performance of each individual enzyme by literally providing more copies of lower-performance enzymes able to do a similar job (an analogy is two one-armed men doing the work of one two-armed man).
In molecular machines, as the individual protein components of the system degrade due to deleterious mutations, more and more additional proteins are needed to compensate for their lower degree of function, be that structural stability, effective docking spots for other proteins, etc.
That’s basically the gist of it: inherent mutational tendencies of high probability (deleterious mutations of small effect, together with gene duplications) work together as balancing forces that result in a sort of increasing functional bricolage. Genomic complexity increases (the constructive part), even as the overall measure of fitness remains more or less the same (the neutral part). Constructive neutral evolution.
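The balancing dynamic described above can be sketched as a toy simulation. This is my own deliberately minimal illustration, not a realistic population-genetics model and not anything from the literature: the mutation rate, effect sizes, and output threshold are all made-up parameters, chosen only to make the dosage-compensation idea visible in code.

```python
import random

random.seed(42)

def simulate(generations=2000, target_output=1.0):
    """Toy model of constructive neutral evolution via gene dosage.

    Each gene copy has an 'activity'. Deleterious mutations of small
    effect slowly erode activity; whenever total output drops below the
    target, a duplication of a random surviving copy is fixed to
    compensate. The fitness proxy (total output) stays roughly constant
    while the number of gene copies grows.
    """
    genes = [1.0]  # start with a single, fully functional gene
    for _ in range(generations):
        # deleterious point mutations of small effect hit every copy
        genes = [g * (1 - random.uniform(0.0, 0.002)) for g in genes]
        # gene-dosage compensation: fix duplications while output is too low
        while sum(genes) < target_output:
            genes.append(random.choice(genes))
    return genes

genes = simulate()
print(f"gene copies: {len(genes)}")                 # complexity went up
print(f"total output: {sum(genes):.2f}")            # fitness proxy stayed ~1.0
print(f"mean per-copy activity: {sum(genes) / len(genes):.2f}")
```

The point of the sketch is only that the two high-probability mutation types pull in opposite directions: per-copy performance declines while copy number rises, so the system ends up more complex at roughly the same overall output.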
This can even potentially result in the evolution of novel functions. It is possible to get new functions and more complexity through a process that adaptively “breaks” or “degrades” many more genes than it “creates” or “enhances”!
There’s nothing logically problematic about that. Like this (I hope this is comprehensible; tossed together fast in MS Paint):
Squares represent genes, colors and intensity represent functions and their degrees. Red rectangles highlight what is being duplicated and passed on.
This is “adaptive devolution”: increased complexity and new functions arising mostly by “degrading” and mostly “breaking” genes. Because these extra genes are costly to express, their death is adaptive, and so is their eventual deletion. But because the still-functional copies continue to accumulate deleterious mutations, as these are more frequent than beneficial ones, their duplication is also sometimes adaptive (more expressed genes compensate for each individual gene being weaker).
Eventually a previously dead gene locus, a black square (effectively having become non-coding DNA), evolves into a de novo protein-coding gene. So one new function evolves and is enhanced, while all the rest degrades and breaks. The net result is more complexity and more functions than there were to begin with. And it happened almost exclusively through neutral and adaptive degeneration.
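One small piece of the de-novo-gene step can also be illustrated in code: random non-coding DNA is surprisingly close to containing short open reading frames, so neutral point mutations can stumble onto one with no foresight at all. This is only a toy ORF-finder of my own (the sequence length, ORF-length cutoff, and mutation cap are made-up parameters, and real de novo gene birth additionally requires transcription, translation, and retention, which this sketch ignores):

```python
import random

random.seed(1)

START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

def has_orf(seq, min_codons=25):
    """True if seq contains an ATG followed, in frame, by at least
    min_codons codons before the first in-frame stop codon."""
    for i in range(len(seq) - 2):
        if seq[i:i + 3] == START:
            for j in range(i + 3, len(seq) - 2, 3):
                if seq[j:j + 3] in STOPS:
                    if (j - i) // 3 >= min_codons:
                        return True
                    break  # this ATG yields a short ORF; try the next one
    return False

# a "dead" locus: random non-coding DNA drifting under neutral point mutation
locus = [random.choice("ACGT") for _ in range(300)]
mutations = 0
while not has_orf("".join(locus)) and mutations < 100_000:
    mutations += 1
    locus[random.randrange(len(locus))] = random.choice("ACGT")

print(f"open reading frame present after {mutations} point mutations")
```

Since an ATG turns up roughly once per 64 positions in random sequence, proto-ORFs are cheap; the genuinely hard and interesting part is how such a reading frame comes to be expressed and retained.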
Thank you Rumraket, this is a very helpful explanation! This makes sense to me, and I think it is a strong counterargument against the ID conception of IC. I’m curious what (if any) responses Behe has made on this.
Is it generally true that the more complex an organism, the more parts make up its molecular machinery? For instance, do the ribosomes inside a starfish or a sponge have fewer components than the ribosomes in crayfish, insects, etc.?
The only part I’m not sure I follow, is how dead genes (now junk) can form entirely new functions. I’d love to see the details on how this is envisioned.
I don’t know specifically whether sponge or starfish ribosomes are more or less complex than those in crayfish or insects, but I think I remember a talk by Loren Williams in which he stated that ribosomes in eukaryotes are generally considerably more complex than in prokaryotes, and that they appear to be at least among the most complex in mammals, IIRC. I think it was in this talk: “RNA and Protein: A match made in the Hadean”, presented by Loren Williams.
From what I have read, there is a very similar gene count among multicellular animals, with most differing by less than 2-fold. The single-celled eukaryote Trichomonas vaginalis has 2 to 3 times more genes than humans.
It is worth remembering that all living species are the end product of 3.5 billion years of evolution. All species are equally evolved. It’s not as if less complex eukaryotes simply stopped evolving hundreds of millions of years ago.
Yeah, that makes sense. So you would not necessarily expect to see more components in the molecular machinery of a more advanced organism.
Of course, if ID theory is correct, that evolutionary history could be shorter for some phyla. I get the sense that we are still really just scratching the surface of how biology works, in developmental biology and epigenetics in particular.
I think an argument could be made for an expected difference in gene count between prokaryotes and more complex eukaryotes, but even then we would have to take selection for genome efficiency into account. Unless there is some selection pressure against increasing genome size and gene count, I don’t see why we would expect a correlation between physical complexity and genome complexity. Evolution is blind to the genetics of a phenotype, so if a complicated Rube Goldberg system works, then it will be selected for. As @Rumraket discusses so eloquently, there can even be arguments made for the evolution of complexity through neutral evolution.
For some reason I didn’t notice this last part of your post earlier. There was a thread devoted to a recent review article discussing what is currently known about mechanisms of so-called de novo gene evolution here:
It’s not that there aren’t aspects of the subject for which that’s true, but I’m wary that a two-word summary like that is bound to serve as a means for some to dismiss the entire subject as nothing but unsubstantiated speculation, so to those who would take it that way, I wouldn’t really agree with it. One has to say more than that to do the subject justice.
There are many genes for which we don’t know how they originated, and it’s hard to say which of the numerous known mechanisms by which novel protein-coding genes can evolve have historically predominated.
But we certainly know enough already to be able to say with substantial confidence that it is a real phenomenon, that novel protein coding genes can originate by characterized evolutionary mechanisms, and that this has contributed many, many novel proteins over the history of life on Earth.
That’s fair. I can see “highly theoretical” being a somewhat loaded phrase. Perhaps it would be more accurate to simply say there is still a fair amount of speculation going on.
To be clear, I’m not trying to dismiss anything, I care about the truth regardless of the implications. I just have not been particularly impressed with some of the sources provided. They seem far from certain and include a lot of vague qualifying language.