Retrotranspositions would surely be related by common descent. And this doesn’t explain why you think the non-conserved parts fall outside the hierarchy. As others have explained, there is simply a limit on how far back they can be traced.
The nested hierarchy is a pattern in both the changed and unchanged parts. If nothing was ever conserved at all there would be no nested hierarchy, because then there’d be nothing to indicate that a thing has any sort of affinity to one thing over another (no objectively measurable basis for forming groups). There have to be both similarities and differences for a nested hierarchy to be possible, and some things have to be more similar to each other than they are to other things.
I think the way you say this is distorting the meaning of “conserved” and may further confuse @Giltil, if that’s possible. To me it means “evolving slower than the neutral rate due to purifying selection at some sites”. It doesn’t mean “unchanged” or even “largely unchanged”. Neutrally evolving, non-conserved sequences still show that nested hierarchy for branch points many millions of years in the past. Nor is it the degree of similarity that makes the nested hierarchy; it’s a particular pattern of similarities and differences.
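To make that concrete, here is a toy sketch of my own (Python, with a made-up tree and branch lengths, not data from any real genome): a sequence evolves purely neutrally down a known tree, with nothing conserved at all, and simple pairwise difference counts still recover the nested grouping.

```python
# Toy illustration (mine, not from the thread): evolve a purely neutral
# sequence down the tree ((A,B),(C,D)) and check that pairwise differences
# alone still recover the nested grouping.
import random

random.seed(1)
BASES = "ACGT"

def mutate(seq, n_subs):
    """Apply n_subs random substitutions; no selection, every site free to change."""
    seq = list(seq)
    for _ in range(n_subs):
        i = random.randrange(len(seq))
        seq[i] = random.choice([b for b in BASES if b != seq[i]])
    return "".join(seq)

L = 5000
root = "".join(random.choice(BASES) for _ in range(L))

# Branch lengths in numbers of substitution events; the inner branches create the nesting.
anc_AB = mutate(root, 200)   # ancestor of A and B
anc_CD = mutate(root, 200)   # ancestor of C and D
tips = {
    "A": mutate(anc_AB, 100), "B": mutate(anc_AB, 100),
    "C": mutate(anc_CD, 100), "D": mutate(anc_CD, 100),
}

def diff(x, y):
    return sum(a != b for a, b in zip(x, y))

names = sorted(tips)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        print(x, y, diff(tips[x], tips[y]))
# A-B and C-D come out most similar, recovering ((A,B),(C,D)) even though
# nothing was conserved: every site was free to change the whole time.
```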
Take your intron example. If we align homologous introns from, say, mouse and human, wouldn’t it be possible to determine whether or not they evolved at the neutral rate, assuming they had a common ancestor x years ago?
Yes, in my response I used “conserved” to mean the part that is completely unchanged. So for a DNA sequence that has a mutation in it, the mutation would be a difference and the rest of it would be “conserved”. But a DNA sequence with mutations can still be considered conserved when we are speaking in terms of accumulating mutations at a rate below the neutral rate.
Sure. I can’t seem to stop speaking in terms of degree of similarity, even though I know that A being more similar to B (in terms of percent identity for a string of DNA, say) than it is to C doesn’t mean A is necessarily more closely related to B than to C. The total probability of the changes between A and B could be lower than the probability of the changes between A and C, even though there might be more changes in total between A and C than between A and B. So a phylogeny derived on the basis of the probability of changes between sequences A, B, and C might prefer the relationship ((A,C),B) over ((A,B),C).
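For what it’s worth, here is a minimal sketch of what I mean by working with the probability of changes rather than raw percent identity. It is only the pairwise Jukes-Cantor correction, not the full tree likelihood that would actually decide between ((A,C),B) and ((A,B),C), and the numbers are purely illustrative.

```python
# Minimal sketch (mine, not from the thread) of treating differences
# probabilistically rather than as raw percent identity: the Jukes-Cantor
# correction estimates the expected number of substitutions per site from
# the observed fraction of differing sites, accounting for multiple hits.
import math

def jc69_distance(p_observed):
    """Expected substitutions per site given the observed proportion of differing sites."""
    return -0.75 * math.log(1.0 - (4.0 / 3.0) * p_observed)

for p in (0.05, 0.20, 0.40):
    print(f"observed difference {p:.2f} -> estimated substitutions/site {jc69_distance(p):.3f}")
# The larger the observed difference, the more raw identity understates the
# real amount of change. Full phylogenetic methods extend this idea: they
# score competing trees by the probability of the observed changes under a
# substitution model, not by raw similarity.
```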
I suppose there just isn’t a good term to use for that. We could say “relatedness”, but if people don’t know what we mean by that either, it doesn’t clarify anything. I guess we just have to do the hard work of detailing all this stuff every time.
Or, to make it clear that you are talking about a particular concept, you could make up a new word. I propose “exsquisementary” for this concept (which I don’t entirely understand) because I stayed up too late, and am in a silly mood.
I realize this adds little to the discussion. Forgive me.
Yes, it is easy to imagine that a proportion of functional sequences is important for the proper choreography of events (the spatiotemporal ordering of events) required for building complex organisms, a bit like dominoes falling.
Allow me to interrupt you right there. What is easy or hard to imagine is highly subjective and hence utterly uninformative. While imagination has its place in strictly social contexts, where intuition and instinct are the only things that need satisfying, when it comes to investigating how nature actually works it is of no consequence what anybody considers easy to imagine.
To be clear, there is no inherent problem with intuiting answers to scientific questions. There is no inherent problem with constructing toy models that four-year-olds can digest in an effort to build an understanding, as is often done when introducing a subject to students. The problem is with mistaking the map for the territory, with mistaking the model for the thing itself. Feeling warm and fuzzy for having managed to imagine something is not the last step, if it is a step at all, but the first. The next step is deducing non-trivial consequences of the picture one has constructed, i.e. novel predictions that the data, or its analysis so far, has not already shown. The step after that is testing whether they bear out, ready to reject the model in part or in full should the data routinely fall outside of the predicted error margins.
In this conversation, I asked you the following question: What proportion of the genome is under neutral evolution, and what evidence do we have that these genomic regions are really under neutral evolution?
To which you answered as follows: About 90%, and the evidence is lack of conservation.
To which I answered as follows: It seems to me that you are necessarily assuming common descent for these ~90% that lack conservation. So the question now is what evidence do we have for this assumption?
Your answer: Well, of course the evidence is overwhelming and has been provided to you many times before. The greatest part comes from the nested hierarchical structure of the data.
My answer: But isn’t it the case that the nested hierarchical structure of the data mostly concerns the conserved parts, not the other parts?
Your answer: No, it is not. The non-conserved parts retain their hierarchical structure for many millions of years.
My answer: Ok. But do we know what proportion of the non-conserved parts still exhibiting a hierarchical structure evolved at a truly neutral rate?
Your answer: We can’t know, but it’s the way to bet. This is you again demanding certainty and being unable to comprehend any sort of probabilistic claims.
So here is my question: if we can’t know what proportion of the non-conserved parts that still exhibit a hierarchical structure evolved at a truly neutral rate, how can you affirm with such confidence that 90% of the human genome is junk?
It depends on what you mean by “know”, I guess, but the evidence for it evolving at a neutral rate is the degree of lack of conservation. The rate at which it accumulates mutations over geological time appears consistent with mutation rates observed directly, before selection has had a chance to remove anything.
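As a rough back-of-the-envelope illustration of that consistency check (the numbers below are order-of-magnitude assumptions of mine, not measurements from this thread): the expected neutral divergence between two species is roughly twice the per-site, per-year mutation rate times the time since their split.

```python
# Back-of-the-envelope check (illustrative numbers only): a neutrally evolving
# sequence should diverge between two species at roughly
#   2 * (mutation rate per site per year) * (time since the split),
# since changes accumulate independently on both lineages.
mu_per_site_per_year = 1.0e-9   # assumed long-term rate, order of magnitude only
split_time_years = 6.5e6        # assumed human-chimp split, roughly

expected_neutral_divergence = 2 * mu_per_site_per_year * split_time_years
print(f"expected neutral divergence: {expected_neutral_divergence:.3%}")
# About 1.3% per site. If an alignment of putatively non-functional DNA shows
# divergence near this value, it is changing at about the rate mutation alone
# supplies; strong conservation would show up as much less divergence.
```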
There are other forms of evidence for it being junk, such as the nature of the sequences that make it up. It’s mostly things that appear to be broken versions of what were once functional “selfish” genetic elements: dead copies of GAG, POL, and ENV, degrading LTRs, lots of tandem copies of broken pseudogenes, highly repetitive DNA (which has this tendency to get duplicated or deleted) in various stages of disrepair, and so on.
These are the kinds of things that have an intrinsic capacity to get copied and expanded, or deleted, or silenced (which much of it is), without requiring that any of it actually carries out any important biological function. It also varies massively in size even between closely related and ostensibly similarly complex species.
This totality of evidence is best explained by it being junk. You really should read Larry Moran’s book.