Sanford and Carter's Genetic Entropy Revisited

RNAs may be used in ways we don’t yet appreciate.

As if the epigenome weren't tough enough to understand, the NIH has on and off considered a 200-million-dollar research program on the EPITRANSCRIPTOME.

https://dpcpsi.nih.gov/sites/default/files/council%20jan%2030%202015%20Pres%20E4.pdf

This is obviously relevant to ENCODE's claims of large-scale transcription, since it would suggest uses for the RNAs that are transcribed.

But more on that in the next comment.

Transcribed RNAs may facilitate a kind of neural-network cellular computation:

https://www.sciencedirect.com/science/article/abs/pii/S0022519318304466?via%3Dihub

The evolution of the genome has led to very sophisticated and complex regulation. Because of the abundance of non-coding RNA (ncRNA) in the cell, different species will promiscuously associate with each other, suggesting collective dynamics similar to artificial neural networks. A simple mechanism is proposed allowing ncRNA to perform computations equivalent to neural network algorithms such as Boltzmann machines and the Hopfield model. The quantities analogous to the neural couplings are the equilibrium constants between different RNA species. The relatively rapid equilibration of RNA binding and unbinding is regulated by a slower process that degrades and creates new RNA. The model requires that the creation rate for each species be an increasing function of the ratio of total to unbound RNA. Similar mechanisms have already been found to exist experimentally for ncRNA regulation. With the overall concentration of RNA regulated, equilibrium constants can be chosen to store many different patterns, or many different input–output relations.

Non-coding RNA could be performing computations in ways similar to AI systems.

• Each RNA species is analogous to a neuron, interactions to connection strengths.

• Instead of genes controlled by on-off switches, they may be regulated collectively.

• Like neural nets, the computation is fault-tolerant, allowing for high mutation rates.

• This challenges the claim that such RNA is junk simply because it mutates rapidly.
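To make this picture concrete, here is a toy Hopfield-network sketch in Python. This is my own illustration, not code from the paper; the mapping of neurons to ncRNA species and of couplings to equilibrium constants is the paper's hypothesis, not something demonstrated here.

```python
import numpy as np

# Toy Hopfield network illustrating the analogy: each "neuron" stands in
# for the activity level of one ncRNA species, and the coupling matrix J
# stands in for the (log) equilibrium constants between species.

rng = np.random.default_rng(0)

def store_patterns(patterns):
    """Hebbian rule: J_ij is proportional to the sum over patterns of x_i * x_j."""
    n = patterns.shape[1]
    J = patterns.T @ patterns / n
    np.fill_diagonal(J, 0.0)          # no self-coupling
    return J

def recall(J, state, steps=20):
    """Synchronous updates until the state stops changing."""
    for _ in range(steps):
        new_state = np.sign(J @ state)
        new_state[new_state == 0] = 1
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

# Store three random +/-1 "expression patterns" over 100 RNA species.
n_species, n_patterns = 100, 3
patterns = rng.choice([-1, 1], size=(n_patterns, n_species))
J = store_patterns(patterns)

# Corrupt 15% of one pattern ("mutations"/noise) and try to recover it.
noisy = patterns[0].copy()
flip = rng.choice(n_species, size=15, replace=False)
noisy[flip] *= -1

recovered = recall(J, noisy)
print("fraction of species restored:", np.mean(recovered == patterns[0]))
```

The fault-tolerance claim in the bullets corresponds to the fact that such a network usually restores a stored pattern even when a sizeable fraction of its elements have been corrupted.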

If this computation is real, it would also be further enhanced by the epitranscriptomic changes described in the previous comment.

If this is true, it would strengthen the case that ENCODE is right, and evolution is wrong.

@Faizal_Ali: … and here’s Larry Moran’s blunter assessment of what Kellis et al. are really saying:

Dan Graur's assessment is blunter but less detailed. I will not link to it here to avoid exposing myself to legal jeopardy.


Sure, but is there any reason to think this is one such example?

Just need ‘machine learning’ and ‘quantum biology’ to fill out my jargon Bingo card…


References, please. I cannot recollect, off the top of my head, anyone claiming that RNA is junk because it mutates rapidly.

Hi, that was the claim of the paper.

The problem of slightly deleterious mutations becomes moot under conditions exceeding the Muller limit of about one bad mutation per person per generation.

Nachman, Crowell, Eyre-Walker, Keightley, and Kimura refer to this very simple formula based on the Poisson distribution:

N = \Large e^{U}

where
N = average number of offspring per person needed so that at least one offspring is expected to be free of new mutations

U = average number of new bad mutations per offspring per generation
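
In brief, the derivation runs as follows: if the number of new deleterious mutations per offspring is Poisson-distributed with mean U, the probability that an offspring carries none of them is

P(0) = \frac{U^{0} e^{-U}}{0!} = e^{-U}

so a parent needs on average N offspring with N \cdot e^{-U} \ge 1, which gives N = e^{U}.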

Hence, Graur’s figure on the order of 10^{35} children per female can be approximately arrived at in this way:

Graur said:

https://arxiv.org/ftp/arxiv/papers/1601/1601.06047.pdf

If 80% of the genome is functional, as trumpeted by ENCODE Project Consortium (2012), then 45-82 deleterious mutations arise per generation. For the human population to maintain its current population size under these conditions, each of us should have on average 3 × 10^{19} to 5 × 10^{35} (30,000,000,000,000,000,000 to 500,000,000,000,000,000,000,000,000,000,000,000) children. This is clearly bonkers.

The figure he gives is slightly different from the one obtained using N = \Large e^{U}.

There is a factor of 2 that is sometimes applied to N to go from the average number of offspring per parent to the number of offspring per couple, and the number of offspring per couple is also the number of offspring per female. The literature is not always clear about this.

Since Graur uses the word “bonkers”, I call N = \Large e^{U} the Bonkers formula.

But if we use even a ballpark of Graur's preferred figure that the genome is around 10% functional, then the Bonkers formula with U = 10 gives N \approx 22,000 and 2N \approx 44,000 per female, which is still bonkers, only less so than 10^{35}.
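
As a quick sanity check on these numbers, here is a minimal sketch in Python using only the formula above (the 45 and 82 values are the ends of Graur's range for the 80%-functional case):

```python
import math

# Bonkers formula: N = e^U, with U = average number of new deleterious
# mutations per offspring per generation.
def bonkers(U):
    return math.exp(U)

# ENCODE's "80% functional" scenario, Graur's range of U = 45 to 82:
print(f"{bonkers(45):.1e}")    # ~3.5e+19, in line with Graur's 3 x 10^19
print(f"{bonkers(82):.1e}")    # ~4.1e+35, in line with Graur's 5 x 10^35

# The ~10% functional scenario, U = 10:
N = bonkers(10)
print(round(N), round(2 * N))  # roughly 22,000 per parent, 44,000 per female
```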

I show how the Bonkers formula can be derived from the Poisson distribution here:

Joe Felsenstein stated the issue over ENCODE in different terms in Theoretical Evolutionary Genetics, pages 161-162:

The mutational load calculation continues to be relevant to understanding whether most eukaryotic DNA has any function that is visible to natural selection. Recent announcements (Encode Project Consortium, 2012) that 80% of human DNA is "functional", based on finding some transcription or binding of transcription factors in it, are very misleading. Junk DNA is still junk DNA, however often its demise has been announced.

Dan Graur was more blunt:

If ENCODE is right, evolution is wrong

But ENCODE wasn’t right, was it? You should stop trotting out that quote.


No it wasn’t. Did you read the paper?


It could be, but it could also not be.

• Each RNA species is analogous to a neuron, interactions to connection strengths.

A mere assertion.

• Instead of genes controlled by on-off switches, they may be regulated collectively.

They may be, but they may also not be.

• Like neural nets, the computation is fault-tolerant, allowing for high mutation rates.

Another assertion. It’s of course also ridiculous to imply that an RNA can remain functional regardless of sequence, which practically ignores all of physical chemistry.

Heck, it even contradicts the claim creationists make that functions are rare in sequence space. Yet here you seem to be saying ncRNA can remain functional basically no matter what sequence it has.

• This challenges the claim that such RNA is junk simply because it mutates rapidly.

No, a list of assertions and maybes doesn't constitute a challenge in any meaningful sense of the word. For that you need real evidence of function, not imaginary scenarios and maybes and could-bes. Your whole post is basically one of those "just so" stories creationists like to rail against all the time.

If this computation is real, it would also be further enhanced by the epitranscriptomic changes described in the previous comment.

Another assertion.

If this is true, it would strengthen the case that ENCODE is right, and evolution is wrong.

If it is true. But there’s no evidence that it is, and lots of evidence ENCODE is wrong.

Hmm…

The neural network cellular computation can be used to implement a machine learning approach that will fully develop the field of quantum biology, all while unifying gravitation with Fisher’s Fundamental Theorem.

Can I get a BINGO!?


@Art: No it wasn't. Did you read the paper?

It was from the highlights section of the link. If that was by the editor or publisher, and not the author, then the issue is on them, not me.

I think considerations from the 4D Nucleome and the E4 Epitranscriptome, among other reasons, suggest ENCODE is right, but even granting they're only half right, the Bonkers formula is still bad for evolution with a half-right ENCODE.

No, the fact that the human population is not rapidly decreasing shows, whatever the percentage of junk DNA, that the formula’s assumptions are wrong. Have you thought for one second about what we would observe if its implications were true?

Those have been considered and there’s no evidence in them that ENCODE is right about their initial claim, which they’ve also walked back since then.

The results obtained by ENCODE can be reliably reproduced with random and nonfunctional DNA. It too will consistently recruit the transcription-initiation complex and produce RNA transcripts as if it contained functional genes. Yet it does not.


Technology and agriculture overcompensate pretty well for genetic deterioration. But I cited Crabtree, whom even Michael Lynch cites favorably, and then I cited these statistics:

  • Approximately 7,000 rare disorders are known to exist and new ones are discovered each year
  • Rare disease affects between 25-30 million people in the United States and approximately 30 million people in the European Union
  • One in 10 Americans is living with a rare disease
  • Children represent the vast majority of those afflicted with rare disease
  • Approximately 80 percent of rare diseases are not acquired; they are inherited. They are caused by mutations or defects in genes
  • In the United States, rare diseases are defined as those affecting 200,000 or fewer people or about 1 per 1,000

But those don't even include the slightly function-compromising mutations, like, say, those that could be happening in repetitive regions such as D4Z4, which has a margin of safety/redundancy of about 10, where it doesn't matter too much until it's too late and someone gets muscular dystrophy.

But do you want to go on the record stating that you think something as complex as a human could be coded by 80 megabytes? That's about what we're talking about if one invokes the 10% figure of functionality, even less if one invokes 2%. My cell phone has 100 to 1000 times more memory than that!
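
For the back-of-the-envelope arithmetic behind that 80-megabyte figure (assuming a haploid genome of roughly 3.2 billion base pairs at 2 bits per base):

0.10 \times 3.2 \times 10^{9} \text{ bp} \times 2 \text{ bits/bp} = 6.4 \times 10^{8} \text{ bits} = 8 \times 10^{7} \text{ bytes} \approx 80 \text{ MB}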

There you are confusing evidence for the mere existence of deleterious mutations in the human population with evidence for a RAPID DECREASE in population fitness that means we are about to go extinct "Soon™".


Here is where we point out that we have sequenced genomes from Denisovans, Neanderthals, and archaic Homo sapiens going back several hundred thousand years. That's well before Sal's young Earth was even created, yet we see no evidence of "genomic degradation".

Sal will ignore the evidence, as he always does. 🙂
