Continuing the discussion from Explaining the shape of a typical COVID 19 epidemic curve:
An important technical question came up about Sanford’s work that I would like to resolve. I’m sending him an email pointing here.
Does Genetic Information Loss Explain COVID-19 Death Curves?
We do have alternative explanations that account for this curve and much more without appealing to genetic entropy. But we also doubt that genetic entropy is attenuating SARS-CoV-2.
It’s not my work, it’s Sanford’s.
You will find answers to most of your questions in the Materials and Methods section of his article below:
https://www.worldscientific.com/doi/pdf/10.1142/9789814508728_0015
The Paper In Question
Information Loss: Potential for Accelerating Natural Genetic Attenuation of RNA Viruses
by Wesley H. Brewer, Franzine D. Smith, and John C. Sanford
Loss of information is not always bad. In this paper, we investigate the potential for accelerating the genetic degeneration of RNA viruses as a means for slowing/containing pandemics. It has previously been shown that RNA viruses are vulnerable to lethal mutagenesis (the concept of inducing mutational degeneration in a given pathogen). This has led to the use of lethal mutagenesis as a clinical treatment for eradicating RNA virus from a given infected patient. The present study uses numerical simulation to explore the concept of accelerated mutagenesis as a way to enhance natural genetic attenuation of RNA viral strains at the epidemiological level. This concept is potentially relevant to improved management of pandemics, and may be applicable in certain instances where eradication of certain diseases is sought.
We propose that mutation accumulation is a major factor in the natural attenuation of pathogenic strains of RNA viruses, and that this may contribute to the disappearance of old pathogenic strains and natural cessation of pandemics. We use a numerical simulation program, Mendel’s Accountant, to support this model and determine the primary factors that can enhance such degeneration. Our experiments suggest that natural genetic attenuation can be greatly enhanced by implementing three practices. (1) Strategic use of antiviral pharmaceuticals that increase RNA mutagenesis. (2) Improved hygiene to reduce inoculum levels and hence increase genetic bottlenecking. (3) Strategic use of broad-spectrum vaccines that induce partial immunity. In combination, these three practices should profoundly accelerate loss of biological information (attenuation) in RNA viruses.
https://www.worldscientific.com/doi/10.1142/9789814508728_0015
And @glipsnort’s analysis with emphasis added:
Let’s look at Sanford’s numbers. A key bit of his paper is this:
“We model 10% of all mutations as being perfectly neutral, with the remainder of mutations being 99% deleterious and 1% beneficial [35] […] We use the well-accepted Weibull distribution for mutation effects (a natural, exponential-type distribution [26]). In this type of distribution, low-impact mutations are much more abundant than high-impact mutations.”
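For intuition about the quoted claim that "low-impact mutations are much more abundant than high-impact mutations" under an exponential-type Weibull distribution, here is a minimal sketch. The shape and scale parameters below are illustrative assumptions for demonstration only, not the values used in Sanford's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters, chosen only to illustrate the qualitative shape.
k = 0.5        # shape < 1 skews the distribution heavily toward small effects
scale = 0.01   # rough characteristic effect size (assumption)

effects = scale * rng.weibull(k, size=100_000)

# With these parameters, mutations of tiny effect vastly outnumber large ones.
low_impact = np.mean(effects < 0.001)   # fraction with effect below 0.1%
high_impact = np.mean(effects > 0.10)   # fraction with effect above 10%
print(f"low-impact (<0.1%): {low_impact:.1%}, high-impact (>10%): {high_impact:.1%}")
```

The relevant point for @glipsnort's critique is that a distribution this skewed puts most deleterious mutations below the threshold where selection purges them efficiently.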
Reference 35 is to this paper. In that paper, the authors report on 91 synthetic mutants of an RNA virus. They found 24 produced no virus and can be presumed lethal. If we ignore those, 31 (46%) had no statistically significant effect on fitness, 32 (48%) were deleterious, and 4 (6%) were beneficial. If we assume (without justification) that all mutants with lower measured fitness were deleterious, even if the value was not statistically significant, we have 76% deleterious and 18% neutral. They also found that the distribution of fitness effects of detectably non-lethal deleterious mutations had a longer tail – more highly deleterious mutations – than could be fit well with an exponential-type distribution. Mean beneficial effect was 1%; mean/median deleterious effect (for non-lethals) was -13.9%/-9.2%.
So let’s compare the Sanford model with the empirical results from the source they cite. Fraction neutral: 10% (model) vs 18-34% (empirical); fraction beneficial: 0.9% (model) vs 4-6% (empirical); fraction deleterious: 89% (model) vs 46-76% (empirical); size of beneficial effect: maximum of 1% (model) vs mean of 1% (empirical). Sanford’s paper doesn’t give the mean fitness effect of their deleterious mutations, but does report that 10% had an effect > 10%. The empirical median value for deleterious mutations (assuming all of the non-significant ones were actually deleterious) is just under 10% (9.2%), which means that Sanford’s distribution is weighted far more toward mutations of small effect than the empirical values, i.e. weighted toward mutations that can accumulate rather than be purged quickly by selection.
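To make the arithmetic above easy to verify, here is a small sketch recomputing the empirical fractions from the counts quoted out of reference 35. The figure of 19 non-significant mutants reclassified as deleterious is an inference from the quoted 76%/18% split, not a number stated explicitly:

```python
# Counts quoted from reference 35: 91 synthetic mutants, 24 lethal.
total, lethal = 91, 24
neutral, deleterious, beneficial = 31, 32, 4  # among the viable mutants
viable = total - lethal  # 67

def pct(n):
    return round(100 * n / viable)

print(f"neutral {pct(neutral)}%, deleterious {pct(deleterious)}%, "
      f"beneficial {pct(beneficial)}%")
# → neutral 46%, deleterious 48%, beneficial 6%

# If every non-significant mutant with lower measured fitness is counted as
# deleterious, 19 of the 31 move columns (inferred from the 76%/18% figures):
reclassified = 19
print(f"deleterious {pct(deleterious + reclassified)}%, "
      f"neutral {pct(neutral - reclassified)}%")
# → deleterious 76%, neutral 18%
```

Either way the counts are sliced, the empirical neutral and beneficial fractions come out well above the 10% and 0.9% assumed in the model.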
In summary, Sanford and colleagues took (and cited) a study of empirical values for the very thing they’re modeling, and then discarded every single one of the values and replaced it with one that suited their thesis. I’m not going to bother looking at the rest of their model, and I think I’ll skip characterizing the quality of this effort so I don’t get banned from PS.
I will be checking for myself whether this is a good assessment, and I will also ask Dr. Sanford for his thoughts. I have three specific questions for him:
1. Responding to @glipsnort: how does he reconcile his choice of parameters in this study with reference 35?

2. Does he believe that SARS-CoV-2 is being attenuated by the information-loss mechanism he explains in this paper? What evidence can he point to that shows this is the case?

3. How did this new virus come to be? Why is it so effective if new functions cannot evolve? If evolution can only degrade functions, and it does so rapidly, shouldn't this virus have gone extinct thousands of years ago?