Sanford and Carter's Genetic Entropy Revisited

As always, @Giltil provides a case example of the type of mentality required to accept the anti-scientific nonsense put out by the likes of Sanford and Carter. In this regard, @Giltil, your contribution to this group is invaluable.

No it doesn’t; the dates are conflicted. And dating rocks of non-organic origin is about as good as dating a dog buried in 65-million-year-old rock when the dog only died 100 years ago. The proper method is to date the biological material itself if it is available. There are C14, amino acid racemization, collagen, and oxidation dating methods for dead biological material. But anyway, that’s off topic.

Argument by buzzphrase again. You won’t engage in real argument on any of those points. The dates are conflicted only if you look at the creationist literature exclusively. Dates of “non-organic origin” are just fine if there is such a thing as stratigraphy. Nobody is burying dogs in 65-million year old rock. How would that even work? You won’t say. Your Gish gallop is pointless.

2 Likes

Not so, radiometric dating is sound. Here are some surveys.

Radiometric Dating - A Christian Perspective

A Radiometric Dating Resource List

1 Like

Ok, I started a thread on one aspect of radiometric stuff here:

The current coronavirus, COVID-19, is making a mockery of Sanford’s linking of epidemic virulence to fitness. It is becoming apparent that much of the coronavirus’s success is due to contagion from asymptomatic individuals spreading the virus.

1 Like

Oh, you mean like the C14 Calibration curves using biological material from over a dozen independent dating proxies?

IntCal13 and Marine13 Radiocarbon Age Calibration Curves 0–50,000 Years cal BP

I’m sure honest Sal can explain how all those independent dating proxies agree so closely going back to 50K YBP, some 44,000 years before Sal’s young Earth was created. :slightly_smiling_face:

1 Like

From

https://globalgenes.org/2009/02/27/rare-disease-facts-and-figures/

  • Approximately 7,000 rare disorders are known to exist and new ones are discovered each year
  • Rare disease affects between 25-30 million people in the United States and approximately 30 million people in the European Union
  • One in 10 Americans is living with a rare disease
  • Children represent the vast majority of those afflicted with rare disease
  • Approximately 80 percent of rare diseases are not acquired; they are inherited. They are caused by mutations or defects in genes
  • In the United States, rare diseases are defined as those affecting 200,000 or fewer people or about 1 per 1,000
  • Rare disease is often referred to as an “orphan” disease

NOTE: rare disease variants may drift out or be selected out of the population; the problem, these statistics suggest, is that as some variants drift out, new bad mutations arise elsewhere in the genome. This is how mutations can accumulate even without any single bad mutation reaching fixation.
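The drift-out/drift-in dynamic can be sketched with a toy mutation-selection simulation. This is my own illustration with made-up parameters (`u` new deleterious mutations per birth, selection coefficient `s`), not a model taken from Sanford or from the rare-disease statistics above:

```python
import math
import random

rng = random.Random(42)

def poisson(lam):
    # Knuth's algorithm for Poisson-distributed mutation counts
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def mean_load(pop_size=200, u=0.2, s=0.1, generations=300):
    """Mean deleterious-mutation count per individual after a run.
    Each generation, parents are drawn with probability proportional
    to fitness (1 - s)**load, and every offspring gains Poisson(u)
    new deleterious mutations."""
    pop = [0] * pop_size
    for _ in range(generations):
        weights = [(1 - s) ** k for k in pop]
        parents = rng.choices(pop, weights=weights, k=pop_size)
        pop = [k + poisson(u) for k in parents]
    return sum(pop) / pop_size

# Individual variants continually drift in and out, yet with selection
# acting the mean load hovers near the mutation-selection balance u/s
# (here 2.0) instead of growing without bound.
print(mean_load())
```

Set `s` close to zero and the load instead grows by about `u` per generation, which is the accumulation regime being argued for here; whether real genomes sit in the balanced or the accumulating regime is exactly what the two sides of this thread dispute.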

This was well known as far back as the time of Nobel laureate Hermann Muller. Muller (of Muller’s ratchet fame) estimated the number of bad mutations per offspring the human genome could tolerate: on the order of about one bad mutation per offspring.

Since the majority of mutations are function-compromising (at best), and since it is estimated that there are about 100 mutations per human offspring, there are about 100 bad mutations per human. Under these conditions, each human female would have to bear buzzillions of kids just to have one that is free of bad mutations.

That’s why Graur argues that most of the human genome is junk: because the alternative is that evolution is wrong.

If ENCODE is right, to produce even one child without a bad mutation, a human female would need to have 10^35 babies. Graur said that would be bonkers! His alternative is that the human genome is mostly junk. Ironically, even using Graur’s own estimate for the functional genome (in the ballpark of 10-15%), a human female would still need to make 44,000 babies. That surely sounds more feasible, relatively speaking.
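The arithmetic behind the 10^35 and 44,000 figures can be checked with a simple Poisson assumption: if a child carries on average U deleterious mutations, the chance of a mutation-free child is e^-U, so about e^U births are needed per clean child. The function and the exact fractions below are my own reconstruction of the argument, not Graur’s published calculation:

```python
import math

MUTATIONS_PER_BIRTH = 100  # rough human per-generation mutation count

def births_per_clean_child(functional_fraction):
    """Expected births per mutation-free child, assuming every mutation
    that lands in functional DNA is deleterious and mutation counts are
    Poisson-distributed, so P(zero) = exp(-U) and we need 1 / P(zero)."""
    U = MUTATIONS_PER_BIRTH * functional_fraction
    return math.exp(U)

print(f"{births_per_clean_child(0.80):.1e}")   # ENCODE's 80%: roughly 10^35
print(f"{births_per_clean_child(0.107):.1e}")  # ~10.7% functional: roughly 44,000
```

The exponential dependence on U is the whole point: shrinking the functional fraction from 80% to ~10% shrinks the required births by thirty orders of magnitude.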

What complete Creationist horse hockey. The large majority of mutations are effectively neutral. No mutation can be deemed “bad” on its own. Mutations can only be deemed deleterious, beneficial, or neutral WRT their effect on the individual’s reproductive fitness.

I’ve seen some really ridiculous YEC arguments but “every woman needs to have 10^35 babies” set some sort of record. :joy:

1 Like

Isn’t the alternative that we would have become extinct a long time ago? Or, at least, that the population should be decreasing?

You are confused about the claims. First, even ENCODE people are no longer claiming that the original claim (80% or perhaps 100% functional DNA) is true. Second, it’s not just “bad mutations” but mutations summing to death of the average individual. It isn’t necessary for the average individual to have zero deleterious mutations in order for the population to maintain itself.
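One way to see why the average individual need not be mutation-free: even if every child carries roughly Poisson(100) mutations, the load varies from sibling to sibling by about the Poisson standard deviation (±10), so selection has relative differences to act on. A toy sketch with made-up numbers, my own illustration rather than anything from the thread:

```python
import math
import random

rng = random.Random(0)

def poisson(lam):
    # Knuth's algorithm; adequate for lam around 100
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

# Mutation counts for a brood of six siblings, each ~Poisson(100)
brood = sorted(poisson(100) for _ in range(6))
print(brood)
# Every sibling carries roughly 100 mutations, yet the least-loaded one
# typically carries on the order of ten or more fewer than the most
# loaded; selection acts on that relative spread, not on absolute
# mutation-free perfection.
```

This is why "100 mutations per birth" does not by itself imply the population must decline: what matters is whether selection on relative load removes mutations as fast as they arrive.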

Creationists were excited by the 80% figure, and have kept repeating it and citing it. When ENCODE, under very valid criticism, lowered its figure to 40%, and later to 10-15%, somehow the creationists didn’t notice this. They continue to cite the figure as 80%.

2 Likes

Joe, do you happen to have some good references that show this backtrack trend? I’m not familiar with the latest on this dropping number.

1 Like

Yes, I need to provide links, but have to rush away from my keyboard right now. In the meantime look up posts on this by Larry Moran at Sandwalk, by Dan Graur at Judge Starling (Tumblr), and a paper by Manolis Kellis. I will get back to you on this.

2 Likes

Here’s the Kellis paper where they first started walking the 80% functionality claim back:

If there were further walk-backs in the primary literature, I didn’t notice them.

2 Likes

False.

1 Like

Yes. And what is wrong with that?

You have to read carefully, because they do their best to soft-pedal the backtrack, but backtrack they do:

Although ENCODE has expended considerable effort to ensure the reproducibility of detecting biochemical activity (99), it is not at all simple to establish what fraction of the biochemically annotated genome should be regarded as functional. The dynamic range of biochemical signals differs by one or more orders of magnitude for many assays, and the significance of the differing levels is not yet clear, particularly for lower levels. For example, RNA transcripts of some kind can be detected from ∼75% of the genome, but a significant portion of these are of low abundance (Fig. 2 and Fig. S2). For polyadenylated RNA, where it is possible to estimate abundance levels, 70% of the documented coverage is below approximately one transcript per cell (100–103). The abundance of complex nonpolyadenylated RNAs and RNAs from subcellular fractions, which account for half of the total RNA coverage of the genome, is likely to be even lower, although their absolute quantification is not yet achieved. Some RNAs, such as lncRNAs, might be active at very low levels. Others might be expressed stochastically at higher levels in a small fraction of the cell population (104), have hitherto unappreciated architectural or regulatory functions, or simply be biological noise of various kinds. At present, we cannot distinguish which low-abundance transcripts are functional, especially for RNAs that lack the defining characteristics of known protein coding, structural, or regulatory RNAs. A priori, we should not expect the transcriptome to consist exclusively of functional RNAs. Zero tolerance for errant transcripts would come at high cost in the proofreading machinery needed to perfectly gate RNA polymerase and splicing activities, or to instantly eliminate spurious transcripts. In general, sequences encoding RNAs transcribed by noisy transcriptional machinery are expected to be less constrained, which is consistent with data shown here for very low abundance RNA (Fig. 3).
Similarly, a majority of the genome shows reproducible evidence of one or more chromatin marks, but some marks are in much lower abundance, are preferentially associated with nonconserved heterochromatin regions (e.g., H3K9me3; Fig. 3B), or are known to act at a distance by spreading (105). Indeed, for any given biochemical assay, the proportion of the genome covered is highly dependent on the signal threshold set for the analysis (Fig. 2 and Fig. S2). Regions with higher signals generally exhibit higher levels of evolutionary conservation (Fig. 3 and Fig. S3). Thus, one should have high confidence that the subset of the genome with large signals for RNA or chromatin signatures coupled with strong conservation is functional and will be supported by appropriate genetic tests. In contrast, the larger proportion of genome with reproducible but low biochemical signal strength and less evolutionary conservation is challenging to parse between specific functions and biological noise.

https://www.pnas.org/content/111/17/6131?fbclid=IwAR1SbglfrpCdAADACnV79IzekDLTg67CjuNH1Kftv0YyJJhs18zYSe8fxlw

3 Likes

From Sanford’s “Genetic Entropy”

This has to be one of the most outlandish statements of deluded grandeur I have encountered. He has to believe that there is a groundswell of scientists furtively studying his book while maintaining radio silence? It’s like the physics professors who get sent letters from self-professed geniuses refuting relativity. Tiresome. Straight to the round file. Biologists have work to do. Why would they waste their time on this?

I’m reminded of Glenn Close in Fatal Attraction: “I’m not gonna be ignored.” But this is really a message for the circuit: they have no reply because they are overwhelmed by the evidence and logic by which their evolutionary paradigm is crushed; the silence is a demonstration of my irrefutable insight.

If only there were a discussion board frequented by published PhD qualified molecular and computational biologists, population geneticists, and virologists willing to engage in discussions of origins and offer open and honest critical appraisals of relevant ideas.

Science can only be advanced by validated evidence, and that is exactly what is sorely lacking in Sanford’s volume.

hmmmmmm… might have happened. maybe once or so.

3 Likes

Michael Lynch cites Gerald Crabtree favorably.

This is from Gerald Crabtree at Howard Hughes:

https://www.cell.com/trends/genetics/pdf/S0168-9525(12)00159-X.pdf

Analysis of human mutation rates and the number of genes required for human intellectual and emotional fitness indicates that we are almost certainly losing these abilities. If so, how did we get them in the first place, and when did things begin to change?

Crabtree obviously believes this is a new trend: that we evolved from some primate ancestor, gaining intellectual ability, which he argues we have been losing over the last few thousand years.

Dr. Sanford and YLCs have a different view as to the details.

But, one thing I don’t see is anyone of any reputation claiming the human genome is getting better!

From another essay by Crabtree:

[PDF] Our Fragile Intellect | Semantic Scholar

I would be willing to wager that if an average citizen from Athens of 1000 BC were to appear suddenly among us, he or she would be among the brightest and most intellectually alive of our colleagues and companions. We would be surprised by our time-visitor’s memory, broad range of ideas and clear-sighted view of important issues. I would also guess that he or she would be among the most emotionally stable of our friends and colleagues. I do not mean to imply something special about this time in history or the location, but would also make this wager for the ancient inhabitants of Africa, Asia, India or the Americas of perhaps 2,000 to 6,000 years ago. I mean to say simply that we Homo sapiens may have changed as a species in the past several thousand years and will use 3000 years to emphasize the potential rapidity of change and to provide a basis for calculations, although dates between 2,000 and 6,000 years ago might suffice equally well.

Sometime back, Dr. Sanford assigned me the task of summarizing developments at ENCODE and the NIH. A neglected part of the discussion is the follow-on and related projects such as 4D Nucleome and the Origami Code.

The importance of these projects is perhaps easier to visualize than to explain solely in words. Notice the upper right hand part of the following diagram:

[image: spatial1, a 3D chromosome-contact diagram]

Special segments of Chromosome X, Chromosome 2, Chromosome 15, and Chromosome 17 are brought into 3D proximity so they can be co-regulated. This involves proteins attaching to regions of the DNA to make it possible. That means the DNA serves as both an addressing scheme and a scaffold for these molecular machines.

ENCODE advertised a lot about transcription, but ENCODE’s follow-on, 4D Nucleome, emphasizes the structural and mechanical roles DNA serves.

The configurations of these little regulation factories change from cell type to cell type. Failure of these factories to form correctly is implicated in disease.

John Rinn also determined that lncRNAs are important in forming these cell-type-specific factories.

And this might only be the tip of the iceberg of DNA functionality.

I researched Alu functionality, and Chris Rupe and John Sanford used some of my research material in their book. This overturned Ayala’s claim of bad design, btw: