Paul Giem: Isochron Dating Rocks and Magma Mixing

It is worth noting that the Mormon church funded research into Native American genetic ancestry in order to test their beliefs. That certainly wasn’t cheap. It also didn’t pan out the way they had hoped.


Who would have taught you this? No paleontologist would ever have said that. The majority of vertebrate fossils contain permineralized original bone.

Are you a young-earth creationist? There seems to be a bit of Gish gallop in your methods. I’m just picking out a bit that is closer to my area of expertise.


None of that is “evidence”, only opinion and uncertainty. You are demonstrating that @T_aquaticus is correct - creationists are asserting that tissues cannot survive that long “without a shred of evidence”.


That’s not quite what I meant. If the variance were infinite it wouldn’t be a useful measurement in any circumstance. If all the high (low) measurements were systematically thrown out that could be a problem, but then it should also be easy to detect with independent sampling.

I’m not tempted, that is exactly what I am saying; bias is limited. Large systematic errors are easy to detect, and total variance is (in exponential data) proportional to the mean.

If blinded results and all data published are the standard you want, then use that standard and others will follow. My (limited) understanding is that most geologists already send samples to independent labs for testing. I assume relevant expertise is applied to determine the suitability of samples. I can’t comment on that expertise but it (edit: the application of relevant expertise) would be typical of biomedical studies.

I don’t wish to sidetrack the discussion, but I can think of an example in the Creation Science literature where an author has been less than transparent about the data (How to Criticize with Kindness: Philosopher Daniel Dennett on the Four Steps to Arguing Intelligently – The Marginalian). What I mean is, there is room for improvement here, too.


That may be so, but the short chronologists need to stick to the same standards of factual accuracy and technical rigour as the long chronologists if they are to be taken seriously.

When they are making a lot of noise about levels of carbon-14 in fossils that are so low as to be indistinguishable from contamination, and then dismissing contamination as a “rescuing device,” it simply can not be said that they are doing so. To dismiss contamination as a “rescuing device” is to demand to be held to lower standards than everybody else. In fact, in many areas of science, if you dismissed contamination as a “rescuing device,” you would kill people.

I don’t think so.

The “creationists only do sloppy science because they are underfunded” may have sounded persuasive back in the 1970s and 1980s, but it lost every last shred of credibility with the RATE project. RATE was the most expensive, extensive and comprehensive research project ever conducted by YEC creation scientists. It ran for eight years with a budget of $1.25 million, and as such it was a chance for them to showcase what they could do with some serious money. If they really had a case for a young Earth, they should have come up with at least something compelling – or, at the very least, some highly promising leads. What did they give us in the end?

  • An admission that there really had been billions of years’ worth of nuclear decay at current rates since creation (something that they had always denied up to that point)
  • A science fiction scenario involving billion-fold accelerated nuclear decay during Noah’s Flood
  • An admission that if said accelerated nuclear decay had actually happened, it would have released enough heat to vaporise the Earth’s crust many times over with temperatures reaching in excess of 22,000°C
  • As evidence for said scenario, a set of tiny, ambiguous and outright fudged samples with huge error bars, and levels of radiocarbon indistinguishable from contamination.

All in all, that doesn’t come anywhere remotely close to $1.25 million worth of compelling scientific results.

And what’s happened since then? Answers in Genesis has spent more than $100 million building the Ark Encounter.

There are reasons why YECs can’t get high quality measurements to fit their timescale, but money isn’t one of them.

That may be true, but it was a belief that had never been put to the test. Nobody had ever done any measurements to demonstrate conclusively that that was the case. Without such measurements (including full error bars), you cannot claim that collagen and other proteins could not last that long.


Thanks for all the comments. They can help to make the issues clearer.

I will split up my answer into several parts. First, I will note that no-one seems to be defending isochron dating by insisting that it is “self-checking”. The existence of a precise mimic of isochrons in mixing lines seems to have put that issue to rest. Rather, the defense seems to be centered on the agreement between dates from the various methods (see my next comment), and there are also related comments that are not directly applicable to isochron dating itself, some of which I will address in other comments.

Dan_Eastwood, you say (C44),

That’s not quite what I meant. If the variance were infinite it wouldn’t be a useful measurement in any circumstance. If all the high (low) measurements were systematically thrown out that could be a problem, but then it should also be easy to detect with independent sampling.

That would be true only if the “independent” samplers don’t adhere to the same rules for throwing out “bad” samples.

Let me give you an example illustrating why I have trouble with limiting bias by using the variance: Suppose a very good archer, Alice, is shooting arrows out of a cave, across a field, and into a target. She shoots three arrows in rapid succession, without correcting for where they land. Unbeknownst to her, there is a crosswind in the field outside of her cave, so her arrows land 1.26, 1.24, and 1.25 m to the left of the target. This makes a very small spread of 2 cm in the arrows, but the precision does not assure that we have accurately located the target. Only if we know the relationship of the arrows to the target, can we correct for the bias caused by the wind.

If we then try to correct for bias by sending Bob, a not-quite-so-good archer, to shoot at the target, and he is unaware of the wind, then his shots might land 1.27, 1.27, and 1.21 m to the left of the target, as the wind is biasing him just as much. It would be naive to claim that we have located the target at a spot 1.25 m +/- 0.93 cm to the left of where it really is. That is why I don’t see how we can correct for bias without knowing what the bias is. That seems to require an unbiased measurement.
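To make the arithmetic concrete, here is a minimal Python sketch (using the hypothetical arrow offsets from the example above) showing how pooling the two archers’ shots yields a very tight quoted uncertainty even though every shot carries the same large bias:

```python
import statistics

# Hypothetical arrow offsets (metres to the left of the target) from the example above.
alice = [1.26, 1.24, 1.25]
bob = [1.27, 1.27, 1.21]

shots = alice + bob
mean_offset = statistics.mean(shots)      # ~1.25 m
sd = statistics.stdev(shots)              # ~0.023 m
sem = sd / len(shots) ** 0.5              # ~0.0093 m, i.e. ~0.93 cm

print(f"Reported target position: {mean_offset:.2f} m +/- {sem * 100:.2f} cm (standard error)")
print("The quoted uncertainty reflects only the scatter of the shots, not the shared 1.25 m bias.")
```

The small standard error says nothing about the common offset caused by the wind, which is the point of the example.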

You say (C44),

If blinded results and all data published are the standard you want, then use that standard and others will follow. My (limited) understanding is that most geologists already send samples to independent labs for testing. I assume relevant expertise is applied to determine the suitability of samples. I can’t comment on that expertise but it (edit: the application of relevant expertise) would be typical of biomedical studies.

You are correct, at least some of the time (I have seen data published that was sent to only one lab). But that is not the potential problem I see with bias. Nor is the potential bias in the measurements themselves.

T_aquaticus asks (C39),

How would you unconsciously measure the wrong concentrations of these isotopes?

Here is the way one can unconsciously bias the data: One measures the relevant isotopes, and makes the relevant calculations, but as a check, one includes a presumably correct date based on, say, the geological stratum (or more precisely, the interpretation of the geological stratum). A date that does not match the presumably correct date is looked at carefully, and if one can find a flaw in the specimen one throws it out. If we are lucky, it may be reported as a variant, and if not, it may be omitted without comment. In the meantime, dates that match the standard are accepted and published prominently, without anyone noting whether they may have the same flaw.

Now looking for flaws is not unreasonable. For example, in medicine we know not to accept at face value a potassium determination on a hemolyzed specimen, as the potassium will be predictably too high. But if one is validating an assay, one tries it on non-hemolyzed specimens, and if the specimens do not look hemolyzed, excusing their deviation from the expected value after the fact by claiming hemolysis is frowned upon. The exclusion should have been done before the measurement. Blinding and full reporting force one to treat all samples the same, and consilient results obtained that way are much more trustworthy.

The possible need for such care can be derived from some data, reported in my paper, where the isochron dates do not match anyone’s expectation. A few examples from that data follow:

Pleistocene to Recent (<1.6 million years old) lava with a Rb/Sr age of 773 million years (Bell K, Powell JL: “Strontium isotopic studies of alkalic rocks: The potassium-rich lavas of the Birunga and Toro-Ankole Regions, east and central Africa.” J Petrol 1969;10:536-72)

Pliocene to Holocene (<5.3 million years old) lava giving Rb/Sr ages of 570 and 870 million years (the 570 million year “isochron” is apparently from <3000 year old lava) (Leeman WP, Manton WI: “Strontium isotopic composition of basaltic lavas from the Snake River Plain, southern Idaho.” Earth Planet Sci Lett 1971;11:420-34)

Pliocene to Holocene (<5.3 million years old) lava with a Rb/Sr age of 1.5 billion years (Leeman WP: “Late Cenozoic alkali-rich basalt from the western Grand Canyon area, Utah and Arizona: Isotopic composition of strontium.” Bull Geol Soc Am 1974;85:1691-6)

These data (and isochrons with negative slope) are usually simply written off as mixing lines, without any other attempt to prove that they are, or to understand how we can recognize such mixing lines by mineralogical or geological criteria, so as not to waste effort on other bad specimens.

If time and money permit, and someone else does not do it first, I do plan to lead by example. I have already done this in another area of radiometric dating, and gotten a set of dates published in Radiocarbon (Taylor RE, Beaumont WC, Southon J, Stronach D, Pickworth D: “Alternative explanations for anomalous 14C ages on human skeletons associated with the 612 BCE destruction of Nineveh”, Radiocarbon 2010 52(2-3):272-282–see especially the acknowledgments)

Finally (for this section), Dan_Eastwood says (C44),

I don’t wish to sidetrack the discussion, but I can think of an example in the Creation Science literature where an author has been less than transparent about the data ([How to Criticize with Kindness: Philosopher Daniel Dennett on the Four Steps to Arguing Intelligently – Brain Pickings](https://www.brainpickings.org/2014/03/28/daniel-dennett-rapoport-rules-criticism/)). What I mean is, there is room for improvement here, too.

I don’t want to sidetrack the discussion either, but I don’t see what data there were in the article not to be transparent about. Perhaps the complaint is that the author did not say that Daniel Dennett was an atheist philosopher and she was not an atheist? However, I can agree that creationists have not always been transparent, and that it is better to be transparent. (BTW, I agree with the main thrust of the article—It is a good idea to understand the person you disagree with and agree with all you can before offering your criticisms)

We now come to the question of the coordination of different decay chains.

T_aquaticus comments (C39),

Again, if you are going to introduce a potential problem you still need to explain the consilience between methods.

T_aquaticus says earlier (C39),

We still need an explanation of how Rb/Sr, U/Pb, and K/Ar decay chains could all be coordinated so that they give the same age for the same geologic layers.

That’s precisely what my paper tried to do (except for K/Ar dating). So critique the paper itself, and if the concepts are testable, test them.

Perhaps the easiest way one can see how the dates could be identical but misleading is to suppose that one starts out with 2 magmas, A and B. One lets them age. One then mixes them in various proportions, and obtains “isochron” lines that are actually mixing lines. The mixing lines will have slopes that give the same date as the ages of the magmas for every isochron dating method, but the magma mixing could have happened within the year of the measurement, rather than at the date of isolation of the two magma bodies.
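As a rough numerical illustration of that scenario (the end-member compositions below are invented, not taken from my paper or from any real rock suite, and the 87Rb decay constant is the commonly used value), mixing two magmas in varying proportions gives points that fall exactly on a straight line in 87Sr/86Sr versus 87Rb/86Sr space, and the slope of that line converts to an “age” even though the mixture could be brand new:

```python
import numpy as np

LAMBDA_RB87 = 1.42e-11  # 87Rb decay constant per year (commonly used value)

# Invented end members: 87Sr/86Sr ratio, 87Rb/86Sr ratio, and relative 86Sr abundance.
A = {"sr87_86": 0.7050, "rb87_86": 0.10, "sr86": 1.0}
B = {"sr87_86": 0.7250, "rb87_86": 2.50, "sr86": 0.4}

def mix(f):
    """Return (87Rb/86Sr, 87Sr/86Sr) for a blend of fraction f of A with (1 - f) of B."""
    sr86 = f * A["sr86"] + (1 - f) * B["sr86"]
    sr87 = f * A["sr86"] * A["sr87_86"] + (1 - f) * B["sr86"] * B["sr87_86"]
    rb87 = f * A["sr86"] * A["rb87_86"] + (1 - f) * B["sr86"] * B["rb87_86"]
    return rb87 / sr86, sr87 / sr86

points = np.array([mix(f) for f in np.linspace(0.0, 1.0, 6)])
slope, intercept = np.polyfit(points[:, 0], points[:, 1], 1)
misfit = np.max(np.abs(points[:, 1] - (slope * points[:, 0] + intercept)))
apparent_age = np.log(1.0 + slope) / LAMBDA_RB87

print(f"Maximum deviation from a straight line: {misfit:.2e}")
print(f"Apparent 'isochron' age of a freshly made mixture: {apparent_age / 1e6:.0f} Ma")
```

The line is exactly straight because both plotted ratios share 86Sr in the denominator, so on this diagram a two-component blend is mathematically indistinguishable from an isochron.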

Of course, that is not a young earth creationist scenario. But the question we are raising is not, can we fit this into a young earth creation model, but rather, can we be sure that matching dates prove that radiometric dating is accurate?

Regarding uranium-lead dating, T_aquaticus notes (C39),

When zircons melt it resets the clock. Again, at the time of crystallization nearly all Pb is excluded while U is included. What tiny amount of Pb is present only throws off the youngest of ages by any appreciable amount.

and again (C39),

Mixing doesn’t occur in zircons because they exclude Pb.

The argument would be valid if all zircons formed when the magma they are found in was formed, and if zircons never partially melted and lost part of their lead, and if the uranium-lead ages of zircons were always measured on single crystals. But if these conditions were true, then all zircons should be found on the concordia line, and most of the time they are not. That is, put bluntly, the U-238/Pb-206 dates do not match the U-235/Pb-207 dates most of the time. Well, there goes the claim that isochron ages usually match each other.
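For readers unfamiliar with how concordance is checked, here is a minimal sketch: each zircon analysis yields two independent apparent ages, one from 206Pb/238U and one from 207Pb/235U, and only when they agree does the point plot on concordia. The measured ratios below are invented purely to show the arithmetic; the decay constants are the standard values.

```python
import math

LAMBDA_U238 = 1.55125e-10  # 238U decay constant, per year
LAMBDA_U235 = 9.8485e-10   # 235U decay constant, per year

def apparent_age(daughter_parent_ratio, decay_constant):
    """Age implied by D/P = exp(lambda * t) - 1."""
    return math.log(1.0 + daughter_parent_ratio) / decay_constant

# Invented ratios for a single hypothetical zircon analysis.
pb206_u238 = 0.180
pb207_u235 = 2.10

t238 = apparent_age(pb206_u238, LAMBDA_U238)
t235 = apparent_age(pb207_u235, LAMBDA_U235)

print(f"206Pb/238U age: {t238 / 1e6:.0f} Ma")
print(f"207Pb/235U age: {t235 / 1e6:.0f} Ma")
if abs(t238 - t235) / max(t238, t235) > 0.01:
    print("Discordant: this analysis plots off the concordia curve.")
```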

T_aquaticus (C39) raised the question of inherited argon:

How much non-radiogenic argon did the rocks have? Would it be enough to add hundreds of millions or billions of years to their actual age? How would the amount of inherited argon create ages that correlate with U/Pb and Rb/Sr dates from rocks in the same geologic layer?

It depends on the rock. Some argon in modern lavas matched air argon in Ar-36 content, and so dated as indistinguishable from zero once the presumed air argon was subtracted out. But some could date up to 44 million years old (Dalrymple GB, Moore JG: “Argon 40: Excess in submarine pillow basalts from Kilauea Volcano, Hawaii.” Science 1968;161:1132-5), and some presumed xenocrysts in lava could date up to 1 billion years old (Dalrymple GB: “40Ar/36Ar analysis of historic lava flows.” Earth Planet Sci Lett 1969;6:47-55, accurately citing Funkhouser JG: “The determination of a series of ages of a Hawaiian volcano by the potassium-argon method.” Univ of Hawaii Ph.D. thesis, 1966—I highly recommend reading the thesis). And some lavas actually had more argon than air argon has, giving them negative ages when run through the standard formula. I would like to know what proportion of dates from randomly selected lavas come out too old, at zero age, or dating into the future. This would be one area for a blinded study with all results published.
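To show numerically how inherited argon plays out, here is a minimal sketch of the standard K/Ar age calculation with the usual atmospheric-argon correction. The decay constants and the atmospheric 40Ar/36Ar ratio are the conventional values; the sample quantities are invented for illustration.

```python
import math

LAMBDA_TOTAL = 5.543e-10  # total 40K decay constant, per year
LAMBDA_EC = 0.581e-10     # branch of 40K decay that produces 40Ar, per year
ATM_40_36 = 295.5         # atmospheric 40Ar/36Ar ratio assumed by the correction

def k_ar_age(ar40_total, ar36, k40):
    """Apparent K/Ar age after subtracting the presumed atmospheric 40Ar."""
    ar40_radiogenic = ar40_total - ATM_40_36 * ar36
    if ar40_radiogenic < 0:
        return float("nan")  # 40Ar/36Ar below the atmospheric value would give a negative age
    return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_radiogenic / k40) / LAMBDA_TOTAL

k40 = 1.0e-6    # invented amount of 40K (arbitrary mole units)
ar36 = 1.0e-12  # invented amount of 36Ar

# Case 1: all the 40Ar is atmospheric, so the corrected age is zero.
print(f"No excess argon: {k_ar_age(ATM_40_36 * ar36, ar36, k40):.1f} years")

# Case 2: a small amount of inherited (excess) 40Ar in a zero-age lava.
excess_ar40 = 2.5e-10
age = k_ar_age(ATM_40_36 * ar36 + excess_ar40, ar36, k40)
print(f"With excess argon: about {age / 1e6:.1f} million years")
```

Under these invented numbers, a trace of excess argon turns a zero-age lava into one that appears several million years old, which is the sense in which inherited argon matters.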

Several have commented on the claim that it was once believed that dinosaur bone had its soft tissue completely replaced. For example, T_aquaticus said (C39),

PaulGiem: There is that “without a shred of evidence” again. If your memory is as long as mine, you may remember being taught that fossils were completely replaced by minerals because they were millions of years old.

I was never taught any such thing, and I highly doubt that this was ever the consensus position within paleontology.

And John_Harshman commented (C42),

Who would have taught you this? No paleontologist would ever have said that. The majority of vertebrate fossils contain permineralized original bone.

To be fair, I think that most paleontologists have agreed that if we find hydroxyapatite in bone, it is probably original material. And calcium carbonate in limestone is usually agreed to be original material.

But for the soft tissues, Smithsonian Magazine remembers a consensus. Speaking of Mary Schweitzer’s findings, they wrote (or more precisely Helen Fields wrote—see Dinosaur Shocker | Science| Smithsonian Magazine for details):

The finding amazed colleagues, who had never imagined that even a trace of still-soft dinosaur tissue could survive. After all, as any textbook will tell you, when an animal dies, soft tissues such as blood vessels, muscle and skin decay and disappear over time, while hard tissues like bone may gradually acquire minerals from the environment and become fossils

Now why would the consensus that decay happens too quickly to allow for dinosaur soft tissue develop? Probably because of some evidence. According to Buckley M, Collins MJ: “Collagen survival and its use for species identification in Holocene-lower Pleistocene bone fragments from British archaeological and paleontological sites.” Antiqua 2011 [available online]:

We have estimated the effective activation energy (Ea) of collagen loss from bone to be 173 kJ mol–1. Extrapolation from high temperature experimental decomposition rates using this activation energy suggest that at a constant 10°C (the approximate mean annual air temperature in present-day Britain) it will take between 0.2 and 0.7 Ma years at 10°C for levels of collagen to fall to 1% in an optimal burial environment.

That would suggest that the amount of collagen (a particularly resistant protein) from 65 million year old bone mostly stored at well above 10°C should be vanishingly small. So it is not unreasonable to suppose that protein could not last that long, at least absent data like Schweitzer’s.
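As a rough check on that extrapolation, the quoted activation energy can be used to scale the survival time to a warmer burial temperature (a back-of-the-envelope sketch, assuming simple first-order loss; the 20°C comparison temperature is my own choice for illustration):

```python
import math

R = 8.314      # gas constant, J/(mol K)
EA = 173_000   # activation energy for collagen loss, J/mol (Buckley & Collins)

def rescale_survival_time(t_years, temp_from_c, temp_to_c):
    """Scale a first-order survival time from one constant temperature to another."""
    return t_years * math.exp((EA / R) * (1.0 / (temp_to_c + 273.15) - 1.0 / (temp_from_c + 273.15)))

# Quoted range at 10 C: 0.2 to 0.7 million years for collagen to fall to 1%.
for t_at_10c in (0.2e6, 0.7e6):
    t_at_20c = rescale_survival_time(t_at_10c, 10.0, 20.0)
    print(f"{t_at_10c / 1e6:.1f} Ma at 10 C -> roughly {t_at_20c / 1e3:.0f} ka at 20 C")
```

On those assumptions, a burial environment 10°C warmer shortens the quoted survival times by roughly an order of magnitude, which is the sense in which 65 million years looks implausibly long for collagen.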

Now, of course, one could always insist that since we had 65 million year old bone with soft tissue in it, it must be able to survive that long, and assertions to the contrary are misguided. But the assertion that

Creationists have not demonstrated that the tissues found in dinosaur bones could not survive for 65 million years. They simply assert that they can’t survive that long without a shred of evidence to support the assertion.
(T_aquaticus, C32, and others that have followed it, such as T_aquaticus C39, Roy C43, and jammycakes C45) is a bit triumphalistic. There is at least a shred of evidence.

Several have commented that there is plenty of money to do research, including T_aquaticus (C39), Mercer (C40), and jammycakes (C45).

Jammycakes has the most extensive quote:

… RATE … ran for eight years with a budget of $1.25 million, and as such it was a chance for them to showcase what they could do with some serious money. If they really had a case for a young Earth, they should have come up with at least *something* compelling – or, at the very least, some highly promising leads. What did they give us in the end?
  • An admission that there really had been billions of years’ worth of nuclear decay at current rates since creation (something that they had always denied up to that point)
  • A science fiction scenario involving billion-fold accelerated nuclear decay during Noah’s Flood
  • An admission that if said accelerated nuclear decay had actually happened, it would have released enough heat to vaporise the Earth’s crust many times over with temperatures reaching in excess of 22,000°C
  • As evidence for said scenario, a set of tiny, ambiguous and outright fudged samples with huge error bars, and levels of radiocarbon indistinguishable from contamination.

All in all, that doesn’t come anywhere remotely close to $1.25 million worth of compelling scientific results.

And what’s happened since then? Answers in Genesis has spent more than $100 million building the Ark Encounter….

That’s a jaundiced but not unreasonable summary for most of it, although I think the RATE group’s helium data is being dismissed a bit too abruptly (my own criticism of the helium research is that it involves only one site, and that I would like to see more sites with matching data before I put too much weight on it, and I am told that they are working on that). They also documented radiometric dates that did not match each other, which is relevant to our discussion. Whether this state of affairs is usual remains to be determined.

And I agree that there is money out there, and that in one sense the real problem is not so much money as vision. Or put more precisely, the people with the vision do not have the money, and vice versa. (It’s far easier to raise $100 million for an ark at which one can charge admission, than to raise ~$2 million for a radiocarbon lab that will probably never pay for itself financially.) There may even be some who are afraid that the same thing can happen to them that happened to the Mormons, as T_aquaticus (C41) noted.

But according to the conventional scientists most closely acquainted with the data, the radiocarbon levels found were definitely distinguishable from laboratory contamination. Kirk Bertsche (RATE’s Radiocarbon: Intrinsic or Contamination?) has stated,

While this conclusion [laboratory contamination] explains the higher values for the biological samples in general, it does not account for all the details. Some biological samples *do* have radiocarbon levels not explainable by sample chemistry. These samples are mostly coals and biological carbonates ….

and

Unlike the literature values, Baumgardner’s coal samples *do* show significant radiocarbon above background, inviting explanation. (italics his)

He blames the carbon-14 found on “in situ contamination”, but he at least agrees that it is not just laboratory error.

And Harry Gove, as summarized by Kathleen Hunt (Carbon-14 in Coal Deposits), stated:

The short version: the 14C in coal is probably produced de novo by radioactive decay of the uranium-thorium isotope series that is naturally found in rocks (and which is found in varying concentrations in different rocks, hence the variation in 14C content in different coals). Research is ongoing at this very moment.

Note that the two explanations of the data are not identical, although I suppose one could try a combination of them. The reason for pointing this out is not so much to argue who is right (that would take another post, if not more experiments) as it is to point out that the experts do not believe the laboratory contamination hypothesis. It would appear that the laboratory contamination hypothesis is in fact functioning as a rescuing device.

It would be scientific fraud if one did so without explicitly stating that.

Omitting data in that scenario also constitutes fraud IMO.

Whom, specifically, are you accusing of committing such frauds?

I think that the suggestion of a lack of transparency is a gross exaggeration.

Would you like an example from biology?


Have you seen Kevin Henke’s analysis of the RATE project’s study on helium diffusion in zircons? He doesn’t just “dismiss it abruptly.” He goes into specific details about exactly what standards of quality control every geological study of this nature needs to meet and points out about sixty different ways in which the helium study fails to meet those standards. While I personally disagree with some of the things he says (he makes a bit too much of the “introducing religious presuppositions in science” angle) the bulk of his critique still stands, and there are simply too many flaws and even outright fudging for anyone to be able to take it seriously.

Here’s the thing. Accelerated nuclear decay on the scale that the RATE project is claiming is an extraordinary claim that would win a Nobel Prize hands down if it actually had any merit. This being the case, it is essential that it be scrutinised ruthlessly and it needs to be held to the highest standards of rigour, accuracy, attention to detail, and quality control possible. It needs to be replicated by other studies, not just by the same team, but by other independent researchers as well using multiple different methodologies. Anything less, and you would also be giving a free pass to astrology, homeopathy, feng shui, reading tea leaves, anti-vaxxers, covid denial, and tobacco companies claiming that smoking is good for you.

Besides, 22,000°C of unexplained heat? Come on, give me a break. I’m sorry, but that’s just degenerating into the realms of fantasy.

They blow the discrepancies completely out of proportion.

All they have been able to demonstrate is that a minority of radiometric dates can be out by about 15-20% of the half-lives of the isotopes concerned. This is no surprise to anyone. But it falls far, far, far, far, far, far, far, far short of demonstrating that all radiometric methods are so badly out of whack that they consistently fail to tell the difference between thousands and billions.

There is a difference between “doesn’t always work” and “never works.” There is also a difference between “occasionally out by a few percent” and “consistently out by a factor of a million.”

Well of course it will be a combination of them. That’s simply how measurement works in every area of science. You don’t just dismiss one source of error as a “rescuing device” just because it can’t account for the discrepancy in isolation. You have to add all the possible sources of error up together.

Seriously, to insist that laboratory contamination is a “rescuing device” just because there are other forms of contamination involved as well is to insist that the basic rules and principles of accurate and honest weights and measures don’t apply to you. I’m sorry, but they do.

As @Mercer said, that would be scientific fraud.

It would also have to be happening on a colossal scale. If radiometric dating really were so unreliable that it couldn’t tell the difference between thousands and billions, scientists would have to be throwing out 99% of their results at least. And bear in mind that radiometric dating is expensive. Dating a single sample can cost thousands of dollars. That kind of chicanery would have scientists spending a million dollars at least on each result that gets published. And with tens of thousands of results being published in the scientific literature every year, you’re looking at tens of billions of dollars a year being squandered on wholesale scientific fraud.

I’m sorry, but what you’re describing here is the mother of all conspiracy theories. NASA faking the moon landings, aliens at Area 51, 9/11 being a false flag operation, and the US Navy covering up the existence of mermaids are child’s play by comparison. Conspiracies on that scale simply do not happen. Period.


You do not need to buy the cow for the milk. The Lalonde AMS facility will train you to run your own samples.

We encourage students, technicians, and professors to come to Ottawa to process their own samples for radiocarbon analysis. Other than covering the analytical cost, the program is free!

In any event, while OEC may be compatible with C14 analysis, YEC clearly is not.


That was my point. You may not have intended to say “fossils were completely replaced”. Some organic components are also not infrequently preserved, such as in plant fossils preserved as charcoal.

Why is that wrong? Are you claiming that those dinosaur bones are only a few years old? Certainly the K/T boundary is one of the most clearly marked and well-dated horizons in the Phanerozoic.


Mercer (C50),

PaulGiem: Here is the way one can unconsciously bias the data: One measures the relevant isotopes, and makes the relevant calculations, but as a check, one includes a presumably correct date based on, say, the geological stratum (or more precisely, the interpretation of the geological stratum).

It would be scientific fraud if one did so without explicitly stating that.

I would be a little easier on your colleagues. AFAICT, it is standard practice to ask the geological formation from which a sample was taken. From that, it takes only a little effort to find out what the expected age should be. If it is common knowledge in the field, is it scientific fraud to not explicitly say what everyone in the field already knows?

PaulGiem: A date that does not match the presumably correct date is looked at carefully, and if one can find a flaw in the specimen one throws it out. If we are lucky, it may be reported as a variant, and if not, it may be omitted without comment.

Omitting data in that scenario also constitutes fraud IMO.

I will agree that it does not constitute best practices. But I am reluctant to call it fraud, unless we can establish intent. I will give you 2 examples from carbon-14 dating. When we first dated the bones reported in the study I cited in C46 (I actually obtained the bones from Dr. David Stronach and paid the then-functioning lab at the University of California at Riverside to run the date), the date we got was 810 ± 105 BCE (uncorrected), for bones that were securely archaeologically dated at 622 BCE. The lab was understandably concerned, and asked if they could send the bones to another lab. That lab’s date was significantly older (unofficially in the neighborhood of 1000 years older), and they did not believe their own results, and did not give me a date or send me a bill. Was that wrong? Wrong enough to try to shut down their lab? If you were that lab director, what would you do?

Second, during my research into carbon-14 dates of the era in question, I came across several dates from a log from the 614 BCE destruction of Calah. All of them clustered around 720 BCE uncorrected, except for the British Museum date and one of two dates from Dublin. The Dublin lab reported the date that matched the British Museum (which, IIRC, was the first one run) and relegated the second, higher date, which more closely matched the (later) others, to a footnote (it appears that it almost didn’t make the paper). The British Museum date was later retracted, as were a number of other British Museum dates. One can make a rational argument that the second date should have been the reported one, and the first date should have been the footnote. I personally like this procedure better than the first example, but am loath to accuse anyone of fraud, as the intent, IMO, is not there.

I also had a colleague tell me that a lab that was ~30% off on a date never charged him.

Getting rid of such problems would be an obvious advantage to a blinded study where all results are reported.

John_Harshman (C53), you say,

PaulGiem:

To be fair, I think that most paleontologists have agreed that if we find hydroxyapatite in bone, it is probably original material. And calcium carbonate in limestone is usually agreed to be original material.

That was my point. You may not have intended to say “fossils were completely replaced”. Some organic components are also not infrequently preserved, such as in plant fossils preserved as charcoal.

You are correct that I misstated the case. The original comment I was reacting to was T_aquaticus (C32):

Creationists have not demonstrated that the tissues found in dinosaur bones could not survive for 65 million years. They simply assert that they can’t survive that long without a shred of evidence to support the assertion.

My response (C38) was overly broad:

If your memory is as long as mine, you may remember being taught that fossils were completely replaced by minerals because they were millions of years old.

But with the modification of “fossils” to “soft tissues in fossils”, the statement is defensible. Thank you for pointing out the inaccuracy. BTW, coal also appears to be original, though highly modified, fossil material.

While we are at it, I may have been overly broad in another statement. In C47 I stated, regarding U-235/Pb-207 versus U-238/Pb-206 dating:

Well, there goes the claim that isochron ages usually match each other.

I had not established that the relevant ages were isochron ages rather than model ages. And this may vary from paper to paper. But I think I can say that it is common for the ages not to match, however they are done.

jammycakes (C51),

I have read Kevin Henke’s analysis. He seems to me to be angry, and lacking perspective (the two may be related). However, some points he makes are reasonable, particularly the recalculations of Loechelt, and warrant careful review. My point would be that if the perspective of Humphreys et al. can be defended against Loechelt, and if their model can be shown to be generally predictive, I would be willing to give it some weight. I agree with you that accelerated decay during a Flood raises problems with heat (not to mention radiation) that may well be insuperable.

Paul Giem:

They also documented radiometric dates that did not match each other, which is relevant to our discussion. Whether this state of affairs is usual remains to be determined.

They blow the discrepancies completely out of proportion.

All they have been able to demonstrate is that a minority of radiometric dates can be out by about 15-20% of the half-lives of the isotopes concerned.

Chapter 6 of the RATE II book (available online free at http://www.icr.org/i/pdf/technical/Isochron-Discordances.pdf) actually shows a range in the Cardenas basalt (conventional age of 1103 ± 66 Ma by Rb/Sr) of 516 ± 30 Ma for the K/Ar whole-rock isochron age, to 1111 ± 81 Ma for the Rb/Sr isochron age, to 1585 ± 170 Ma by Sm/Nd isochron, which would exceed the 20% quoted above. Both chapters 5 and 6 seem to have plenty of discordant data. It would be helpful to have unselected data so we could understand how common this phenomenon is. As you comment,

There is a difference between “doesn’t always work” and “never works.” There is also a difference between “occasionally out by a few percent” and “consistently out by a factor of a million.”

Only unselected data can answer that question.
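For reference, the spreads in the Cardenas basalt figures quoted above work out as follows (a quick check using the central values only and ignoring the error bars; note that percentage deviation from the conventional age is a different yardstick from “percent of a half-life”):

```python
# Central isochron ages (Ma) for the Cardenas basalt, as quoted from RATE II chapter 6.
isochron_ages = {"K/Ar whole-rock": 516, "Rb/Sr": 1111, "Sm/Nd": 1585}
conventional_age = 1103  # conventional Rb/Sr age, Ma

for method, age in isochron_ages.items():
    deviation = 100.0 * (age - conventional_age) / conventional_age
    print(f"{method:>16}: {age:>5} Ma ({deviation:+.0f}% relative to the conventional age)")
```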

RonSewell (C52),

Thanks for the tip. I had not noticed that offer before. Of course, it will be valid only after Canada feels comfortable about COVID.

jammycakes (C51)

PaulGiem:

Note that the two explanations of the data are not identical, although I suppose one could try a combination of them. The reason for pointing this out is not so much to argue who is right (that would take another post, if not more experiments) as it is to point out that the experts do not believe the laboratory contamination hypothesis. It would appear that the laboratory contamination hypothesis is in fact functioning as a rescuing device.

Well of course it will be a combination of them. That’s simply how measurement works in every area of science. You don’t just dismiss one source of error as a “rescuing device” just because it can’t account for the discrepancy in isolation. You have to add all the possible sources of error up together.

You can properly say that laboratory contamination is functioning as a rescuing device if it is being used as an explanation in isolation, particularly when an experiment at a well-functioning lab has shown it to be inadequate in isolation.

Now I agree that one must add up all possible sources of error. But I would ask whether you have made any estimates of the amount of error one can expect from either nucleosynthesis or in situ contamination. I’d like to assume that this is not just a hand-waving argument. The preliminary data and calculations I have seen seem to indicate that these explanations, too, are causally inadequate, either individually or in combination. Perhaps you have other data and/or calculations.

With discussion of carbon dating here, this new thread may be of interest.

It is standard practice in every area of science to start off with a ball park estimate of how old, or how large, or how heavy, something is. This is simply so that you know to use the correct tool for the job. From that, you take measurements in order to narrow it down.

So for example, they will start off knowing that a rock formation is from Cretaceous strata. In other words, somewhere between 145 million and 66 million years old. This will simply tell them that they should be using uranium-lead, potassium-argon or rubidium-strontium methods, rather than, say, radiocarbon or thermoluminescence dating. From that, they will then be able to pinpoint the formation as being, say, 66.038 ± 0.011 million years old.

It’s exactly the same principle as if I were to measure some furniture, such as a desk, to see where I could put it. I would start off thinking that it must be roughly two metres by one metre by one metre. That would tell me to use a tape measure rather than an electron microscope, a micrometer, the odometer in my car, a handheld GPS device, or the Hubble Space Telescope.

Honestly, this is something I see happening quite frequently in young Earth arguments. They take techniques that are perfectly legitimate and standard practice in every area of science, and by ignoring the fact that there is actual maths involved, twist them to make them look like there’s some kind of fudging going on.

Well yes, I’d agree there. He does seem angry, and it is pretty off-putting. But he does make some important points about quality control and quality assurance, and he does provide quite a lot of detail about the standards that geologists are expected to meet in conducting their research. For example, he pointed out that if there are any questions about the integrity of a particular experiment, you are expected to discard those results and re-do the experiment, not to “correct” the results to account for “typographic errors” in the reports, as the RATE team did with a set of results from twenty years previously.

That must qualify as the understatement of the century. Accelerated nuclear decay doesn’t just “raise problems,” it is complete science fiction. The problems that the RATE team reported (and downplayed to a staggering extent) barely scratch the surface of the problems with their hypothesis. Besides the heat problem, it would have required the fundamental constants of physics to have changed. This would have had extremely far reaching consequences. At best, you’re talking about the Earth being turned into a cloud of hot plasma. At worst, you’re talking about a false vacuum decay.

Well that may be so, but the question is still, just how discordant is the data? The discordances noted are still only a factor of two at most, and the biggest discrepancy is K/Ar, which is the most susceptible to leakage since argon is a gas. For the others, the disagreement is no more than twice the size of the error bars at most. I’m sorry, but to cite differences of that scale as evidence for errors of a factor of a million is still blowing things completely out of all proportion.

No you can’t say that laboratory contamination is functioning as a “rescuing device.” Period. End of story. No exceptions, no excuses. Fully and correctly accounting for contamination is a fundamental requirement in every area of experimental science. Especially when you are making extraordinary claims that hundreds of thousands of results taken using dozens of different methods are all consistently out by factors of a million. As I said, to describe laboratory contamination as a “rescuing device” is to demand an exemption from the basic rules and standards of laboratory technique.

Laboratory contamination has been measured. This can be done by preparing a sample for radiocarbon analysis twice and measuring the difference between the first and second analyses. The difference turns out to vary widely with values between 0.14% and 0.25% of modern carbon being typical for each step in sample processing. And that is assuming that the experimenters concerned took great care in handling and preparing their samples – even relatively minor sloppiness, carelessness, or experimenter error would only increase the errors. Given that radiocarbon dating typically requires several steps of sample preparation (especially if you have to extract cellulose from wood or collagen from bone), contamination levels of 0.5% modern carbon are completely unsurprising.
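To translate those contamination levels into apparent ages, here is a minimal sketch using the conventional radiocarbon age formula (the Libby mean life of 8033 years is the standard convention; the percent-modern-carbon values are simply the per-step figures quoted above plus a couple of cumulative levels for comparison):

```python
import math

LIBBY_MEAN_LIFE = 8033  # years, as used in the conventional radiocarbon age formula

def conventional_age(percent_modern_carbon):
    """Conventional radiocarbon age implied by a given percent-modern-carbon level."""
    return -LIBBY_MEAN_LIFE * math.log(percent_modern_carbon / 100.0)

# Apparent ages for material whose radiocarbon is nothing but low-level contamination.
for pmc in (0.14, 0.25, 0.5, 1.0):
    print(f"{pmc:.2f} pMC -> apparent age of roughly {conventional_age(pmc):,.0f} years")
```

At these levels, contamination alone produces apparent ages in the 37,000 to 53,000 year range, which is why radiocarbon “detections” of that order are so hard to distinguish from background.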

The RATE team might have been able to claim that something warranted explanation if they had consistently reported levels of contamination of around 5% modern carbon, with a clustering of results around that figure. But levels of up to 0.5% or so with a very wide variation most certainly do not. And if outliers greater than that only crop up occasionally rather than on a regular basis, then those would almost certainly be down to either poorly studied contamination vectors, or experimenter error, or both.


You have just expanded my knowledge of quantum physics by a small notch. I knew the heat problem was easily life-ending, but false vacuum decay is … not sure what to call it … universe-as-we-know-it-ending?

I wrote IF, Paul. You haven’t shown that anyone has done that.
