jammycakes (C58),
It is standard practice in every area of science to start off with a ball park estimate of how old, or how large, or how heavy, something is.
That depends on how big a “ballpark” you are talking about. In medicine, we tell the lab that we want a blood test run, and don’t give them any more information than that it is blood. A quantitative hCG can run anywhere from 2 (or actually 0) to over 100,000. That’s quite a ballpark.
So for example, they will start off knowing that a rock formation is from Cretaceous strata. In other words, somewhere between 145 million and 66 million years old. … From that, they will then be able to pinpoint the formation as being, say, 66.038 ± 0.011 million years old.
Yes, that would work if all one said was that it was Cretaceous. But what if they were told that the formation was late Maastrichtian, or even the K-T boundary? That would narrow the time period considerably. BTW, I doubt that any lab would give you the data you quote. That is far too low an error for a single lab report. I suspect you got it from the Wikipedia date for the K-T boundary, or from the same source they did, which is almost certainly a composite date.
So have you submitted geological samples, and if so did you tell them that they were Cretaceous?
PaulGiem: I agree with you that accelerated decay during a Flood raises problems with heat (not to mention radiation) that may well be insuperable.
That must qualify as the understatement of the century. Accelerated nuclear decay doesn’t just “raise problems,” it is complete science fiction.
I’ll give you a little tip on how to sound less like Henke. Mostly let your data do the talking. If they don’t convince, do you really think that your statements are going to do the job? And when someone is agreeing with you, take the win graciously. (I am unlikely ever to say anything dogmatically, even when agreeing with you, as I am cognizant that we have been surprised in the past; the bacterial cause of ulcers, the Bretz floods, and the theory of gravity itself come to mind.)
PaulGiem: Chapter 6 of the RATE II book …
Well that may be so, but the question is still, just how discordant is the data? The discordances noted are still only a factor of two at most, and the biggest discrepancy is K/Ar, which is the most susceptible to leakage since argon is a gas. For the others, the disagreement is no more than twice the size of the error bars at most. I’m sorry, but to cite differences of that scale as evidence for errors of a factor of a million is still blowing things completely out of all proportion.
You don’t seem to understand the point I was making. I have offered reasons why isochron dating might not be trustworthy, and the major answer I am getting is something like “but the dates all match, so they must be trustworthy.” Documentation that the dates in some cases do not match is relevant to the discussion, even if they are only 30-50% different. BTW, if you read the original article you will discover that the “error bars” are actually 2σ error bars or approximately 95% confidence limits, so the dates are quite significantly different.
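For concreteness, here is a quick sketch (Python, with made-up numbers rather than the actual RATE II values, so treat it as an illustration of the method only) of how one checks whether two dates quoted with 2σ error bars differ significantly:

```python
import math

def dates_differ(date1, err1_2sigma, date2, err2_2sigma):
    """Check whether two dates with quoted 2-sigma errors differ at ~95% confidence.

    Converts each 2-sigma error back to 1 sigma, combines them in quadrature,
    and compares the difference between the dates to 1.96 combined sigmas.
    """
    s1 = err1_2sigma / 2.0
    s2 = err2_2sigma / 2.0
    combined = math.sqrt(s1 ** 2 + s2 ** 2)
    z = abs(date1 - date2) / combined
    return z, z > 1.96  # True => significantly different at ~95%

# Hypothetical example (NOT the RATE II numbers): a K-Ar date of 900 +/- 80 Ma (2 sigma)
# versus an Rb-Sr isochron date of 1100 +/- 90 Ma (2 sigma) on the same formation.
z, significant = dates_differ(900, 80, 1100, 90)
print(f"z = {z:.1f}, significantly different: {significant}")  # z ≈ 3.3, True
```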
If you are asking how different the dates are from real time (as agreed to by all sides), you might notice that, as noted in my paper and partially quoted in C46, Miocene to Recent lavas in multiple instances date in the 500 to 1,500 Ma range. Now we’re talking a factor of over 100. That is concerning. How can we (or can we) exclude such samples before we ever test them, so we don’t have to exclude them afterwards (which I agree with you is not best practice)?
No you can’t say that laboratory contamination is functioning as a “rescuing device.” Period. End of story. No exceptions, no excuses. Fully and correctly accounting for contamination is a fundamental requirement in every area of experimental science. Especially when you are making extraordinary claims that hundreds of thousands of results taken using dozens of different methods are all consistently out by factors of a million.
It helps your case if you know the position of your fellow discussant and don’t exaggerate. I realize that some whom you disagree with do believe that the dates are off by a factor of a million, but I would at present not put myself in that category. And dozens of different methods? Really? And you are right that contamination must be accounted for. But to invoke contamination as a complete explanation after it has been effectively debunked as one is not proper, whether you like the term “rescuing device” or not.
Laboratory contamination has been measured. This can be done by preparing a sample for radiocarbon analysis twice and measuring the difference between the first and second analyses. The difference turns out to vary widely with values between 0.14% and 0.25% of modern carbon being typical for each step in sample processing.
Just wow. I can only conclude that you are getting your information from some internet site, and really don’t understand the primary literature (nor do they, assuming you repeated their material accurately). There are two estimates of contamination I can cite, and I did so in my 2001 paper in Origins (Geoscience Research Institute | Carbon-14 Content of Fossil Carbon):
Van der Borg et al. (1997) noted graphite to have 0.04 ± 0.02 pMC when measured without reprocessing, and 0.18 pMC when tested after recycling. Arnold et al. (1987) reported a graphite having 0.089 ± 0.017 pMC without recycling, and 0.34 ± 0.11 pMC after recycling (statistically significant at p < 0.025).
These differences (0.18 − 0.04 = 0.14 pMC for Van der Borg, and 0.34 − 0.089 = 0.251 pMC for Arnold) match almost exactly the figures you quoted; the only discrepancy, a non-significant and probably not technically justified one, is 0.251 instead of 0.25 for the second reference. Thus the contamination quoted is not for each step, but for the entire oxidation-reduction laboratory process.
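To make the arithmetic explicit, here is a short Python sketch using only the numbers quoted above. The one-sided z-test is my own illustration, and it treats the quoted uncertainties as 1σ, which is an assumption on my part:

```python
import math

# Van der Borg et al. (1997): graphite measured without and with reprocessing
vdb_raw, vdb_recycled = 0.04, 0.18                 # pMC
print(f"Van der Borg process blank: {vdb_recycled - vdb_raw:.2f} pMC")   # 0.14 pMC

# Arnold et al. (1987): graphite measured without and with recycling
arn_raw, arn_raw_err = 0.089, 0.017                # pMC (uncertainty assumed 1 sigma)
arn_rec, arn_rec_err = 0.34, 0.11                  # pMC (uncertainty assumed 1 sigma)
diff = arn_rec - arn_raw                           # contamination added by the full process
sigma = math.sqrt(arn_raw_err ** 2 + arn_rec_err ** 2)
z = diff / sigma
p_one_sided = 0.5 * math.erfc(z / math.sqrt(2))    # one-sided normal tail probability
print(f"Arnold process blank: {diff:.3f} pMC, z = {z:.2f}, one-sided p ≈ {p_one_sided:.3f}")
# Output: 0.251 pMC, z ≈ 2.26, p ≈ 0.012 -- consistent with the p < 0.025 cited above;
# 0.14 and 0.251 (≈ 0.25) are the per-process figures under discussion.
```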
But there is a worse problem with what you said. Contamination varies from laboratory to laboratory. Natural gas measured by Beukens, for example (1992. Radiocarbon accelerator mass spectrometry: background, precision, and accuracy. In: Taylor RE, Long A, Kra RS, editors. Radiocarbon After Four Decades: An Interdisciplinary Perspective. NY: Springer-Verlag, p 230-239.), had a total measurement of 0.077 ± 0.005 pMC, which puts an upper limit of less than 0.09 pMC (0.077 + 2 × 0.005 ≈ 0.087) on the entire oxidation-reduction process at that lab. And Bird et al. (1999: https://repository.arizona.edu/bitstream/handle/10150/654594/3802-3473-1-PB.pdf?sequence=1) noted that their treated and untreated samples of “Ceylon graphite” were not statistically different, and their highest value (for an untreated sample!) was 0.08 ± 0.03 pMC (their average was closer to 0.04 pMC). So at the best labs, laboratory contamination can be reduced to well below 0.14 pMC.
I agree that it is probably not zero (except possibly at Bird’s lab). But it does not account for all the results of Baumgardner et al., especially as their samples were done at one of the better labs, and an apparent background was in fact subtracted from their reported coal data.
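For a sense of what such pMC levels mean in age terms, here is a small sketch using the standard Stuiver-Polach conversion from percent modern carbon to conventional radiocarbon age (t = −8033 · ln(pMC/100)); the pMC values plugged in are round illustrative numbers, not Baumgardner et al.’s exact results:

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years: Libby half-life of 5568 yr divided by ln(2)

def conventional_age(pmc):
    """Conventional radiocarbon age (years BP) for a given percent modern carbon."""
    return -LIBBY_MEAN_LIFE * math.log(pmc / 100.0)

# Illustrative pMC values in the range discussed above (not actual reported data):
for pmc in (0.05, 0.1, 0.25, 0.5):
    print(f"{pmc:4.2f} pMC  ->  apparent age ≈ {conventional_age(pmc):,.0f} yr BP")
# 0.05 pMC ≈ 61,000 yr; 0.10 pMC ≈ 55,500 yr; 0.25 pMC ≈ 48,100 yr; 0.50 pMC ≈ 42,600 yr
```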
I also agree that contamination underground, contamination during procurement, and nucleosynthesis should be explored, and if possible quantified, so that we can scientifically reach a conclusion as to whether they are adequate explanations (separately or together) for the carbon-14 we are finding in coal (over and above laboratory contamination). I am seeing what can be done to test these hypotheses.
Dan_Eastwood (C57),
I’ll take a look at that discussion.