Norman, I think you've put two and two together and come up with five. My 350,000-year date wasn't the only one; the largest was above your threshold: 2.8 million years for the pyroxene.
The results of this analysis are shown in Table 1. What do we see? First and foremost that they are wrong. A correct answer would have been ‘zero argon’ indicating that the sample was too young to date by this method. Instead, the results ranged from 340,000 to 2.8 million years! Why? Obviously, the assumptions were wrong, and this invalidates the ‘dating’ method. Probably some argon-40 was incorporated into the rock initially, giving the appearance of great age. Note also that the results from the different samples of the same rock disagree with each other.
(Note the highlighted part.) That same disagreement is also found in the Grand Canyon: from the same rock flows you get WILDLY different dates.
Here are the ages from Geochron for Mt St Helens rock, which at the time was only about 20 years old, if I remember correctly:
mt st helens.jpg
To be honest, I would never trust this dating method again after this. How can you know that a rock that is, say, 500 years old isn't going to be measured by this test as ten times older? The point is you can't know, which is an important example of critical thinking, and I believe that was the point of raising this issue. I note this type of critical analysis is absent from your own writings.
Well, if you use it on rocks that are outside the equipment's specifications, then you shouldn't trust it. If the sample is perfect but too young, then it will drastically overestimate its age, with large relative errors. If the sample is not perfect, e.g., there was excess argon at formation or external contamination, and the resulting extra argon is large compared to the radiogenic argon (from radioactive decay), then again it will drastically overestimate the age, with large relative errors.
But that doesn't mean you should automatically discard the method. It depends on your data set. If the amount of argon rises above the noise floor and the inherent errors in the system, and you get consistent answers across multiple disparate samples, then you will have more confidence (perhaps even statistically meaningful confidence) that you are getting good data with low relative errors.
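To make the "noise floor" point concrete, here is a minimal sketch with hypothetical numbers (the 2-million-year floor below is illustrative only, not Geochron's actual specification): the same fixed measurement floor means enormous relative errors on young rocks and small ones on old rocks.

```python
# Minimal sketch (hypothetical numbers): how a fixed noise/error floor in the
# argon measurement translates into relative age error. The floor value below
# is illustrative, not any lab's actual specification.

NOISE_FLOOR_MYR = 2.0  # assumed equipment floor, in millions of years of argon

def relative_error(true_age_myr, floor_myr=NOISE_FLOOR_MYR):
    """Relative error if the measurement can be off by ~floor_myr either way."""
    return floor_myr / true_age_myr

for age in [0.35, 2.8, 50.0, 500.0]:
    print(f"true age {age:7.2f} Myr -> relative error ~{relative_error(age):.0%}")
```

Note the asymmetry: at 50 million years the floor contributes only a few percent of error, while for a rock younger than the floor the "error" is larger than the signal itself.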
Also, the blue highlighted text is an indication that you are actually measuring noise, which is more random in nature.
So the logical argument is: if the assumptions are correct, then a rock younger than the lower range should not have existed long enough for enough detectable Ar-40 (daughter isotope) to have been produced by decay of K-40 (parent isotope). Thus, according to the critic, we should not expect to be able to use the method. However, we find plenty of detectable Ar-40 in rocks known to be much younger. Therefore, by the logically valid argument known as denying the consequent, or modus tollens, at least one of the assumptions behind radiometric dating is false: namely, that there was no daughter isotope in the rock to start with.
So his point is that there shouldn't be any Ar in the rock for it even to be detectable yet. To say that 2.8 million years' worth of argon isn't enough to be detectable, Norman, is plain FALSEHOOD. The fact is that new rocks contain argon, meaning the assumption that it takes millions of years of potassium decay to produce that argon is false. So how can you date "older" rocks without knowing how much of the daughter isotope was initially present?
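For reference, the dispute turns on the standard K-Ar age equation, which converts a measured Ar-40/K-40 ratio into an age. A quick sketch using the conventional published decay constants (the example ratio is back-computed for illustration, not a measured value from any of the samples discussed):

```python
import math

# Sketch of the standard K-Ar age equation, using the conventional decay
# constants (total K-40 decay constant, and the electron-capture branch that
# produces Ar-40). The point at issue: the equation converts *whatever*
# Ar-40/K-40 ratio is measured into an age, so argon present at formation
# is indistinguishable from decay product in the age it yields.

LAMBDA_TOTAL = 5.543e-10   # total K-40 decay constant, per year
LAMBDA_EC    = 0.581e-10   # branch producing Ar-40, per year

def kar_age(ar40_over_k40):
    """Apparent K-Ar age (years) for a measured Ar-40/K-40 ratio."""
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + ar40_over_k40 * (LAMBDA_TOTAL / LAMBDA_EC))

# A tiny non-radiogenic ("excess") argon content still yields a large age
# (the 1.6e-4 ratio here is a hypothetical illustration):
print(kar_age(1.6e-4))   # on the order of a few million years
print(kar_age(0.0))      # zero argon -> zero age
```

This is why the initial-argon assumption matters: the equation itself has no way to distinguish inherited argon from radiogenic argon.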
Conclusion: it seems to me you're just trying to play down the results of the data, which clearly indicate a sizable amount of argon was present in the newly formed rocks from Mt St Helens. And it's not just Mt St Helens, Norm:
In every case the potassium-argon dates were clearly wrong to a huge extent. Similar conflict was found by researchers in Hawaii. A lava flow which is known to have taken place in 1800-1801—less than 200 years ago—was dated by potassium-argon as being 2,960 million years old.3
Is this dating failure from Mount St Helens an isolated case of radioisotope dating giving wrong results for rocks of known age? Certainly not! Dalrymple,1 one of the big names in radioactive dating [and a self-confessed intermediate between an atheist and agnostic], lists a number of cases of wrong potassium-argon ages for historic lava flows (Table A). There are many other examples of obviously wrong dates. Only recently, Creation magazine reported that ages up to 3.5 million years were obtained for lava flows that erupted in New Zealand from 1949 to 1975.2
So, Norman, does 2 to 3.5 million years go "above" the line? You have no way of knowing why you get these dates. They even dated the top part of the Grand Canyon, which they believe to be about 1 million years old, using one method and got a date of 1 billion years.
Possibly, but it depends, Mike. Specifically, it depends on the noise/error sigma of their system and equipment. For example, if sigma = 1 million years, then the probability of getting at least one in four measurements above 2 million years (strictly from noise and error alone) is about 44%. So, actually, Mike, that's not that improbable. If their sigma is 500,000 years, then the probability becomes much smaller, less than 1%, and in that case you could make a stronger argument for some of the samples having external/excess argon.
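The quoted probabilities can be reproduced in a few lines if the "age from noise alone" is modeled as the envelope of Gaussian noise, i.e. as Rayleigh-distributed with scale sigma. That distributional assumption is my inference from the numbers quoted above, not a stated property of anyone's equipment:

```python
import math

# Sketch reproducing the probabilities quoted above, assuming each noise-only
# reading is Rayleigh-distributed with scale sigma (the usual model for the
# magnitude of a complex Gaussian noise vector). The Rayleigh assumption is
# inferred from the quoted figures, not stated anywhere in the thread.

def p_any_above(threshold_myr, sigma_myr, n_samples=4):
    """P(at least one of n_samples noise-only readings exceeds threshold)."""
    # Rayleigh survival function: P(R > r) = exp(-r^2 / (2 sigma^2))
    p_single = math.exp(-threshold_myr**2 / (2.0 * sigma_myr**2))
    return 1.0 - (1.0 - p_single) ** n_samples

print(f"sigma = 1.0 Myr: {p_any_above(2.0, 1.0):.0%}")   # ~44%
print(f"sigma = 0.5 Myr: {p_any_above(2.0, 0.5):.2%}")   # well under 1%
```

Under this model, halving sigma drops the four-sample exceedance probability from roughly 44% to a fraction of a percent, matching the two figures above.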
I don't know what Geochron's sigma was; it would depend on their own self-defined accuracy threshold.
Finally, you are wrong about the precision of measurement; the +/- figure would only be there if it meant something, as he explains here:
The results presented in Table 1 of Mr Swenson’s article shows that there was plenty of argon in the Mt St Helens samples. There was an equivalent of 2.8 million years of argon in the pyroxene sample. It was so abundant that the instrument could measure it to a precision of ±0.6 million years. The feldspar sample also had lots of argon and gave an age of 340,000 years. It was some six times more abundant than the precision of measurement which was equivalent to ±60,000 years.
The reason they give the +/- figure is that it states how much room for doubt there is: those are the limits of the test. Think about it: why would they give wiggle room if that wiggle room is false? That is tantamount to saying the evolutionists who tested the rock are LIARS for giving wiggle room of 600,000 years. So it's a special-pleading fallacy to say, "that wiggle room is acceptable with dates much older".
Notice there is a different "wiggle factor" for different dates, but the dates themselves are many times greater than the wiggle factor.
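That "many times greater" claim can be checked directly against the two figures quoted from Table 1 (2.8 +/- 0.6 million years for pyroxene, 340,000 +/- 60,000 years for feldspar):

```python
# The "wiggle factor" point above, in numbers taken from the quoted passage:
# each reported date divided by its stated +/- precision.
samples = {
    "pyroxene": (2.8e6, 0.6e6),    # 2.8 +/- 0.6 million years
    "feldspar": (0.34e6, 0.06e6),  # 340,000 +/- 60,000 years
}
for name, (age, precision) in samples.items():
    print(f"{name}: reported age is ~{age / precision:.1f}x the stated precision")
```

So both dates sit roughly five to six times above their own quoted precision, which is exactly the "some six times more abundant" remark in the quoted article.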
So, Mike, how many individual samples are included in the 600,000 figure? Because that value is always included in their confidence-interval estimates. As I described in detail in the previous thread, the reason for this is that the noise/error component, or vector, is often considered independent from sample to sample, and that's a good thing. If you have a strong, consistent argon signal throughout your samples and they are well above the noise/error floor, then you can reduce your confidence bounds by a factor of roughly SQRT(N) (based on the Central Limit Theorem), where N is the number of samples. However, this becomes less and less true the closer the argon signals get to the noise/error floor defined by the equipment. If the signals are below the noise/error floor, then it doesn't apply at all.
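The SQRT(N) point is just the standard error of the mean; a tiny sketch (the 600,000-year per-sample sigma below is an arbitrary illustrative value, not a claim about Geochron):

```python
import math

# Sketch of the ~SQRT(N) point: if per-sample noise/error is independent with
# standard deviation sigma, the uncertainty of the *mean* of N samples shrinks
# by a factor of sqrt(N). The sigma value here is purely illustrative.

def stderr_of_mean(sigma, n):
    """Standard error of the mean of n independent samples."""
    return sigma / math.sqrt(n)

sigma = 600_000  # illustrative per-sample error, in years
for n in [1, 4, 16]:
    print(f"N = {n:2d}: uncertainty on the mean ~{stderr_of_mean(sigma, n):,.0f} years")
```

The caveat in the paragraph above is the important part: this shrinkage only holds while the per-sample readings are genuine signal, independent errors and all; it says nothing useful once the readings are mostly noise.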
So, in short, Mike, the 2 million and 600,000 year values are not necessarily incompatible as it's somewhat of an apples to oranges comparison. The 600,000 years applies to a given number of individual samples with argon signals well above the noise / error floor which for Geochron seems to have been the amount of argon from ~ 2 million years of decay.
(Think, Norman... think. Don't just collect data in your head; think through the consequences of that data. There should be no argon at all if argon only gets there from decay over millions of years, which logically proves that K-Ar dating can't give you accurate dates, because you don't know how much daughter isotope was present when the rock formed. For all you know, world-scale lava flows from a catastrophe could create even greater inflations. Have you tested that scenario?)
Again, Mike, based on that post of mine, this is completely wrong. Even if there is no argon present, they will always measure some level of argon. It's unavoidable. Now, how much they measure depends on the quality of the equipment.
Norman: "Now, in that light, Geochron's 2 million year disclaimer is relevant to the discussion as it is a description of just that."
WHEN did they make that disclaimer? It contradicts the "wiggle factor" they gave for the Mt St Helens rock.
It's like me saying this: "I love creationists and HATE evolutionists, and I believe all evolutionists will score 10 or below on an IQ test I have made." Now imagine someone anonymously scores 80, and I later find out the person was an evolutionist, so I release a new comment saying, "I believe anyone with an IQ of 85 or less is a retard."
I haven't heard Geochron say anything, all I have heard is some hearsay from your lips, that they said something.
I've read it numerous times. Have I seen it from Geochron themselves? No. I recently went to their website and they did not have that disclaimer up. But, they also don't do K-Ar dating anymore.
You said that they used flame photometry, and most everything that I have read about it describes it as an older (developed in the '40s), cruder (but cheaper) method than spectroscopy. So I don't find the 2 million year figure very hard to believe. But if you wish to think otherwise, that's your business.
As a critical analyser, my questions are therefore these:
1. Did they really say it, and did it mean what Norman thinks it meant?
2. WHEN did they say it? Did they say it in response to claims that young rock has been dated as old?
3. Their own "wiggle" factor means that if they date a rock to 1 million years old with a wiggle factor of 600,000 years, then mathematically and logically that rock MUST contain argon equivalent to at least 400,000 years, which contradicts their claim that "2 million years" is the wiggle factor, because it is sufficient evidence that that amount of argon is present.
I believe their website, and what it says NOW, is no longer relevant to the tests done decades ago, when the website didn't exist and I'm not sure the internet even existed.
If you are saying, "now more modern methods would show these dates are wrong", then you have to apply that logic to all of the dates for rocks which evolutionists say are millions of years old, for if those dates used that method, they also have to be thrown out.
Again, Mike, this is faulty logic; you need to go back and read my post from the other thread where I walk through the math pretty thoroughly. When the argon signal is strong and well above the noise floor, you can measure the amount of argon much more accurately than when the signal is weak and drowned out by the noise/error in the system. It's just good old-fashioned math. If you want me to provide you with real numerical examples, I will.
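Here is one such numerical example, as a simulation sketch in arbitrary, hypothetical units (signal and noise levels chosen purely for illustration): the same additive noise produces a huge relative scatter on a weak argon signal and a small one on a strong signal.

```python
import random
import statistics

# Simulation sketch (hypothetical units): identical additive noise applied to
# a weak signal near the noise floor and to a strong signal well above it.
# The relative scatter of the readings differs by orders of magnitude.

random.seed(42)  # fixed seed so the run is reproducible

def relative_scatter(true_signal, noise_sigma=1.0, n=1000):
    """Std. deviation of n noisy readings, relative to the true signal."""
    readings = [true_signal + random.gauss(0.0, noise_sigma) for _ in range(n)]
    return statistics.stdev(readings) / true_signal

weak = relative_scatter(0.5)     # signal near the noise floor
strong = relative_scatter(50.0)  # signal well above the noise floor
print(f"weak signal:   relative scatter ~{weak:.0%}")
print(f"strong signal: relative scatter ~{strong:.0%}")
```

The absolute scatter is the same in both runs; only the ratio of signal to noise changes, which is the whole argument about why the method behaves differently on very young rocks.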
Using a more modern method, the excess argon would still be present in the Mt St Helens rock. That isn't disputed even by experts. But because you don't know enough about this issue, it's clear to me that you are making guesses about these tests; the tests themselves find certain types of argon present. Austin explains that in his article on Mt St Helens. The argon is there; that isn't debated, which proves argon can be present in new rocks.
Whether or not excess argon is present, and how much, is independent of the method used. The question is how small an amount of argon can be accurately measured. I've read that modern spectroscopy methods can get the lower age bound down to ~100,000 years. If a sample gives you back an answer that is less than that, then it's hard to discern what you are actually measuring.
You see, Norman, it seems to me you think scientists don't admit to excess argon in rocks. It's known about; there are even experiments like this one, which prove 100% that argon can get into rock without any decay, when exposed to heat:
When muscovite (a common mineral in crustal rocks) is heated to 740°-860°C under high Ar pressures for periods of 3 to 10.5 hours it absorbs significant quantities of Ar, producing K-Ar "ages" of up to 5 billion years, and the absorbed Ar is indistinguishable from radiogenic argon (40Ar*).2 In other experiments muscovite was synthesized from a colloidal gel under similar temperatures and Ar pressures, the resultant muscovite retaining up to 0.5 wt% Ar at 640°C and a vapor pressure of 4,000 atmospheres.3 This is approximately 2,500 times as much Ar as is found in natural muscovite. Thus under certain conditions Ar can be incorporated into minerals which are supposed to exclude Ar when they crystallize.
I never made such a claim; there are well-known examples of excess argon as well as argon contamination. The debate here centers on determining the amount of excess argon, or any argon for that matter, that is present in a given sample. And I would suggest that using old, outdated methods and equipment on very young rocks is not the best way to do that.