How? 2
A few days ago, I started reading up on carbon dating (or radiocarbon dating) because it was something I’d discussed with my atheist coworker. Neither of us knew any specifics. My brief research has only brought up more questions. Specifically, the Wikipedia page on radiometric dating claims:
The uranium-lead radiometric dating scheme is one of the oldest available, as well as one of the most highly respected. It has been refined to the point that the error in dates of rocks about three billion years old is no more than two million years.
How do you determine the error of a method when the value it is measuring is unknown in the first place?
The most obvious solution would be to derive the error by using the same method to measure a value that is already known. But it would seem that there are far too many variables between the known values (for instance, the oldest known trees are ~7,000 years old) and the unknown values (in this case, 3 billion years), especially since the world has been undergoing near-constant change for the past several billion years.
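One possibility is that the quoted error isn’t derived from comparisons against known-age samples at all, but from propagating the precision of the isotope measurement through the decay equation itself. Here’s a rough back-of-the-envelope sketch, assuming the standard U-238 → Pb-206 decay relation t = ln(1 + Pb/U)/λ, the published U-238 half-life of about 4.468 billion years, and a made-up 0.1% precision on the ratio measurement; none of these numbers come from the Wikipedia article.

```python
import math

# U-238 -> Pb-206 decay relation: t = ln(1 + Pb/U) / lambda
# Half-life of U-238 (~4.468 billion years) is a published value.
HALF_LIFE_U238 = 4.468e9                  # years
LAMBDA_U238 = math.log(2) / HALF_LIFE_U238

def age_from_ratio(pb_u_ratio):
    """Age in years implied by a measured Pb-206/U-238 ratio."""
    return math.log(1 + pb_u_ratio) / LAMBDA_U238

def age_error(pb_u_ratio, relative_ratio_error):
    """Propagate a relative error in the measured ratio into the age:
    dt = (r / (lambda * (1 + r))) * (dr / r)
    """
    return pb_u_ratio / (LAMBDA_U238 * (1 + pb_u_ratio)) * relative_ratio_error

# A rock that is 3 billion years old implies this Pb/U ratio:
ratio_3gy = math.exp(LAMBDA_U238 * 3e9) - 1      # ~0.59

# Hypothetical 0.1% precision on the isotope-ratio measurement:
print(age_from_ratio(ratio_3gy) / 1e9)           # ~3.0 (billion years)
print(age_error(ratio_3gy, 0.001) / 1e6)         # ~2.4 (million years)
```

A 0.1% measurement error works out to roughly two million years on a three-billion-year-old rock, which is close to the figure Wikipedia quotes. That still leaves open how the half-life itself and the rock’s initial lead content are pinned down.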
Back to the Google.