How?

A few days ago, I started reading up on carbon dating (or radiocarbon dating) because it was something I’d discussed with my atheist coworker. Neither of us knew any specifics. My brief research has only brought up more questions. Specifically, the Wikipedia page on radiometric dating claims:

The uranium-lead radiometric dating scheme is one of the oldest available, as well as one of the most highly respected. It has been refined to the point that the error in dates of rocks about three billion years old is no more than two million years.

How do you determine the error of a method when the quantity it’s measuring is unknown in the first place?

The most obvious approach would be to calibrate the method against samples of known age. But there seems to be far too large a gap between the known values (the oldest known trees, for instance, are ~7,000 years old) and the unknown values (in this case, 3 billion years), especially given an outlook that the world has been undergoing near-constant change for the past several billion years.
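From what I can gather, the quoted uncertainty isn’t found by checking the method against a known 3-billion-year-old sample at all; it comes from propagating the measurement uncertainty in the isotope ratio through the decay equation. Here’s a rough sketch of that idea, assuming a single U-238 → Pb-206 decay with no initial lead and an illustrative 0.1% measurement error in the ratio (real U-Pb work uses concordia diagrams, multiple isotope pairs, and replicate measurements, so treat this as a back-of-the-envelope model, not the actual lab procedure):

```python
import math

# Simplified single-decay sketch, not the full U-Pb concordia method.
# U-238 decays (through a chain) to Pb-206 with a half-life of ~4.468 billion years.
HALF_LIFE_U238 = 4.468e9                 # years
LAMBDA = math.log(2) / HALF_LIFE_U238    # decay constant, per year

def age_from_ratio(pb_per_u):
    """Age implied by a measured Pb-206/U-238 ratio, assuming no initial lead."""
    return math.log(1 + pb_per_u) / LAMBDA

true_age = 3.0e9                              # years (the rock in the Wikipedia quote)
true_ratio = math.exp(LAMBDA * true_age) - 1  # ratio such a rock would show
ratio_error = 0.001                           # assumed: ratio measured to 0.1%

low = age_from_ratio(true_ratio * (1 - ratio_error))
high = age_from_ratio(true_ratio * (1 + ratio_error))

print(f"nominal age: {age_from_ratio(true_ratio) / 1e9:.3f} billion years")
print(f"spread from a 0.1% ratio error: +/- {(high - low) / 2 / 1e6:.1f} million years")
```

Running this, a 0.1% error in the measured ratio works out to roughly ±2.4 million years on a 3-billion-year age, which at least suggests the “no more than two million years” figure reflects measurement precision pushed through the decay math rather than a comparison against samples of known age.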

Back to the Google.

2 Comments so far

  1. n1zyy on October 30th, 2007

    Taking a cue from your post, I asked Google how old the Earth was. A bit over 4 billion years, apparently.

    Sadly, my first thought was, “Oh, of course, that’s the size of a 32-bit integer.”

    And then, “Oh, wait. This is real life.”

    Either that, or THE END IS NEAR!

  2. andrew on October 30th, 2007

    Taking a cue from your post, I asked Google how old the Earth was. A bit over 4 billion years, apparently.

    Proof that Google doesn’t know everything? 😉
