However, rather than accept the biblical account of creation, many Christians have accepted the radioisotope dates of billions of years and attempted to fit long ages into the Bible.
All of our calculations could be correct (observational science), but the result could still be wrong, because we failed to take some critical assumptions into account.
Scientists use observational science to measure the amount of a daughter element within a rock sample and to determine the present observable decay rate of the parent element.
Dating methods must also rely on another kind of science, called historical science. The conditions present when a rock first formed can only be studied through historical science.
If we walk into a room and observe an hourglass with sand at the top and sand at the bottom, we can estimate how long the hourglass has been running. By estimating how fast the sand is falling and measuring the amount of sand at the bottom, we can calculate how much time has elapsed since the hourglass was turned over.
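The hourglass arithmetic above can be sketched in a few lines. The amounts and fall rate below are hypothetical illustration values, not measurements from the text:

```python
def elapsed_time(sand_at_bottom_g: float, fall_rate_g_per_min: float) -> float:
    """Estimate minutes elapsed since the hourglass was turned over.

    Assumes the hourglass started with no sand at the bottom and that
    the sand has fallen at a constant rate the whole time.
    """
    return sand_at_bottom_g / fall_rate_g_per_min

# Hypothetical example: 30 g of sand at the bottom, falling at 2 g/min.
print(elapsed_time(30.0, 2.0))  # 15.0 minutes
```

Note that both assumptions in the docstring mirror the ones the article goes on to question: a known starting condition and a constant rate.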
Once the rock cools, it is assumed that no more atoms can escape, so any daughter element found in the rock is taken to be the result of radioactive decay.
The dating process then requires measuring how much daughter element is in a rock sample and knowing the decay rate, that is, how long it takes the parent element to decay into the daughter element (uranium into lead, or potassium into argon). Half-life is defined as the length of time it takes half of the remaining atoms of a radioactive parent element to decay.
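Under the assumptions just described (no initial daughter atoms, a closed system, and a constant decay rate), the calculation reduces to counting elapsed half-lives. This is a minimal sketch with hypothetical atom counts, not a real laboratory procedure:

```python
import math

def radiometric_age(parent_atoms: float, daughter_atoms: float,
                    half_life_years: float) -> float:
    """Age implied by a parent/daughter ratio and a known half-life.

    Assumes all daughter atoms came from decay (none present at the
    start), nothing entered or left the rock, and the decay rate
    never changed.
    """
    # Original parent count N0 = parent + daughter;
    # elapsed half-lives = log2(N0 / parent remaining).
    initial_parent = parent_atoms + daughter_atoms
    return half_life_years * math.log2(initial_parent / parent_atoms)

# Hypothetical sample: equal parent and daughter counts means exactly
# one half-life has passed (1.25 billion years for potassium-40).
print(radiometric_age(1.0, 1.0, 1.25e9))  # 1250000000.0
```

If any of the three assumptions in the docstring fails, the same measured ratio yields a different true age, which is the point the article develops next.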
If scientists fail to consider each of these three critical assumptions (known initial conditions, no gain or loss of parent or daughter atoms, and a constant decay rate), then radioisotope dating can give incorrect ages.
We know that radioisotope dating does not always work because we can test it on rocks of known age.
Determining how the environment might have affected a rock also falls under historical science. Because radioisotope dating relies on both types of science, we cannot directly measure the age of a rock.