Radioactive dating is the procedure of calculating an age for an artifact by determining how much of a radioactive material has decayed and how long that decay would have taken, given the half-life (the time it takes for half of the material to decay) of the material being tested. Several dozen methods exist, using different radioactive isotopes and decay products, with varied dating ranges and precision. The procedure, however, is difficult; many tests have shown that it can be inaccurate, and it is at times not even considered reliable by mainstream scientists.
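To make the basic arithmetic concrete, here is a minimal sketch in Python for the simplest case of a single parent isotope with a known half-life. The carbon-14 half-life is the commonly cited value; the 25% remaining is an invented example, not a measurement.

```python
import math

# Minimal sketch: solve N(t) = N0 * (1/2) ** (t / half_life) for t,
# given the fraction of the parent isotope still remaining.
def age_from_remaining_fraction(remaining_fraction, half_life):
    """Elapsed time, in the same units as half_life."""
    return half_life * math.log2(1.0 / remaining_fraction)

# Carbon-14 has a half-life of about 5,730 years; a sample retaining 25%
# of its original C-14 has therefore passed two half-lives:
print(age_from_remaining_fraction(0.25, 5730.0))  # -> 11460.0 years
```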

Suppose we have a tank partly filled with water, and a hole in the bottom through which the water is leaking out of the tank.

We wonder how long the hole has been there, that is, how old the hole is.

We could measure (a) how much water the tank holds, (b) how much is still in the tank, and (c) the rate at which it is leaking out.

We can calculate the age of the hole by subtracting (b) from (a) to find out how much water has left the tank, and then dividing this by (c), the rate at which it is leaking out.
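As a sketch of that arithmetic in Python (the figures are invented for illustration):

```python
# Tank analogy: age of the hole = ((a) capacity - (b) remaining) / (c) leak rate.
def hole_age(capacity, remaining, leak_rate):
    """How long the hole has been leaking, assuming the tank started full
    and the leak rate has never varied."""
    water_lost = capacity - remaining   # (a) - (b)
    return water_lost / leak_rate       # ... divided by (c)

# A 100-litre tank with 40 litres left, leaking at 2 litres per hour:
print(hole_age(capacity=100.0, remaining=40.0, leak_rate=2.0))  # -> 30.0 hours
```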

However, in doing so, we have, consciously or subconsciously, made a number of assumptions about other factors that could have affected the calculation: for example, that the tank was full when the leak began, that the leak rate has been constant, and that no water has been added or removed in the meantime.

Unless these factors are known, the calculated dates will not be reliable.

Sometimes a hypothesis must be made that may be plausible but has not been proven.

At other times an additional measurement can eliminate the need for one assumption, although no science can be done without assumptions at some level.

For example, isochron methods do not assume any particular concentration of the daughter isotope in the original sample, but calculate that concentration based on other measurements.

The reliability of that calculation will in turn depend on other conditions.
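As an illustration of how an isochron trades one assumption for extra measurements, here is a simplified Python sketch: several minerals from the same rock are plotted as parent/stable ratio (x) against daughter/stable ratio (y); the slope of the fitted line yields the age, and the intercept yields the initial daughter ratio, which is thus calculated rather than assumed. The sample ratios below are fabricated so that they lie on a line; the decay constant is the commonly cited value for rubidium-87. Real isochron work involves error estimates and goodness-of-fit checks omitted here.

```python
import math
import statistics

def isochron_age(parent_ratios, daughter_ratios, decay_constant):
    """Least-squares line through the points; slope = exp(lambda * t) - 1,
    so t = ln(slope + 1) / lambda. The intercept is the initial
    daughter/stable ratio."""
    mx = statistics.fmean(parent_ratios)
    my = statistics.fmean(daughter_ratios)
    slope = (sum((x - mx) * (y - my) for x, y in zip(parent_ratios, daughter_ratios))
             / sum((x - mx) ** 2 for x in parent_ratios))
    intercept = my - slope * mx
    age = math.log(slope + 1.0) / decay_constant
    return age, intercept

# Fabricated ratios for four mineral samples from one hypothetical rock:
parent = [0.10, 0.20, 0.30, 0.40]
daughter = [0.7010, 0.7020, 0.7030, 0.7040]
age, initial = isochron_age(parent, daughter, decay_constant=1.42e-11)  # Rb-87, per year
print(f"age = {age:.2e} years, initial daughter ratio = {initial:.4f}")
```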

Given the complexity of radioactive dating, confirmation bias can also be a problem.