Despite this, the momentum gained in the two decades prior to 1972 has made 4.5 b.y.
a popularly accepted “universal constant” even though the foundations on which it was based have been virtually removed.
For inorganic materials, such as rocks containing the radioactive isotope rubidium, the amount of the isotope in the object is compared to the amount of the isotope's decay products (in this case strontium).
The object's approximate age can then be calculated from the isotope's known rate of decay.
Although we now recognize many problems with that calculation, the age of 25 m.y. was accepted by most physicists, but considered too short by most geologists. Recognition that radioactive decay of atoms occurs in the Earth was important in two respects.

Principles of Radiometric Dating

Radioactive decay is described in terms of the probability that a constituent particle of the nucleus of an atom will escape through the potential (energy) barrier which bonds it to the nucleus. The energies involved are so large, and the nucleus is so small, that physical conditions in the Earth (i.e. temperature and pressure) cannot affect the rate of decay. The rate of decay, or rate of change of the number N of parent atoms, is proportional to the number present at any time, i.e. dN/dt = -λN, where λ is the decay constant of the isotope.
After the passage of two half-lives only 0.25 gram will remain, and after 3 half-lives only 0.125 gram will remain, etc.
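The decay law behind this sequence is N(t) = N0 e^(-λt), with λ = ln 2 / t_half. A minimal Python sketch (function and variable names are illustrative, not from the text) reproduces the halving at each half-life:

```python
import math

def remaining(n0, half_life, t):
    """Amount of parent isotope left after time t (t in the same units as half_life)."""
    lam = math.log(2) / half_life  # decay constant: λ = ln 2 / t_half
    return n0 * math.exp(-lam * t)

# Starting with 1 gram of a hypothetical isotope with a 100-unit half-life:
for n in (1, 2, 3):
    print(round(remaining(1.0, 100.0, n * 100.0), 3))  # → 0.5, 0.25, 0.125
```

The time units cancel, so the same function works whether the half-life is in years or billions of years.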
To see how we actually use this information to date rocks, consider the following: usually we know the amount, N, of a parent isotope present today, and the amount, D*, of the daughter element produced by its decay.
In 1972 this assumption was shown to be highly questionable.
But this sediment doesn't typically include the necessary isotopes in measurable amounts.
Fossils can't form in the igneous rock that usually does contain the isotopes.
For organic materials, the comparison is between the current ratio of a radioactive isotope to a stable isotope of the same element and the known ratio of the two isotopes in living organisms.
Radiocarbon dating is one such type of radiometric dating.
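For radiocarbon dating, the measured 14C/12C ratio of the sample relative to that of living tissue plays the role of N/N0, so t = -(1/λ) ln(ratio). A hedged sketch, assuming the commonly cited 5730-year half-life of 14C (names are illustrative):

```python
import math

C14_HALF_LIFE = 5730.0  # years; a commonly cited value for the 14C half-life

def radiocarbon_age(ratio_sample_to_living):
    """Age in years from the sample's 14C/12C ratio divided by the living-organism ratio."""
    lam = math.log(2) / C14_HALF_LIFE
    return -math.log(ratio_sample_to_living) / lam

# A sample retaining half the living-organism ratio is one half-life old:
print(radiocarbon_age(0.5))
```

Because the 14C half-life is short, the method only reaches back a few tens of thousands of years, which is why it applies to organic remains rather than to dating ancient rocks.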