How do scientists determine the age of fossils?
A number of methods are used today to date fossils. Most are indirect, meaning that the soil or rock in which a fossil is found is dated, not the fossil itself. The most common way to ascertain the age of a fossil is by determining where it lies in the rock layers. In many cases, the age of the rock can be determined by other fossils within that rock. If this is not possible, certain analytical techniques are used to date the rock layer itself.
One of the basic ways to determine the age of rock is through the use of radioactivity. For example, radioactivity within Earth continuously bombards the atoms in minerals, exciting electrons that become trapped in the crystals’ structures. Scientists exploit this with dating techniques such as electron spin resonance and thermoluminescence. By measuring the number of trapped electrons in a mineral and comparing it to the known rate at which such electrons accumulate, they can calculate how long that population took to build up. This, in turn, gives the age of the rock and of the fossils within it.
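The reasoning behind these trapped-electron methods can be sketched as simple arithmetic: the age is the total accumulated radiation dose divided by the annual dose rate. The function name and the numbers below are illustrative assumptions, not real measurements:

```python
def trapped_charge_age(total_dose_gray: float, annual_dose_gray: float) -> float:
    """Estimate an age in years from a trapped-charge measurement.

    total_dose_gray: total radiation dose recorded by the trapped
        electrons (in grays), inferred from the measurement.
    annual_dose_gray: dose the mineral receives per year (in grays),
        assumed constant over the burial period.
    """
    return total_dose_gray / annual_dose_gray


# Illustrative values: 50 Gy accumulated at 0.002 Gy per year.
age = trapped_charge_age(50.0, 0.002)
print(age)  # 25000.0 years
```

The key assumption, as in the real techniques, is that the dose rate has stayed roughly constant since the mineral was buried.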
There are other methods for determining fossil age. For example, uranium-series dating measures the amount of thorium-230 present in limestone deposits. Limestone deposits form with uranium present but almost no thorium. Because scientists know the rate at which uranium decays into thorium-230, the age of a limestone rock, and of the fossils found in it, can be calculated from the amount of thorium-230 the rock contains.
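In its simplest form, this calculation inverts the decay equation: assuming the limestone started with no thorium-230 and that its uranium is in equilibrium, the thorium-to-uranium activity ratio grows as 1 − e^(−λt), so the age follows from the measured ratio. A minimal sketch under those simplifying assumptions (the half-life value is approximate):

```python
import math

# Approximate half-life of thorium-230 in years (assumed value).
TH230_HALF_LIFE_YEARS = 75_000.0
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YEARS  # decay constant per year


def uranium_thorium_age(th_u_activity_ratio: float) -> float:
    """Age in years from a measured Th-230/U activity ratio.

    Assumes no initial thorium-230 in the limestone and a constant
    uranium supply, so the ratio grows as 1 - exp(-lambda * t).
    """
    return -math.log(1.0 - th_u_activity_ratio) / LAMBDA_230


# A ratio of 0.5 corresponds to one thorium-230 half-life.
print(round(uranium_thorium_age(0.5)))  # 75000
```

Real uranium-series dating corrects for initial thorium and for disequilibrium between the uranium isotopes, but the core logic is this inversion of the decay curve.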