Zeroing in

Science is like an archer getting closer to the target with practice—and an ever-improving view of the remaining discrepancy. That the aim varies as it zeroes in does not make it wrong along the way—and only a fool would think so.

Estimates for the age of the Earth have evolved over time as new scientific methods have been developed and as new data has been collected.

  • From the 1770s to the 1890s, Earth’s age could only be guessed at (scientifically speaking) from a crude understanding of natural processes such as geologic change, planetary cooling, and the salt balance of the oceans, so estimates ranged wildly from a few million to a few billion years.
  • 1905: The physicist Ernest Rutherford suggested that the age of minerals, and hence a minimum age for the Earth, could be estimated from the decay products that accumulate in uranium minerals. His figure of around 500 million years was only a rough guess, intended mainly to prod geologists into the atomic age.
  • 1920s-1950s: The geologist Arthur Holmes pioneered the use of uranium-to-lead decay to date rocks, showing the Earth to be billions of years old; by the mid-1950s, lead-isotope measurements of meteorites had pinned the age near 4.5 billion years. That figure is still accepted today, and the margin of error has only been tightened since.
  • 1950s onward: Continued refinement of radiometric techniques, including uranium-lead and samarium-neodymium dating of meteorites and ancient minerals, has sharpened the estimate to about 4.54 billion years, with a margin of error of around 1% (the basic decay arithmetic behind these methods is sketched just below).
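
To make the arithmetic concrete, here is a minimal Python sketch of the decay calculation behind uranium-lead dating. It assumes a closed system with no initial radiogenic lead, and the isotope ratio passed in is purely illustrative; real age determinations combine several decay chains and isochron methods.

    import math

    # Half-life of uranium-238 (which decays, via a chain, to lead-206),
    # expressed in billions of years (Gyr).
    U238_HALF_LIFE_GYR = 4.468
    DECAY_CONSTANT = math.log(2) / U238_HALF_LIFE_GYR  # per Gyr

    def u_pb_age_gyr(pb206_per_u238: float) -> float:
        """Age implied by the ratio of radiogenic Pb-206 atoms to surviving
        U-238 atoms in a closed system with no initial radiogenic lead:
        t = ln(1 + D/P) / lambda."""
        return math.log(1.0 + pb206_per_u238) / DECAY_CONSTANT

    # Illustrative ratio only: a Pb-206/U-238 ratio a little above 1
    # corresponds to roughly the accepted 4.5-billion-year age.
    print(f"{u_pb_age_gyr(1.02):.2f} Gyr")  # ~4.53 Gyr

The point for the story above is that the age follows from a measured isotope ratio and a laboratory-determined decay constant, which is why the estimates kept sharpening as measurement techniques improved.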

Estimates for the age of the universe have followed the same pattern of successive refinement.

  • 1927-1931: One of the first estimates of the age of the universe came from the astronomer Georges Lemaître, who used Einstein’s theory of general relativity to describe an expanding universe and suggested an age on the order of 10 billion years. The estimate rested on assumptions about the expansion rate of the universe and the amount of matter it contained, and it carried no stated margin of error.
  • 1920s-1930s: Other astronomers, such as Arthur Eddington and Edwin Hubble, proposed estimates ranging from a few hundred million years to several billion years, based on the Hubble constant (the expansion rate of the universe) and the ages of the oldest stars in our galaxy (the simple 1/H0 arithmetic behind such figures is sketched after this list).
  • 1940s-1950s: As nuclear physics and isotope measurements matured, cosmological timescales could be checked against radiometric ages. In the late 1940s, the physicist George Gamow and his colleagues worked with an age of around 2 billion years, a figure tied to the expansion rate accepted at the time and already uncomfortably close to the age of the oldest rocks on Earth. Over the following decade, recalibrations of the cosmic distance scale lowered the Hubble constant, pushing estimates up to 10-20 billion years.
  • 1960s-1970s: The discovery of the cosmic microwave background in 1965 provided strong evidence for the Big Bang theory and helped scientists refine their estimates of the age of the universe. Even so, estimates in the late 1960s and early 1970s still ranged from about 10 to 20 billion years, largely because measurements of the Hubble constant disagreed by roughly a factor of two.
  • 1980s-1990s: More precise measurements of the cosmic microwave background (notably by the COBE satellite), together with better determinations of the Hubble constant and the ages of the oldest star clusters, narrowed the range further. By the late 1990s, estimates had converged on roughly 13-15 billion years, though still with uncertainties of around 10%.
  • 2000s-present: Advances in technology and new observations, such as measurements of the cosmic microwave background radiation by the Wilkinson Microwave Anisotropy Probe (WMAP) and the Planck satellite, have allowed scientists to refine their estimates even further. Current estimates are in the range of 13.7-13.8 billion years with a margin of error of about 0.1-0.2%.
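
Every expansion-based estimate in the list above traces back to the same back-of-the-envelope quantity, the “Hubble time” 1/H0. The minimal Python sketch below, using illustrative values of the Hubble constant, shows why the early, too-high measurements implied a universe younger than the Earth and why modern values land near 14 billion years; the quoted 13.8-billion-year age comes from fitting a full cosmological model to the data, not from this shortcut.

    # Crude "Hubble time" estimate: the age of an expanding universe is of order 1/H0.
    KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
    SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

    def hubble_time_gyr(h0_km_s_per_mpc: float) -> float:
        """1/H0 expressed in billions of years."""
        h0_per_second = h0_km_s_per_mpc / KM_PER_MPC
        return 1.0 / h0_per_second / SECONDS_PER_GYR

    # Hubble's original value (~500 km/s/Mpc) versus a modern value (~67 km/s/Mpc):
    print(f"{hubble_time_gyr(500.0):.1f} Gyr")  # ~2.0 Gyr, younger than the Earth
    print(f"{hubble_time_gyr(67.4):.1f} Gyr")   # ~14.5 Gyr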