Resetting a clock from Earth’s rocks

A refinement in a widely used technique for determining the age of ancient rocks opens up the possibility that Earth may have formed a crust as much as 200 million years earlier than geologists thought.

Scientists can estimate the age of rocks by measuring the proportions of certain radioactive isotopes in them. Most of these isotopes have a half-life (the time it takes for half of a sample of the unstable element to decay) of between 100,000 years and 1 trillion years, says Erik E. Scherer, a geochemist at Münster University in Germany.

One of the isotopes that scientists find most useful for dating some of the oldest rocks is lutetium-176. That form of the rare earth element decays into hafnium-176 and has a half-life of about 37 billion years. The ratio of hafnium-176 to hafnium-177, a nonradioactive form of the element, gives scientists a way to estimate the age of minerals that contain these isotopes. Zircon crystals, which researchers have used to date Earth's crust, are an example, Scherer notes.
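For readers who want to see the arithmetic, here is a minimal sketch of how a half-life translates into an age. It uses the simplified single-sample form of the radiometric age equation (actual lutetium-hafnium dating relies on isochrons built from several isotope ratios), and the daughter-to-parent ratio in the example is purely illustrative:

```python
import math

HALF_LIFE_LU176 = 37e9  # years, the value given above

# Decay constant: the fraction of lutetium-176 atoms that decay per year.
decay_constant = math.log(2) / HALF_LIFE_LU176  # about 1.9e-11 per year

def age_from_ratio(daughter_to_parent: float) -> float:
    """Age in years, assuming every daughter atom comes from in-place decay:
    D/P = exp(lambda * t) - 1, solved for t."""
    return math.log(1.0 + daughter_to_parent) / decay_constant

# An illustrative hafnium-176 to lutetium-176 ratio of 0.08 works out to
# roughly 4.1 billion years under this decay constant.
print(f"{age_from_ratio(0.08) / 1e9:.2f} billion years")
```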

Since the early 1980s, scientists had thought that about 194 out of every 10 trillion lutetium-176 atoms decay each year. Better measurements now put the annual decay rate at about 186 atoms out of 10 trillion. Because a slower decay rate means more time is needed to build up the same amount of hafnium-176, rocks scientists have dated using this technique are actually about 4 percent older than previously thought.
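The 4 percent figure can be checked directly: for a fixed measured isotope ratio, the calculated age scales inversely with the decay rate, so switching rates multiplies every age by the same factor. A quick sketch, using the two rates given above:

```python
OLD_RATE = 194.0  # atoms decaying per year, per 10 trillion (early-1980s value)
NEW_RATE = 186.0  # atoms decaying per year, per 10 trillion (refined value)

# Ages are inversely proportional to the decay constant, so adopting
# the slower rate stretches every lutetium-hafnium age by this factor.
scale = OLD_RATE / NEW_RATE
print(f"Ages increase by about {100 * (scale - 1):.1f} percent")  # ~4.3 percent
```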

The oldest known zircon crystals have a ratio of hafnium-176 to hafnium-177 that's lower than the planet's overall average, Scherer says. Using the previous decay rate for lutetium-176, researchers estimated those crystals to be 4.1 billion years old. With the updated rate, that age rises to 4.3 billion years.
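Applying that same scale factor to the 4.1-billion-year estimate reproduces the revised figure:

```python
scale = 194.0 / 186.0                      # old decay rate over new, as above
print(f"{4.1 * scale:.2f} billion years")  # about 4.28, which rounds to 4.3
```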

In a commentary accompanying the report in the July 27 Science, Jan Kramers, a geophysicist at Bern University in Switzerland, says that this refinement could help fill in details of the era before Earth's crust first solidified.
