Too Much Deuterium?

A chemical mystery in the Milky Way

A new study appears to solve a 35-year-old puzzle among astronomers about the distribution of an isotope forged just after the Big Bang, but it poses new questions about the ways in which stars form in the Milky Way and how the galaxy was built.

GUIDE STAR. Astronomers used a satellite to record the ultraviolet light from the hot star AE Aurigae (blue halo near center of image). From such measurements, they calculated the Milky Way’s abundance of deuterium, an isotope of hydrogen forged just after the Big Bang. T. Rector, B. Wolpa, NOAO, AURA, NSF

Deuterium is a heavy isotope of hydrogen. Because stars consume large amounts of it and no process creates it in significant amounts, the amount of deuterium in the universe declines steadily. The deuterium abundance in the Milky Way is therefore an indicator of how much star formation has occurred over the 12 to 13 billion years since the galaxy’s birth.

Ups and downs

Observations in the 1970s with the Copernicus satellite revealed that deuterium concentrations vary widely across the Milky Way. That finding surprised astronomers, who had expected deuterium to be as evenly spread across the galaxy as normal hydrogen and other light elements are. Many researchers attributed the variations to measurement errors. But a new analysis of measurements recorded by NASA’s Far Ultraviolet Spectroscopic Explorer (FUSE) satellite over the past 6 years confirms the perplexing result. The findings also offer a possible explanation.

According to astronomers working with the FUSE data, the deuterium abundance along different sight lines from Earth varies between 5 and 23 parts per million (ppm) in the galaxy. Moreover, these concentrations are inversely correlated with the amount of undisturbed carbon dust that a region contains. Jeff Linsky of the University of Colorado at Boulder and his colleagues describe the findings in the Aug. 20 Astrophysical Journal.

The results may be explained by a prediction made in 2003 by study coauthor Bruce Draine of Princeton University. He proposed that deuterium atoms are more likely than ordinary hydrogen atoms to attach to dust grains. Once an atom of hydrogen or one of its isotopes binds to dust, the spectrometer at the heart of FUSE can no longer detect it.

Draine’s model also showed that supernova explosions can revaporize deuterium that had attached to dust. According to the model, FUSE observed low amounts of deuterium within 100 light-years of the sun because that region has a high abundance of carbon dust that’s been undisturbed for millions of years, Linsky says.

Last year, Jason Prochaska of the University of California, Santa Cruz and his colleagues linked deuterium concentrations in the Milky Way with the varying abundance of titanium. Nearly all titanium resides in carbon-rich dust grains, so his results are consistent with Draine’s model and the FUSE data. Prochaska’s team reported its findings in the Feb. 10, 2005 Astrophysical Journal Letters.

The studies make a compelling case that dust causes the variations in deuterium concentrations, says astrophysicist Max Pettini of the University of Cambridge in England. Monica Tosi of the INAF-Astronomical Observatory in Bologna, Italy, agrees.

In the clear

If dust hides some deuterium, then the truest reading of the isotope’s galaxywide concentration is the highest one, which would come from dust-free areas, Linsky asserts.

Therein lies a problem. The highest concentration in the galaxy—23 ppm—is uncomfortably close to the primordial value of 27 ppm, derived from several lines of evidence. If those values are correct, all the stars that have ever formed in the Milky Way would have consumed only 15 percent of the original deuterium.
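That 15 percent figure follows directly from the two abundances quoted above. A quick back-of-the-envelope check (the 23 ppm and 27 ppm values are taken from the studies described here; the calculation itself is just the ratio):

```python
# Rough check of the deuterium consumption implied by the quoted abundances.
primordial_ppm = 27.0  # primordial deuterium abundance, parts per million
present_ppm = 23.0     # highest concentration along any FUSE sight line

# Fraction of the original deuterium that star formation has consumed
consumed_fraction = 1.0 - present_ppm / primordial_ppm
print(f"Fraction of primordial deuterium consumed: {consumed_fraction:.0%}")
# prints "Fraction of primordial deuterium consumed: 15%"
```

If the true galaxywide value were instead near the low end of the FUSE range, the implied consumption would be far higher, which is why the choice of the maximum reading matters so much.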

Yet several models maintain that stars have gobbled up 30 to 40 percent of the initial allotment. New calculations from Tosi’s team in the mid-June Monthly Notices of the Royal Astronomical Society support that range.

Taken at face value, the abundance of deuterium proposed by Linsky and his colleagues indicates that either the primordial deuterium abundance was higher than scientists have inferred or there’s something missing from current ideas of how stars form and consume deuterium. The new value is also at odds with well-established correlations between deuterium and measured abundances of other elements in stars and the gases between them.

Tosi and other astronomers say that they’re not convinced that the maximum value of deuterium recorded by FUSE is the value throughout the galaxy. Uncertainties in the craft’s measurements are relatively high, they note. Factors such as stellar and supernova winds might cause small but significant variations in deuterium readings.

Linsky suggests a way out of the dilemma: In cannibalizing nearby galaxies, perhaps the Milky Way incorporated gas richer in deuterium than its original gas.

“It is still too early to make any serious claims about chemical evolution” in the Milky Way, says Prochaska.

But it isn’t too early to speculate about it, says Pettini. The report by Linsky’s team is the most comprehensive study of deuterium in the Milky Way ever conducted, he says. It could prompt astronomers to reassess their view of the evolution of this isotope in the galaxy and its role as a signpost for star formation.
