“[I]t is now again on the increase. It is, and has been for a month, brighter than [the star] Canopus. Half-way indeed between him and Sirius, and very red.”
—Astronomer Charles P. Smyth, from a letter dated Jan. 1, 1845
Late in 1837, a dim Milky Way star called Eta Carinae suddenly blossomed and soon became the second-brightest star in the night sky, after Sirius. During the next 19 years, a period now known as the Great Eruption, the star ejected two billowing, mushroom-shaped gas clouds, each 100 times as wide as our solar system and containing enough mass to make 10 suns.
Eta Carinae suffered another substantial, but less dramatic, eruption in 1890, and evidence is accumulating that it underwent several other severe outbursts during the past 10,000 years. Even today, the star, which tips the scales at a whopping 100 solar masses, hurls the equivalent of two Earths' worth of gas and dust into space each day.
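A quick back-of-the-envelope check puts that figure in the units astronomers usually quote. The masses of Earth and the sun below are standard values, not numbers from the article:

```python
# Standard reference masses (not given in the article itself)
M_EARTH = 5.97e24  # kg, mass of Earth
M_SUN = 1.99e30    # kg, mass of the sun

daily_loss = 2 * M_EARTH           # "two Earths' worth... each day"
yearly_loss = daily_loss * 365.25  # kg shed per year

# Express the rate as a fraction of a solar mass per year
print(yearly_loss / M_SUN)
```

That works out to roughly two-thousandths of a solar mass per year, an enormous rate for a steady wind.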
Although the outbursts have made Eta Carinae a striking spectacle, astronomers have long regarded the star as a freak of nature. Astronomical models indicate that most heavyweights expel large amounts of matter before they die, but that they eject this material relatively slowly, over their entire 3-to-4-million-year lifetimes. But new evidence—a combination of theory and observations—suggests that Eta Carinae’s temperamental behavior may be the norm, not an anomaly, among extremely massive stars.
Simulations indicate that the first stars to form were all extremely massive. Because these stars were the main sources of every element heavier than helium in the early universe, the evidence of widespread temperamental behavior may prompt a new look at how the cosmos acquired its assortment of chemical elements, including those necessary for life. The findings may in particular shed new light on the fate of the first stars in the universe.
“There’s a paradigm shift in our understanding of massive stars” that may affect views of how these stars live and die, the remnants they leave behind, and their contribution to the chemical makeup of the early universe, says Nathan Smith of the University of California, Berkeley.
Winds of change
All massive stars lead short lives. The heaviest ones have a life span only about a thousandth of that of a star such as the sun. They can weigh 50 to 150 times the mass of the sun at birth, but during their life spans, they lose much of that mass via a steady, outgoing wind. Eventually, they die a fiery death in an explosion called a supernova. Before exploding, these stars, then known as Wolf-Rayet stars, have lost their outer atmosphere and slimmed down to a mere 10 to 20 times the mass of the sun.
Until the explosion, the stars burn hydrogen at their cores, transforming it into helium. The nuclear reaction creates ultraviolet (UV) radiation, which ionizes an array of elements in the stars’ outer layers. These ionized atoms absorb the radiation, and the kick imparted by the process blows off gas, creating a continuous, outward flow of matter. But just how much mass those winds carry away is now an open question, Smith and others say.
Astronomers had assumed that the UV-absorbing ions are distributed smoothly and uniformly throughout a star’s outer layers. Calculations using that assumption show that the winds are indeed intense enough to put heavy stars on a slow but steady diet, reducing their mass by tens of solar masses over several million years.
Recent studies indicate that observers may have overestimated the strength of stellar winds. The new data show that the material that absorbs radiation is unevenly distributed in the atmospheres of stars.
Researchers measure the brightness of a star to deduce the amount of mass carried off by a wind. The greater the emission, the larger the mass loss. But an atmosphere that consists of dense clumps of ions will radiate more strongly than an atmosphere containing the same amount of material distributed more uniformly. If astronomers don’t account for the higher intensity of light emitted by a clumpier atmosphere, they can be fooled into thinking that the wind carries away more mass than it really does.
“Currently accepted mass-loss rates may need to be revised downward as a consequence of previously neglected clumping,” note Joachim Puls of the University Observatory in Munich and his colleagues in a review article recently posted on the Internet (http://xxx.lanl.gov/astro-ph/0607290).
According to Smith, Stanley P. Owocki of the University of Delaware in Newark, and other researchers, the total wind may be only one-tenth as strong as models had indicated. That’s too gentle to blow away all the matter that astronomers know must be expelled by massive stars before they explode.
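The scaling behind that factor of ten can be sketched simply. The wind emission used to infer mass loss depends on the square of the gas density, so a clumpy wind outshines a smooth wind of the same average density. This is a simplified textbook scaling, not the researchers' own calculation:

```python
import math

def inferred_over_true_mdot(clumping_factor):
    """Factor by which a smooth-wind analysis overestimates mass loss.

    Density-squared diagnostics scale as clumping_factor * Mdot**2
    at fixed mean density, so treating a clumped wind as smooth
    inflates the inferred mass-loss rate by sqrt(clumping_factor).
    """
    return math.sqrt(clumping_factor)

# A clumping factor of ~100 would make winds look 10 times stronger
# than they really are, consistent with the one-tenth figure quoted
# by Smith, Owocki, and others.
print(inferred_over_true_mdot(100.0))
```

On this scaling, a perfectly smooth wind (clumping factor of 1) needs no correction at all; the overestimate grows only as the square root of the clumpiness.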
“Steady winds are simply inadequate for the envelope shedding needed to form a Wolf-Rayet star,” Smith and Owocki note in the July 1 Astrophysical Journal Letters. Smith also reviewed the evidence for episodic outbursts among heavyweights in May at a meeting on massive stars at the Space Telescope Science Institute in Baltimore.
Is odd normal?
Instead of the weight-loss-by-wind theory, researchers now propose that most extremely massive stars slim down by undergoing extraordinarily violent eruptions like the one that convulsed Eta Carinae in the mid-1800s.
The eruptions would occur during the era just before a massive star enters its Wolf-Rayet phase. During this stage, heavy stars such as Eta Carinae are known as luminous blue variables. This phase lasts for less than 100,000 years—a mere blink in astronomical time.
One eruption wouldn’t be enough to shed all the mass, but several at different times during the entire luminous-blue-variable era would suffice, Smith proposes.
His idea is more than just a pie-in-the-sky theory. For example, nested shells of material surrounding the mushroom-shaped clouds recently cast out by the star suggest that Eta Carinae had in fact suffered previous outbursts over several thousand years. There’s compelling circumstantial evidence that the star had undergone eruptions similar to the one witnessed some 150 years ago, says theorist Mario Livio of the Space Telescope Science Institute.
Astronomers don’t fully understand what set off Eta Carinae, but a few other stars seem to have undergone similar outbursts. A Milky Way star called P Cygni, which brightened and shed a tenth of a solar mass in 1600, may have undergone even fiercer outbursts over the past few thousand years, Smith and others note. Furthermore, astronomers have recently identified in other galaxies several stars that they call “supernova impostors.” These stars haven’t yet blown themselves to bits in supernovas, but their eruptions are extremely bright and energetic.
Indeed, some of these stars resemble models of what an Eta Carinae–like eruption might look like a few thousand years after it happened, notes astrophysicist Paul Crowther of the University of Sheffield in England.
What’s more, shells of material that surround some bona fide supernovas indicate that these once-massive stars ejected large amounts of material only a few thousand years before they exploded.
The challenge in proving Smith’s hypothesis, adds Crowther, is the brevity of the luminous-blue-variable era. Massive stars are rare, and it’s hard to find one that is actually in that brief phase of evolution. “We have only a very small number of these objects to play with,” Crowther says.
While acknowledging the merits of Smith’s work, Crowther says that he’s not entirely swayed by the arguments. In the old scenario, he notes, winds accounted for all the mass lost by heavy stars. In the new picture, powerful eruptions over a short time either replace or overshadow the wind scenario.
“Nathan [Smith] sells a great story,” says Crowther, but “I think the reality is somewhere in between” those two pictures.
The recognition that massive stars may shed a significant amount of their heft through brief eruptions is likely to change the way astronomers think of these heavyweights, says Crowther.
The presumed temperature, composition, turbulence, and other properties of these stars must differ if they expel most of their mass in a few late-stage, concentrated bursts rather than steadily throughout their lives, Smith agrees. Those properties, in turn, determine when a star finishes burning its main fuel, hydrogen, and how long it lives.
Determining when in its life a massive star spews material is crucial for understanding the chemical composition of the universe. Stars produce heavier elements as they age, fusing hydrogen into helium, helium into carbon, and carbon into oxygen. A steady wind driving out material early in the life of a star would litter the cosmos with lighter elements than would a series of late-stage eruptions. It’s too soon to tell exactly how this would alter estimates of the chemical contents of the cosmos, but the consequences are likely to be most dramatic early in the universe, says Smith.
Current theory holds that the first stars in the universe were much heavier than stars are today and that they ranged from 100 to several hundred times the mass of the sun. Those early stars contained only the elements forged in the aftermath of the Big Bang: hydrogen, helium, and traces of lithium. According to the old stellar-wind model of mass loss, the first stars wouldn’t have shed any material before dying as supernovas because the winds are generated only by the absorption of radiation by heavier elements.
In Smith’s eruption model, these stars would expel some material a few thousand years before they die as supernovas rather than stockpiling all of it until the bitter end. That’s because the proposed eruptions don’t depend on whether the star has made heavy elements. “The main question is whether a [first-generation] massive star shed most of its mass before or during a supernova event,” notes Smith.
A series of episodic Eta Carinae–like eruptions would have decreased the weight of early stars before they finally exploded as supernovas, perhaps influencing the type of supernovas they became.
According to most theorists, many of the heaviest stars in the early universe were obliterated by their explosions, which blasted into space every heavy element they had forged. But with less mass, some of those stars would be more likely to have left behind an ultradense cinder—a black hole—when they became supernovas. Several solar masses of iron and perhaps a few other heavy elements would then be trapped inside the black holes, never making their way out into interstellar space.
Livio says that if eruptions such as those of Eta Carinae were common among the most-massive stars, “it may change significantly the [assumed] end products of those stars, including black holes and supernovas.”