A plasma cocoon lets growing stars keep their X-rays to themselves. Laboratory experiments that mimic maturing stars show that streams of plasma splash off a star’s surface, forming a varnish that keeps certain kinds of radiation inside.
That coating could explain a puzzling mismatch between X-ray and ultraviolet observations of growing stars, report physicist Julien Fuchs of École Polytechnique in Paris and colleagues November 1 in Science Advances.
Physicists think stars that are less than 10 million years old grow by drawing matter onto their surfaces from an orbiting disk of dust and gas. Magnetic fields shape the incoming matter into columns of hot, charged plasma. The same disk will eventually form planets (SN Online: 11/6/14), so knowing how quickly a star gobbles up its disk can help reveal what kinds of planets can grow there.
When disk matter hits a stellar surface, the matter heats to millions of degrees Celsius and should emit a lot of light at ultraviolet and X-ray wavelengths. Measuring that light helps scientists infer how fast the star is growing. But previous observations found that such stars emit between four and 100 times fewer X-rays than predicted.
One theory is that something about how a star feeds absorbs the X-rays. So Fuchs and his colleagues re-created the feeding process in a lab. First, the team zapped a piece of PVC, standing in for the edge of the disk, with a laser to create plasma similar to the columns that feed stars. In space, a star’s gravity pulls the plasma onto its surface at speeds of about 500 kilometers per second, and the star’s strong magnetic field guides the charged plasma into organized columns millions of kilometers long.
There’s not enough room or gravity in the lab to reproduce that exactly, but the plasma physics is the same on smaller scales, Fuchs says. His team applied magnetic fields up to 100,000 times stronger than Earth’s to the plasma to shape it into columns and accelerate it to the same speed it would have in space. The researchers placed a target made of Teflon representing the star’s surface just 11.7 millimeters away from the PVC, a distance equivalent to about 10 million kilometers in space.
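The scaling described above can be sanity-checked with quick arithmetic. A minimal sketch, using only figures quoted in the article (the derived numbers are illustrations, not values from the paper):

```python
# Lab-to-space scaling check, using figures quoted in the article.

lab_distance_m = 11.7e-3           # PVC source to Teflon target: 11.7 mm
space_distance_m = 10e6 * 1e3      # ~10 million kilometers, in meters

# Geometric scale factor between the experiment and a real accretion column
scale = space_distance_m / lab_distance_m
print(f"length scale factor: {scale:.2e}")

# Earth's surface magnetic field is roughly 50 microtesla, so a field
# 100,000 times stronger is on the order of a few tesla.
earth_field_T = 50e-6
lab_field_T = 1e5 * earth_field_T
print(f"applied field: ~{lab_field_T:.0f} tesla")
```

So every millimeter in the chamber stands in for nearly a billion kilometers of accretion column, while the applied field sits in the few-tesla range that laboratory pulsed magnets can reach.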
When the plasma hits the Teflon surface, it begins to ooze sideways. But the magnetic field that confines the plasma in a column halts that spreading. Plasma and magnetic field push against each other until the mounting pressure between them forces the plasma to curve away from the surface and flow back up the column, coating the incoming plasma with outgoing plasma.
“This cocoon is building up,” Fuchs says. It absorbs enough X-rays to explain the surprisingly wimpy X-ray emission of growing stars, the experiment found. The team also compared the experiment setup with computer simulations of feeding stars to show that the lab configuration was a good representation of real stars.
The comparison with computer simulations makes the experiment more reliable, says experimental physicist Gianluca Gregori of the University of Oxford. “There is this reality check,” he says. “In the astrophysical community, there’s a tendency to think that there are observations, and there are simulations. But what this paper tells is that there are other ways you can understand what happens in the universe.”
A THIN VENEER This simulation of a laboratory setup that mimics a feeding star shows how a column of plasma (center, yellow and blue) splashes off a surface representing a star. The plasma spreads upward and coats the original column. The left side of the simulation shows plasma density; the right side shows temperature. G. Revet et al/Science Advances 2017