Units of measure are getting a fundamental upgrade

Metrologists are revamping units using fundamental constants of nature

TAKING MEASURE  New units based on fundamental properties of the universe will make measurements more precise. A kilogram cylinder replica (left) is safeguarded under bell jars. Ultrasmooth silicon spheres (right) will soon be used to redefine the kilogram using the Planck constant — making the kilogram prototype obsolete.

epa european pressphoto agency b.v./Alamy Stock Photo

If scientists had sacred objects, this would be one of them: a single, closely guarded 137-year-old cylinder of metal, housed in a vault outside of Paris. It is a prototype that precisely defines a kilogram of mass everywhere in the universe.

A kilogram of ground beef at the grocery store has the same mass as this one special hunk of metal, an alloy of platinum and iridium. A 60-kilogram woman has 60 times the cylinder’s mass. Even far-flung astronomical objects such as comets are measured relative to this all-important cylinder: Comet 67P/Churyumov–Gerasimenko, which was recently visited by the European Space Agency’s Rosetta spacecraft (SN: 2/21/15, p. 6), has a mass of about 10 trillion such cylinders.

But there’s nothing special about that piece of metal, and its mass isn’t even perfectly constant — scratches or gunk collecting on its surface could change its mass subtly (SN: 11/20/10, p. 12). And then a kilogram of beef would be slightly more or less meat than it was before. That difference would be too small to matter when flipping burgers, but for precise scientific measurements, a tiny shift in the definition of the kilogram could cause big problems.

That issue nags at some researchers. They would prefer to define important units — including kilograms, meters and seconds — using immutable properties of nature, rather than arbitrary lengths, masses and other quantities dreamed up by scientists. If humans were to make contact with aliens and compare systems of units, says physicist Stephan Schlamminger, “we’d be the laughingstock of the galaxy.”

To set things right, metrologists — a rare breed of scientist obsessed with precise measurements — are revamping the system. Soon, they will use fundamental constants of nature — unchanging numbers such as the speed of light, the charge of an electron and the quantum mechanical Planck constant — to calibrate their rulers, scales and thermometers. They’ve already gotten rid of an artificial standard that used to define the meter — an engraved platinum-iridium bar. In 2018, they plan to jettison the Parisian kilogram cylinder, too.

Fundamental constants are among the most precisely measured quantities known and so seem ideal for defining the units. But the constants remain enigmatic. The fine-structure constant, in particular, has mystified physicists since it first emerged as an important quantity in scientists’ equations 100 years ago. Every time electrically charged particles attract or repel — anywhere in the universe — the fine-structure constant comes into play. Its value sets the strength of the charged particles’ push and pull. Nudge its value by a few percent and stars would produce much less carbon, the basis of all known life. Tweak it even more, and stars, molecules and atoms would not form. It seems as if its value were handpicked to allow life in the universe.

The values of other fundamental constants are likewise unexplainable — all scientists can do is measure them. “Nobody has any idea why these quantities have the numerical values they do,” says theoretical physicist John Barrow of the University of Cambridge.

This uncertainty swirling around the constants could cause headaches for metrologists. No law of physics prohibits the constants from changing ever so slightly in time or space — though scientists haven’t found conclusive evidence that they do. Some controversial measurements suggest, however, that the fine-structure constant may be different in different parts of the universe. That could mean that the constants used to define the units vary, too, which could muck up the orderly system that metrologists are preparing to adopt.

A hairline crack

The kilogram isn’t the only unit that gives scientists indigestion. A second culprit is the kelvin, the unit of temperature.

“It’s bonkers,” says physicist Michael de Podesta of the National Physical Laboratory in Teddington, England. “Humanity’s standard for temperature is the level of molecular jiggling at a mystical point.” That point — which much like the sacrosanct kilogram prototype is an arbitrary quantity chosen by humans — is the triple point of water, a particular temperature and pressure at which liquid, gas and solid phases of water coexist. This temperature is set to 273.16 kelvins (0.01° Celsius).

Then there’s the ampere, or amp, which quantifies the flow of electrical current that juices up laptops and lightbulbs. “We’ve agonized over the years on the definition of an ampere,” says Barry Inglis, president of the International Committee for Weights and Measures. The present definition is wonky: It is the current that, when flowing through two infinitely long, infinitely thin wires placed one meter apart, would produce a force between them of exactly 2 × 10⁻⁷ newtons per meter of wire. Such wires are impossible to produce, of course, so it is impractical to create a current that is precisely one amp in this way. This problem makes current-measuring equipment difficult to calibrate well. It’s not a problem for household wiring jobs, but it’s no good when the highest level of precision is needed.
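That number traces to the textbook force law for two long parallel currents. With the conventional value μ₀ = 4π × 10⁻⁷ newtons per ampere squared, one ampere flowing in each wire, one meter apart, works out to:

```latex
\frac{F}{L} = \frac{\mu_0 I_1 I_2}{2\pi d}
            = \frac{(4\pi \times 10^{-7}\,\mathrm{N/A^2})(1\,\mathrm{A})(1\,\mathrm{A})}{2\pi\,(1\,\mathrm{m})}
            = 2 \times 10^{-7}\,\mathrm{N/m}
```

In the new system, the ampere will instead be pinned to the elementary charge: one ampere is a set number of electron charges flowing past per second.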

These examples explain the discomfort that surrounds the units that are so fundamental to science. “There’s this hairline crack in the foundation, and you cannot build your building of physics on that foundation,” says Schlamminger, of the National Institute of Standards and Technology in Gaithersburg, Md.

To seal the crack, scientists are preparing to update the International System of Units, or SI, in 2018. The kilogram, kelvin, ampere and mole (the unit that quantifies an amount of material) will all be redefined using related constants. These include the Planck constant, which describes the scale of the quantum realm; the Boltzmann constant, which relates temperature and energy; the Avogadro constant, which sets the number of atoms or molecules that make up a mole; and the magnitude of the charge of an electron or proton, also known as the elementary charge. The new units will be based on the modern understanding of physics, including the laws of quantum mechanics and Einstein’s theory of special relativity.
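In rough outline, each troublesome unit will hang off one fixed constant. The exact digits were still being settled as this article went to press, but the approximate pairings look like this:

```latex
h   \approx 6.62607 \times 10^{-34}\ \mathrm{kg\,m^2/s} \quad\rightarrow\quad \text{kilogram}
k_B \approx 1.38065 \times 10^{-23}\ \mathrm{J/K}        \quad\rightarrow\quad \text{kelvin}
e   \approx 1.60218 \times 10^{-19}\ \mathrm{A\cdot s}   \quad\rightarrow\quad \text{ampere}
N_A \approx 6.02214 \times 10^{23}\ \mathrm{mol^{-1}}    \quad\rightarrow\quad \text{mole}
```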

Scientists went through similar unit acrobatics when they redefined the meter in terms of a fundamental constant — the speed of light, a speed that is always the same.

In 1983, the meter became the distance light travels in a vacuum in 1/299,792,458th of a second (SN: 10/22/83, p. 263). This long string of digits came from the increasingly precise measurements of the speed of light made over centuries. Scientists settled on a value of exactly 299,792,458 meters per second, which then defined the meter. The other units will now undergo similar redefinitions.
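The logic is worth spelling out, because the kilogram will follow the same pattern. The second is realized by a cesium clock, the speed of light is declared exact, and the meter is then whatever length makes the arithmetic close:

```latex
1\ \mathrm{m} = c \times \frac{1}{299{,}792{,}458}\ \mathrm{s},
\qquad c \equiv 299{,}792{,}458\ \mathrm{m/s}\ \text{(exact, by definition)}
```

Any lab that can time a light pulse can, in principle, realize the meter; no trip to Paris required.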

The shakeup of so many units is a “once in a lifetime” change, says NIST physicist David Newell. But most people won’t notice. Definitions will flip, but the changes will be orchestrated so that the size of a kilogram or a kelvin won’t change — you won’t have to worry about overpaying at the salad bar.

Although the changes are mostly under the hood, their advantages are more than philosophical. In the current system, masses much larger or smaller than a kilogram are difficult to measure precisely. Pharmaceutical companies, for example, need to measure tiny fractions of grams to dole out minute drug doses. Those masses can be a millionth the mass of the kilogram prototype cylinder, and each step in scaling down from the standard compounds the uncertainty in the measurement. The new system will tie masses to the Planck constant instead, allowing for more precise measurements of masses both large and small.

 In 2018, at a meeting of the General Conference on Weights and Measures, metrologists will vote on the SI revision and will likely put it into practice. The new system is expected to be a welcome change. “Obviously, a system where you take a lump of metal and say, ‘this is a kilogram,’ is not very fundamental,” says physicist Richard Davis of the International Bureau of Weights and Measures in Sèvres, France. “Why would anyone spend their life trying to measure an atom in terms of that?”

The new kilogram

To retire the Paris-based prototype, scientists must agree on a number for the Planck constant. Its value is about 6.62607 × 10⁻³⁴ kilograms times meters squared per second. But scientists need to measure it with extreme precision — within 2 millionths of a percent, out to about seven decimal places — and several independent measurements need to agree. Once that’s done, scientists will fix the value of the Planck constant. Since the meter is already defined (by the speed of light) and the second is defined (using a cesium atomic clock), fixing the value of the Planck constant will define what a kilogram is.
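Rearranging the units makes the plan concrete. The Planck constant carries units of kilograms times meters squared per second, so once its numerical value is fixed (the digits below are the approximate 2016 value, not the final fixed one), the kilogram is whatever mass balances the equation:

```latex
1\ \mathrm{kg} = \frac{h}{6.62607 \times 10^{-34}\ \mathrm{m^2/s}}
```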

Several teams are using different techniques to zero in on the Planck constant (SN: 9/5/15, p. 15). The first compares electromagnetic forces to the force imposed by gravity, using a tool called a watt balance. Schlamminger’s group and others are in the final stages of refining such measurements. Thanks to precise quantum mechanical methods of producing voltages, an object’s mass can be directly related to the Planck constant.
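In rough outline, a watt balance runs in two modes, and an awkward geometry factor cancels between them. In weighing mode, the pull of gravity on the mass is balanced by the force on a current-carrying coil in a magnetic field; in moving mode, the same coil is driven through the field and the induced voltage is recorded:

```latex
mg = BLI \ \ \text{(weighing)}, \qquad U = BLv \ \ \text{(moving)}
\quad\Longrightarrow\quad mgv = UI
```

The product BL, the field-and-geometry factor that is hard to know precisely, drops out. And because the voltage and current are realized through the Josephson and quantum Hall effects, two quantum phenomena governed by the Planck constant, the electrical side reduces to h times precisely measured frequencies, tying the mass directly to h.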

In a complementary effort, scientists are measuring the Planck constant using carefully formed and stunningly shiny spheres of pure silicon. “The principle is quite easy,” says metrologist Horst Bettin of the German national metrology institute, Physikalisch-Technische Bundesanstalt in Braunschweig. “We are just counting the number of atoms.”

Atoms in the sphere are perfectly spaced in a 3-D crystal grid, so the number of atoms can be deduced from the volume of the sphere. The result, a measurement of the Avogadro constant, can then be used to calculate the Planck constant, using precise measurements of other fundamental constants — including the Rydberg constant, which is related to the energy needed to ionize a hydrogen atom. To make this measurement, the spheres must be impressively round so that the number of atoms inside can be calculated. “The Earth would be as round as our spheres if the highest mountain [were] a few meters high,” Bettin says.
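A back-of-the-envelope version of that atom counting fits in a few lines of Python. The numbers below are illustrative stand-ins near the real ones (a one-kilogram sphere of enriched silicon-28, about 93.7 millimeters across); the actual experiment pins each quantity down to many more digits:

```python
import math

# Illustrative values, close to the real Avogadro-project spheres;
# the experiment measures each of these to far higher precision.
sphere_mass = 1.0          # kg: the spheres are nominally one kilogram
diameter = 0.0937          # m: about 93.7 mm across
lattice_param = 5.431e-10  # m: edge of silicon's cubic unit cell
molar_mass = 27.977e-3     # kg/mol: enriched silicon-28
ATOMS_PER_CELL = 8         # silicon's diamond-cubic unit cell holds 8 atoms

# Volume of the sphere, from its measured diameter.
volume = (4.0 / 3.0) * math.pi * (diameter / 2.0) ** 3

# Count atoms: unit cells that fit in the sphere, times 8 atoms per cell.
n_atoms = ATOMS_PER_CELL * volume / lattice_param**3

# Avogadro constant: atoms per mole = atoms in the sphere / moles in the sphere.
n_avogadro = n_atoms * molar_mass / sphere_mass
print(f"{n_atoms:.3e} atoms  ->  N_A ~ {n_avogadro:.4e} per mole")
```

Even with these rounded inputs, the result lands within about a tenth of a percent of the accepted 6.022 × 10²³ per mole; the Planck constant then follows from the Avogadro constant via the Rydberg constant and other precisely measured quantities.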

Questioning the constants

Imagine a universe where the speed of light changes drastically from day to day. If metrologists’ new system of units were in use there, “today’s meter would be different than tomorrow’s meter,” says Schlamminger — clearly not an ideal situation. In our universe, however, no one has found solid evidence of variation, so if variation exists, it would be too small to have a practical impact on the units.

But if the constants weren’t constant, physics would be in trouble. The whole of physics is founded on laws that are assumed to be unchanging, says physicist Paul Davies of Arizona State University in Tempe.

Physicists have found possible signs of fickleness in the fine-structure constant (SN: 11/16/13, p. 11). If true, such measurements suggest that charged particles tug differently in far-flung parts of the universe.

The fine-structure constant is an amalgamation of several other constants, including the charge of the electron, the speed of light and the Planck constant, all smooshed together into one fraction, with a value of about 1/137. Its value is the same regardless of the system of measurement because the fraction is a pure number, with no units.
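Written out, the combination is:

```latex
\alpha = \frac{e^2}{4\pi \varepsilon_0 \hbar c} \approx \frac{1}{137.036}
```

Here e is the elementary charge, ε₀ the electric constant, ħ the Planck constant divided by 2π, and c the speed of light. Every unit in the numerator cancels against one in the denominator, which is why alien metrologists, whatever their rulers and clocks, would measure the same 1/137.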

Scientists keep track of the fine-structure constant using quasars — brilliant cosmic beacons produced by distant supermassive black holes. On its way toward Earth, a quasar’s light passes through gas clouds, which absorb light of particular frequencies, producing gaps in the otherwise smooth spectrum of light. The locations of these gaps depend on the fine-structure constant. Variations in the spacing of the gaps in space or time could indicate that the fine-structure constant has changed.
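In the “many-multiplet” technique Webb’s group uses, each absorption line comes with a calculated sensitivity coefficient, usually written q, so a change in the constant shifts different lines by different amounts. One standard way of writing the relationship (a sketch of the method, not the full analysis) is:

```latex
\omega_z = \omega_0 + q\left[\left(\frac{\alpha_z}{\alpha_0}\right)^2 - 1\right]
```

where ω₀ is a transition’s frequency measured in the lab, α₀ is today’s value of the fine-structure constant, and ωz and αz are the values in a gas cloud at redshift z. Because q varies from line to line, even in sign, a genuine change in α distorts the pattern of gaps in a way that an ordinary Doppler shift cannot mimic.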

In 2011, scientists reported tantalizing hints that the fine-structure constant changes. In Physical Review Letters, astrophysicist John Webb of the University of New South Wales in Sydney and colleagues reported that the fine-structure constant increases in one direction on the sky, and decreases in the opposite direction, as if there were a special axis running through the universe (SN Online: 9/3/10). The claim is controversial and Webb counts himself as one of the skeptics. “This is radical, obviously, and when you come out with a discovery like that, of course you don’t believe it.” But, he says, despite his best efforts to disprove it, the variation remains.

If confirmed, the result would have enormous consequences. “It’s a tiny effect,” says Davies, but “I think it would come as a great shock, because people really do want to think that the laws of physics are absolutely immutable. The idea that they could somehow vary makes most physicists feel seriously uneasy.”

Some scientists have come up with a more comforting explanation for the hints of variation in the constant. Michael Murphy of Swinburne University of Technology in Melbourne, Australia, suggests that telescope calibration issues could be to blame for the changing fine-structure constant. (Murphy knows the ins and outs of the measurement — he was a coauthor on the 2011 paper reporting fine-structure constant changes.) Using measurements free from calibration issues, the fine-structure constant stays put, Murphy and colleagues reported in September in Monthly Notices of the Royal Astronomical Society. The quasars studied for the September paper, however, don’t rule out variation in the part of the sky observed in 2011.

Other puzzles in physics might be connected to possible variation in the constants. Scientists believe that most of the matter in the universe is of an unseen form known as dark matter. In a paper published in Physical Review Letters in 2015, physicists Victor Flambaum and Yevgeny Stadnik of the University of New South Wales showed that dark matter could cause the fundamental constants to change, via its interactions with normal matter.

And a shifting speed of light could revise current views about the evolution of the infant universe. Scientists think that a period of inflation caused the newborn cosmos to expand extremely rapidly, creating a universe that is uniform across vast distances. That uniformity is in line with observations: The cosmic microwave background, light that emerged about 380,000 years after the Big Bang, is nearly the same temperature everywhere scientists look. But cosmologist João Magueijo of Imperial College London has a radical alternative to inflation: If light were speedier in the early universe, it could account for the universe’s homogeneity.

“As soon as you raise the speed limit in the early universe,” Magueijo says, “you start being able to work on explanations for why the universe is the way it is.”

A finely tuned universe

To the consternation of many physicists, whose equations are riddled with fundamental constants, these quantities cannot be calculated directly from physical principles. Scientists don’t know why electrons attract and repel other charged particles with the strength they do; all they can do is measure the strength of the force and plug that number into formulas. Such black boxes detract from the elegance of scientists’ theories, which attempt to explain the universe from the bottom up.

Particularly troubling is the fact that the precise values of these constants are essential to the formation of stars and galaxies. If, during the birth of the universe, certain constants — in particular the fine-structure constant — had been just slightly different, they would have set the cosmos on a path to being empty and barren.

As a result, many scientists believe that there must be some deeper theory that constrains their values. But recent attempts to come up with such a theory have been stymied, says theoretical physicist Frank Wilczek of MIT. “Progress has been pretty limited in the last few decades.”

Some scientists have begun turning to an alternative explanation: These constants may not be fine-tuned, but randomly chosen, with a roll of the dice that occurred many times in many universes, or in different parts of our own. “We’ve really changed our view of fundamental constants. They’re less rigidly defined and ultimate,” says Barrow.

There may be other universes, or faraway pockets of our own universe, that have very different constants. In those places, life might not be able to survive. Just as diverse life sprang up on Earth, with its favorable climate, and not on Mars, we may simply live in a universe whose constants are amenable to life because that’s the only kind of place where life could gain a foothold.

There’s also an increasing mismatch between what’s known experimentally and theoretically about the constants. Although scientists are measuring them to painstaking precision, with experimental errors measured in parts per billion, the origins of the constants remain completely unexplained.

As metrologists attempt to build their system on stronger foundations by pinning the units to the constants, those very foundations may yet shift. Shifting constants would make the system of units less appealingly neat and tidy. The system of units will have to evolve as knowledge of physics advances, says Newell. “Then, you can turn around and use that measurement system to explore further the world around us.”


This article appears in the November 12, 2016, issue of Science News with the headline, “Constant connections: New units based on fundamental properties of the universe will make measurements more precise.”
