In no field of science is the gulf between appreciation and importance as wide as it is for metrology.
It’s not about the weather. Metrology is the science of measuring. It has a longer history than the modern sciences taught in school, and it’s essential to all of science’s usefulness and power. Without sound metrology, there’d be no trips to the moon, no modern medicine, no self-driving cars, no baseball analytics and no decent weather forecasts. (OK, so sometimes it is about the weather.) And even without science, metrology has earned its keep for millennia in the service of trade and commerce, ensuring that weights and volumes of produce and other products could be standardized to make fraud a little harder for the fraudsters.
May 20 marks the latest high point in metrology’s long history, with the official adoption of new definitions for some of science’s most important measuring units, including the kilogram, the standard measure of mass. Those changes reflect revisions in Le Système International d’Unités (or SI), the modern version of the metric system. As overseen by the Bureau International des Poids et Mesures, SI is based on seven “fundamental” units from which other units of measure are derived. Besides the kilogram, newly defined basic units include the kelvin (for temperature), ampere (electric current) and the mole (quantity of material). Unchanged are the second (time), meter (length) and candela (luminous intensity).
This latest SI revamp represents an advance for science, but it’s only the latest of several historic landmarks for metrology. There are too many to list, but that’s no reason not to count down the top 10 momentous metrology moments of all time (or at least for as long as time has been measured).
10. Invention of anatomical units (a long time ago)
Anatomy-based units originated with early human civilizations, probably near the time of the beginning of agriculture. Units of volume, such as the mouthful and handful, preceded later units such as tablespoons, cups and pints. For length, human feet have been around as long as humans; the “foot” used by the ancient Egyptians was a tad under 12 (modern) inches. But even as late as the 1700s, the foot varied in some countries from as little as 10 modern inches to as long as 14.
Of the other early anatomical units, the cubit — based on the length of a forearm — was once the most widely used. It probably originated in the Middle East and was mentioned in the Epic of Gilgamesh, written before 2000 B.C. By most accounts, the cubit was supposed to be the length from the elbow to the tip of the middle finger, but that of course led to various actual lengths, ranging from under 18 inches to more than 25. Still, the cubit was an important and common unit throughout ancient times, and was very helpful in building arks.
It may be that the “double cubit” morphed into the yard. King Henry I of England, who reigned from 1100 to 1135, attempted to standardize the yard by defining it as the length from the tip of his nose to the end of his thumb (with his arm stretched out). Eventually the yard became three feet, a foot being 12 inches, an inch defined as the length of three barley grains placed end-to-end lengthwise. So the anatomical unit became a derivative of a botanical unit.
9. Magna Carta, 1215
One of the most important documents in the history of government established the necessity of metrology for the future of civilization, insisting that “throughout the kingdom there shall be standard measures of wine, ale and corn,” and the same for weights as for measures. It didn’t exactly work out that way for the next few centuries, but the principle was clear enough, and subsequent metrologists have done a pretty good job of achieving the Magna Carta’s goal. Well, one of its goals. It had others.
8. Queen Elizabeth I reforms system of weights, 1588
While her fleet was busy destroying the Spanish Armada, Queen Elizabeth I of England was busy establishing more rational rules for weights and measures. Before then, English merchants dealt with a bunch of different kinds of pounds, today’s “avoirdupois” pound among them. One, the “tower” pound, was abolished by Henry VIII in 1527 in favor of the troy pound for use in currency (hence pounds are English currency still, even when made of paper).
Elizabeth established the standard avoirdupois pound for most uses, retaining the troy pound for coins (and drugs). In so doing she set people up for the clever trivia question: Which weighs more, a pound of gold or a pound of lead? Quick-witted respondents often answer “Ha! Neither! A pound is a pound.” But those sufficiently versed in metrology say “lead,” because an avoirdupois pound weighs more than a troy pound. (But if you say an ounce of lead weighs more than an ounce of gold, wrong again. A troy ounce of gold is heavier. An avoirdupois pound is heavier because it contains 16 ounces, but a troy pound has only 12 troy ounces.)
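The pound-of-gold puzzle comes down to simple arithmetic with today’s exact legal conversion factors (the gram values below are the modern defined equivalents, used here purely to illustrate the trivia question):

```python
# Comparing avoirdupois and troy weights, in grams, using the modern
# exact conversion values.
AVOIRDUPOIS_POUND_G = 453.59237               # 16 avoirdupois ounces
AVOIRDUPOIS_OUNCE_G = AVOIRDUPOIS_POUND_G / 16  # 28.349523125 g
TROY_OUNCE_G = 31.1034768                     # heavier than an avoirdupois ounce
TROY_POUND_G = 12 * TROY_OUNCE_G              # only 12 ounces: 373.2417216 g

# A pound of lead (avoirdupois) outweighs a pound of gold (troy) ...
assert AVOIRDUPOIS_POUND_G > TROY_POUND_G
# ... but an ounce of gold (troy) outweighs an ounce of lead (avoirdupois).
assert TROY_OUNCE_G > AVOIRDUPOIS_OUNCE_G
```

The avoirdupois pound wins only because it packs in 16 ounces to the troy pound’s 12, even though each of its ounces is lighter.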
7. Pendulum clock, Christiaan Huygens, 1656
Others (Galileo among them) had played around with pendulums as timepieces, but Dutch physicist and mathematician Christiaan Huygens built the first reliable pendulum clock. His earliest version, built in 1656, was accurate to about 15 seconds per day, a big improvement for the times. Further advances in pendulum clocks made them the most accurate timepieces until the 20th century.
6. Metric system, 1799
In the 17th century, certain prescient savants recognized that a decimal system of units would be vastly superior, for science and commerce, to the then-current hodgepodge of units that varied from country to country. Or even within the same country — in fact, some have suggested that one of the reasons for the French Revolution was popular dissatisfaction with the lack of uniformity in weights and measures.
In the 1670s, the French clergyman Gabriel Mouton and astronomer Jean Picard (middle name unknown, probably not Luc) both discussed basing the basic unit of length on the length of a pendulum with a period of 2 seconds. (That’s pretty close to today’s meter, but unfortunately a pendulum’s period varies from place to place on the Earth’s surface.) But in the 1790s, when the French got serious about establishing the metric system, they defined the meter to be one 10-millionth of the distance from the equator to the North Pole. Other units were then related to the meter — a gram (mass) being equal to the mass in a cubic centimeter of water, for instance.
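The seconds-pendulum proposal can be checked with the standard pendulum formula T = 2π√(L/g): solving for L gives the length of a pendulum with a 2-second period. A quick sketch (using standard gravity as an assumed value; the variation of g is exactly why this definition lost out):

```python
import math

# Length of a pendulum with a 2-second period, from T = 2*pi*sqrt(L/g),
# rearranged to L = g * (T / (2*pi))**2.
g = 9.80665   # m/s^2, standard gravity; actual g ranges from ~9.78 at the
              # equator to ~9.83 at the poles, which doomed this definition
T = 2.0       # period in seconds (one second per swing)

L = g * (T / (2 * math.pi)) ** 2
print(f"{L:.4f} m")   # about 0.994 m -- close to, but not exactly, a meter
```

A clock pendulum built to this length would tick at a slightly different rate in Paris than in Quito, so a unit defined this way would not have been the same everywhere.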
The metric system had its faults, but it made measurement much more rational and standard than it had previously been. Today only backwards countries (like Liberia, Myanmar and one other) don’t use its successor, the Système International.
5. Bureau International des Poids et Mesures (International Bureau of Weights and Measures) created, 1875
The Convention du Mètre in 1875 established the weights and measures bureau as arbiter for unit issues in a treaty signed by 17 countries (on May 20 — see Gibbs Rule 39). The treaty specified that the bureau, under the oversight of the Conférence Générale des Poids et Mesures (General Conference on Weights and Measures), should undertake production of standard prototypes for the meter and kilogram. It was an important step toward establishing worldwide use of the metric system, a goal almost successfully reached but for the intransigence of a certain large country in the middle of North America.
4. Kelvin temperature scale, 1848
Before the 19th century, temperature was a slippery concept — thermometers used arbitrary units that allowed a measurement to determine whether one thing was hotter than another, but not how much hotter. In 1848, William Thomson, later to become Lord Kelvin, proposed applying the principles of the new science of thermodynamics to devise a rational “absolute” temperature scale with a zero point corresponding to the complete absence of heat. It took a while for thermodynamics to mature before the scale could be completely understood, but it eventually put thermometry on a solid foundation. Temperature units are now called kelvins, rather than degrees (a change from “degrees Kelvin” made in the late 1960s by the Conférence Générale des Poids et Mesures).
3. Michelson interferometer, 1887
Albert A. Michelson was obsessed with measuring the velocity of light, and in the late 1870s, he measured it more precisely than anyone else ever had. Shortly thereafter he realized he could detect small differences in light’s velocity caused by the Earth’s motion through the ether. To do so he invented an interferometer. It split a light beam into two paths, perpendicular to each other, and then recombined the two beams with mirrors. A velocity difference along the two paths would throw the recombined light waves out of step, shifting the interference pattern. Michelson and his colleague Edward Morley tried the experiment in 1887, and it failed to detect the expected shift. But that’s because there’s no ether. Interferometry was a great idea anyway, and it became a valuable tool for all sorts of metrological uses.
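The shift Michelson and Morley were hunting for can be estimated with the classical ether prediction, roughly ΔN ≈ 2L(v/c)²/λ. The arm length and wavelength below are standard textbook figures for the 1887 apparatus, assumed here for illustration:

```python
# Expected fringe shift in the 1887 Michelson-Morley experiment, under the
# (wrong) ether hypothesis: dN ~ 2 * L * (v/c)**2 / wavelength.
c = 299_792_458        # speed of light, m/s
v = 30_000             # Earth's orbital speed, ~30 km/s
L = 11.0               # effective optical path length of each arm, m (assumed)
wavelength = 500e-9    # visible light, m (assumed)

expected_shift = 2 * L * (v / c) ** 2 / wavelength
print(f"{expected_shift:.2f} fringes")   # ~0.4 of a fringe
```

The apparatus could resolve shifts well below that, yet the measured shift was consistent with zero — the null result that helped clear the way for relativity.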
2. Lasers, 1960
The invention in 1960 of lasers made Michelson’s interferometry even more precise, thanks to lasing’s control over light’s wavelength. So lasers not only provided the realization of science fiction ray guns but also quickly became the best measuring devices in history. Lasers make ultraprecise distance measurements possible for everything from the dimensions of a room to how far it is from Earth to mirrors left on the moon by Apollo astronauts. Lasers enable the construction of optical clocks that are thousands of times more precise than Huygens’ pendulum clock. Laser metrology also helps verify that things like airplanes and automobile engines have been manufactured to exact design specifications. And where would baseball analytics be without the use of laser radar guns to check fastball velocities? (Just for fun, you can also use laser interferometry to detect gravitational waves.)
1. Basic units redefined, 2019
In 1983, the metrology czars redefined the meter in terms of how far light travels in a fixed fraction of a second. That was the first step toward the new definitions of other units, based on fundamental physics, adopted officially on May 20. The kelvin, for instance, is now defined via the Boltzmann constant, which relates temperature to energy (and so rests on the kilogram, meter and second). The kilogram is now based on the quantum physics quantity known as the Planck constant plus the definitions of the meter and second. The second is still based on radiation emitted by a specific process in the cesium-133 atom. Metrology is now not just standardized for all countries, but for all planets in all galaxies, no matter how far away.
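The new SI rests on constants whose numerical values are now fixed exactly, by definition rather than by measurement. A minimal sketch of how the units fall out of those fixed numbers:

```python
# The 2019 SI defining constants (these values are exact by definition):
c = 299_792_458              # speed of light, m/s      -> defines the meter
h = 6.626_070_15e-34         # Planck constant, J*s     -> defines the kilogram
k = 1.380_649e-23            # Boltzmann constant, J/K  -> defines the kelvin
delta_nu_cs = 9_192_631_770  # cesium-133 hyperfine frequency, Hz -> the second

# One meter is exactly the distance light travels in 1/299,792,458 of a second:
one_meter = c / 299_792_458
assert one_meter == 1.0

# One second is exactly 9,192,631,770 cycles of the cesium radiation:
assert delta_nu_cs / 9_192_631_770 == 1.0
```

Because the constants are fixed, any lab anywhere (or any planet) can, in principle, realize the units without reference to a physical artifact like the old platinum-iridium kilogram.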
That doesn’t render all uses of metrology immune to criticism, of course (think baseball analytics). And just remember, if you are tempted to criticize the omission of certain metrology items from this list, there is no way yet devised to precisely measure the proper ranking of metrology landmarks.
Follow me on Twitter: @tom_siegfried