Wanted: Better Yardsticks

Measurement inadequacies threaten U.S. competitive edge

Match this, Hollywood: Some physicists are gearing up to make the ultimate action flicks. Their stories will chronicle the lives and loves—from marriages through messy divorces—of individual molecules and atoms. The challenge is to distill sharp and compelling images of actors that shimmy about with blinding speed, changing place every few quadrillionths of a second.

PLASTICIZED. With mostly plastic parts, a new, durable terahertz-radiation imaging system (right) is only the size and weight of a laptop computer (left). Previously developed T-ray imagers are fragile and weigh at least 100 pounds. RPI
BRAIN CUBES. Each of the four corner images displays the distribution of a different protein across the brain slice shown in the background. Adapted from R. Smith/PNNL

It’s difficult—but not, theoretically, impossible.

A 1,000-frame short subject debuted a few months back. Directed by David M. Fritz of Stanford University and described in the Feb. 2 Science, the flick portrayed 0.008 nanosecond of activity of atoms within crystalline bismuth.

Of course, such movies aren’t being developed for the entertainment industry, but for science. “One of the grand goals of our research is to make a movie of a chemical reaction,” Fritz explains. When two chemicals meet, a cascade of transient but important events occurs before the final product emerges. Scientists need to measure the movements of atoms if they are to follow that action.

“To control reactions on the molecular scale,” Fritz says, “we need to know in detail what’s happening. And to do that, the ideal probe is a tool that can view atomic motion.”

Fritz is among researchers pushing the frontiers of science and engineering by developing new ways to measure things. Some of the novel yardsticks may be hardly recognizable as such. One, for example, might be a means to tally protein fragments across the brain to map when genes are turned on. From the mundane to the arcane, measurement systems serve as an engine of innovation, says a Feb. 12 report issued by the National Institute of Standards and Technology (NIST) in Gaithersburg, Md.

Although U.S. industry remains a global leader in product innovation, many nations are beginning to close the gap, according to a draft of a National Academy of Sciences report. Called Rising Above the Gathering Storm, it says that U.S. technological leadership could end—and relatively quickly—if the nation’s research and development infrastructure isn’t shored up substantially and soon.

Measurement tools are a pivotal part of that infrastructure, argues NIST director William Jeffrey. “To measure is to know,” he says. “And knowledge, whether aimed at unraveling a fundamental law of nature or ensuring that a manufactured part will fit into its assembly, is critical to continued technical progress, to innovation, and ultimately to the economic security of the nation.”

Big tally

In an attempt to identify measurement problems that could stifle innovation, NIST recently collaborated with industrial partners and others to survey the U.S. Measurement System. This amorphous network comprises all the organizations that develop or make measurement tools, set standards for things that must be measured, or identify new quantities that need measuring.

Although everyone depends on it, the measurement system is “usually unseen,” the NIST report observes. Not surprisingly, it adds, any failure of the system to meet new societal and technological needs may also escape attention.

Over the past 2 years, as part of its assessment, NIST convened 15 workshops, met with more than 500 industrial leaders in 11 segments of industry, and scrutinized 164 technology road maps. These road maps, prepared by industry, chart the paths to new products that would fulfill perceived or stated consumer wants. They also identify measurement gaps that risk slowing or derailing planned products.

Altogether, NIST turned up more than 700 distinct measurement deficiencies—missing yardsticks. Then, “each of these perceived [measurement] needs was validated by going back out into the private sector—or at least to people outside NIST,” notes Belinda L. Collins, director of technology services at the agency.

Most of the yardsticks that NIST identified as missing proved far more prosaic than techniques to measure the movements of dancing atoms. Nonetheless, their absence may have costly consequences.

One road map’s goal, for instance, was lighter-weight but more-protective suits for firefighters. Before municipalities invest in new gear, they need concrete evidence that it will do its job, observes Shyam Sunder, NIST’s acting director for building and fire research.

In this case, “methods do not currently exist” to adequately gauge how much heat firefighters’ suits absorb, the report concludes. Also lacking are ways to correlate the energy stored in a suit with its capacity to burn the wearer’s skin. Firefighters can sustain burns even when their protective suits show no degradation, Sunder says.

Measurement deficiencies could cost the $270 billion U.S. semiconductor industry its competitive edge. Within 2 years, computer-chip developers expect to produce integrated circuits having features a mere 10 nanometers across. However, there are currently no assembly-line tools that can examine features smaller than 32 nm, the NIST report found. Such tools would be needed to detect defects or contaminants in the automated manufacturing of chips.

A lack of variety in simple sensors hobbles other emerging technologies, such as the proton-exchange-membrane fuel cells (SN: 9/7/02, p. 155: Pocket Sockets) being developed to power cars. To operate efficiently, these fuel cells need sensors to measure and control their internal humidity. However, the sensors now available were developed for applications that take place in cool, dry environments. NIST found that in the fuel cells, these sensors are “error prone when exposed to water droplets commonly found near optimum operating points.”

Price tags

The cost of developing a new measurement tool can be modest—or it can run to half a billion dollars.

For instance, an engineer at Rensselaer Polytechnic Institute in Troy, N.Y., used ingenuity and about $100,000 to make terahertz-radiation (T-ray) imaging technology more versatile. He drew, however, on existing equipment worth about $2 million.

T rays have a frequency of about a trillion cycles per second—higher than that of standard microwaves but below that of infrared radiation. T rays can penetrate and measure anomalies in many dry, nonmetallic materials, such as wood, glass, plastic, and carbon fibers (SN: 8/26/95, p. 136). Until now, however, devices that produce those rays weighed at least 100 pounds and had delicate optics.
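As a rough point of reference (the conversion below is a standard back-of-the-envelope calculation, not a figure from the NIST report), a frequency of about one trillion cycles per second corresponds to a wavelength of roughly

\lambda = \frac{c}{f} \approx \frac{3 \times 10^{8}\ \mathrm{m/s}}{10^{12}\ \mathrm{Hz}} = 3 \times 10^{-4}\ \mathrm{m} = 0.3\ \mathrm{mm},

which places T rays between the millimeter-and-longer wavelengths of standard microwaves and the micrometer-scale wavelengths of infrared light.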

That engineer, Brian Shulkin, has developed a largely plastic alternative. It’s rugged, weighs just 4.5 pounds, and is the size of a laptop computer. In tests by NASA, Shulkin’s prototype spotted and measured hidden flaws in the foam used to insulate the space shuttle. The system’s portability makes it ideal for scanning massive objects such as the shuttle. Alternatively, the system could be mounted above an assembly line to scout for flaws in the silicon wafers used to make computer chips. A commercial version may debut this spring.

At the other end of the financial spectrum are massive projects whose users are unlikely to recoup their investments. The federal government often bankrolls such efforts. Fritz’s atomic-movie camera is one example.

Its simplest ingredient is a tabletop laser that excites the atoms in a bismuth crystal. The hard part of the project was producing and harnessing X-ray pulses brief enough to capture snapshots of the atoms’ dance steps. At the 2-mile-long Stanford Linear Accelerator, Fritz’s team installed a system that converts the accelerator’s brief but energetic electron bursts into X-ray pulses a mere 100 femtoseconds long.

The scientists labeled the snapshots with virtual time stamps indicating each X-ray pulse’s lag behind the laser pulse that set the atoms in motion. They then arranged the snapshots into the appropriate sequence to create a movie of the atomic dance. Bismuth atoms move more slowly than many others, so the researchers plan to refine their system to follow faster dancers.
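For readers who want a concrete picture of that last step, here is a minimal, hypothetical sketch in Python (with made-up data; it is not the Stanford team’s software) of how time-stamped snapshots could be sorted into a frame sequence:

# Hypothetical sketch, not the Stanford team's code: order pump-probe
# snapshots by their time stamps to assemble the frames of a movie.
# Each snapshot records how long after the laser pulse (the "pump" that
# sets atoms in motion) the X-ray pulse (the "probe") arrived.

from dataclasses import dataclass

@dataclass
class Snapshot:
    delay_fs: float  # lag of the X-ray probe behind the laser pump, in femtoseconds
    image: str       # placeholder for the image recorded on this shot

def assemble_movie(snapshots):
    """Return the images ordered by pump-probe delay, i.e., the movie frames."""
    return [s.image for s in sorted(snapshots, key=lambda s: s.delay_fs)]

# Toy example: three shots recorded out of order become a 3-frame sequence.
shots = [Snapshot(200.0, "frame_B"), Snapshot(0.0, "frame_A"), Snapshot(400.0, "frame_C")]
print(assemble_movie(shots))  # ['frame_A', 'frame_B', 'frame_C']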

The U.S. Department of Energy is already building a free-electron X-ray laser more than a mile long at the Stanford facility. It will both excite atoms and measure their activity. This will considerably reduce the arduous computational requirements of atomic moviemaking, Fritz explains. The $400 million system is slated to begin operation in 2009.

Challenges

The NIST report might prove sufficient to catalyze development of many missing yardsticks—especially those with moderate development costs. In some cases, however, the technology needed to overcome the measurement barriers doesn’t yet exist.

Jeffrey points out one example: proteomics.

Whereas the genome represents an organism’s complete set of genes, the proteome constitutes the entire suite of proteins produced by those genes. Although all cells within an organism possess the same genes, the sets of proteins produced by gene activity differ dramatically not only from tissue to tissue but even from hour to hour.

Protein anomalies can offer clues to disease. Jeffrey notes that the National Institutes of Health has identified the absence of a means to reliably measure proteins at “very, very low concentrations” as a barrier to innovation in health care.

However, moves are afoot to begin addressing the problem. One major NIH-funded project is under way at the Energy Department’s Pacific Northwest National Laboratory in Richland, Wash.

To correlate neurological disease with protein differences across the brain, scientists need a system to compare proteins from the tissues of a healthy animal against those from an animal with the disorder.

As a test of the concept, a team led by Richard D. Smith of the Richland lab, in collaboration with Desmond Smith of the University of California, Los Angeles, recently analyzed 1-millimeter cubes cut from brain slices from two normal mice. Proteins in just one of the animals had been labeled with a heavy isotope.

Cube by cube, “we broke each protein into pieces—peptide fragments. It’s those we measured,” Richard Smith explains. In the March Genome Research, he and his colleagues report what they say is the first quantitative comparison of the relative abundance of proteins—many with unknown functions—from different areas across an organ. More importantly, he notes, the method tallied the proteins that were in fact being made, not merely those that could be made by the cells’ genes.
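To make the bookkeeping concrete, here is a minimal, hypothetical sketch in Python (with invented numbers; it is not the PNNL team’s pipeline). It assumes the heavy-isotope-labeled brain acts as a common reference, so that each 1-millimeter cube yields paired signal intensities for the unlabeled (“light”) peptide and its labeled (“heavy”) counterpart; the light-to-heavy ratio then serves as a relative-abundance measure that can be compared across cubes:

# Hypothetical sketch, not the PNNL team's pipeline: compute per-cube
# relative abundances as light/heavy intensity ratios, assuming the
# isotope-labeled brain serves as a common reference in every cube.

def relative_abundance(cube_data):
    """cube_data maps cube id -> {peptide: (light_intensity, heavy_intensity)}.
    Returns cube id -> {peptide: light/heavy ratio}."""
    ratios = {}
    for cube_id, peptides in cube_data.items():
        ratios[cube_id] = {
            peptide: light / heavy
            for peptide, (light, heavy) in peptides.items()
            if heavy > 0  # skip peptides the labeled reference didn't cover
        }
    return ratios

# Toy example: one peptide measured in two cubes of the unlabeled brain.
measurements = {
    "cube_01": {"PEPTIDE_X": (4.0e5, 2.0e5)},  # twice the reference level
    "cube_02": {"PEPTIDE_X": (1.0e5, 2.0e5)},  # half the reference level
}
print(relative_abundance(measurements))
# {'cube_01': {'PEPTIDE_X': 2.0}, 'cube_02': {'PEPTIDE_X': 0.5}}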

The analyses took heroic processing. The team has since developed instrumentation that has speeded the process 25-fold. The next step, Richard Smith says, will be to apply the technique to tissues from normal and diseased animals—and potentially to far smaller cubes, to obtain finer spatial resolution of any protein differences.

Optimistic outlook

Jeffrey argues that “the health of the U.S. measurement system is actually fairly strong.” Nevertheless, the NIST report identified 723 measurement needs. “And this was by no means meant to be a complete survey,” he notes.

An important question is whether the agency’s findings represent unrelated deficiencies or whether many stem from some underlying problem and so could be addressed collectively. Once NIST understands that, Jeffrey says, the United States “can begin to focus our investments” to overcome the problems.

The good news, he says, is that “all of the feedback that we’ve gotten from industry has been, quite literally, very positive.”

Overall, Jeffrey says, through the new report and other efforts, “I think we’ve been making a lot of headway in educating people on the importance of measurements and standards. So, I’m very bullish.”

