Quarks are the smaller-than-a-proton particles without which there would be no stars, dogs, or breakfast burritos. In 1986, after a dozen frustrating years of trying to find ways of using computers to calculate properties of quark-containing entities such as protons and neutrons, Kenneth G. Wilson threw in the towel at a physics meeting. Wilson, who had already won a Nobel prize for previous work in another branch of physics, had been trying to make realistic predictions using the mathematically unwieldy theory of quark physics known as quantum chromodynamics (QCD). He had even invented a computational technique, called lattice QCD, to do just that. Bemoaning the dearth of computing power available at the time, however, he concluded that his approach just wasn’t worth pursuing.
Wilson declared the field “dead,” says physicist G. Peter Lepage of Cornell University.
Now, decades after Wilson devised the lattice-QCD technique, souped-up computer power and improved understanding of QCD theory are making him eat his words. At a meeting on lattice QCD in June, Wilson offered to “pay penance for claiming in ’86 that there was a long desert ahead.”
Researchers can now do some calculations with long-awaited accuracy. So promising are the results that theorists may soon for the first time make predictions that can be tested by experimentalists working at a large particle collider. In the future, theorists may also provide experimenters with values to use in determining whether their data fits the current understanding of the world.
Scientists look forward not only to filling in missing details of the known world of particle physics but also to perhaps finding unknown particles or processes. Such discoveries could answer grand questions about the universe, such as whether there are extra, unseen dimensions of space.
In the past year, a cadre of lattice-QCD theorists has demonstrated a way to dramatically boost the precision of specific calculations. Uncertainty has dropped from 20 or 30 percent to less than 3 percent. The difference is akin to knowing that someone lives on a certain floor of an apartment building versus recognizing only that he or she lives in the building.
“We seem to have turned a corner,” declares theorist Andreas S. Kronfeld of the Fermi National Accelerator Laboratory (Fermilab) in Batavia, Ill.
“The potential of this is astounding,” agrees experimentalist Ian P. Shipsey of Purdue University in West Lafayette, Ind., leader of a team using an accelerator at Cornell University to measure particle properties that will put the refined lattice QCD to the test.
Quantum chromodynamics is the theory of fundamental particles—quarks, gluons, and antimatter counterparts of quarks called antiquarks—that swarm inside protons, neutrons, and dozens of other subatomic particles (SN: 5/24/03, p. 333: New particles pose puzzle). That theory is complicated because the fundamental particles team up in many ways and exhibit some astounding behaviors.
Particles called baryons, such as protons and neutrons, each contain three quarks or antiquarks. Mesons, such as kaons and pions, on the other hand, each contain one quark and one antiquark. These arrangements of quarks and antiquarks—both of which scientists lump into the category of quarks—are held together by the gluons the particles exchange. Gluons convey a fundamental, mighty force of nature known as the strong force. In a meson, the members of a quark-antiquark pair tug at each other with a force of up to 14 tons, says theorist Michael Creutz of Brookhaven National Laboratory in Upton, N.Y.
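That 14-ton figure can be checked with a back-of-the-envelope conversion. The sketch below assumes a value for the QCD "string tension"—roughly 0.9 billion electron volts (GeV) per femtometer, a commonly quoted ballpark, not a number from this article—and converts it into metric tons-force:

```python
# Back-of-the-envelope check of the "14 tons" figure.
# Assumed input: a QCD string tension of about 0.9 GeV per femtometer.
GEV_IN_JOULES = 1.602e-10   # 1 GeV expressed in joules
FM_IN_METERS = 1e-15        # 1 femtometer in meters
STANDARD_GRAVITY = 9.81     # m/s^2, to convert newtons to kilograms-force

tension_gev_per_fm = 0.9    # assumed string tension

# Energy per unit length is a force: joules per meter = newtons.
force_newtons = tension_gev_per_fm * GEV_IN_JOULES / FM_IN_METERS
tons_force = force_newtons / STANDARD_GRAVITY / 1000.0

print(round(tons_force, 1))  # roughly 15 metric tons-force
```

The result lands within a ton of Creutz's figure, which is as close as a one-significant-digit estimate can be expected to get.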
To complicate matters, mesons don’t stay as they are. In the quantum world, particles of all kinds regularly morph into other sorts of particles and then switch back again, Lepage notes.
Furthermore, pairs of quarks and antiquarks randomly materialize out of the seeming emptiness of space, forming a sea of short-lived particles that interact with the other quarks and gluons already present in baryons and mesons. Of the six types of quarks—up, down, strange, charm, bottom, and top—the first three, which are the lightest ones, most readily appear in this seemingly magical way because their formation requires the least energy.
The equations of QCD theory describe this multitude of potential interactions and transformations of particles. But using the theory is, in Shipsey’s words, “fiendishly difficult.”
So forbidding have these computations been that the only hope for even modest success has been simplification. Physicists typically have distilled QCD theory into more-manageable models by approximating aspects of it (SN: 8/27/94, p. 140).
Lattice QCD explores the particle realm by taking a different tack. It simulates quark and gluon behaviors by applying the full QCD theory to a tiny, gridlike facsimile of the space-time in which particles actually interact (SN: 1/6/96, p. 5).
The trick is to specify a grid that is coarse enough to limit the time that calculations require but not so coarse that the results lose precision. The overall volumes of the biggest grids manageable now are on the scale of small atomic nuclei.
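The cost of that trade-off grows steeply, because the grid fills four-dimensional space-time. A toy calculation (illustrative only—the function and the numbers are assumptions, not real lattice-QCD parameters) shows why: halving the lattice spacing at fixed physical volume multiplies the number of grid points by 2 to the 4th power, or 16.

```python
def lattice_sites(extent_fm, spacing_fm, dims=4):
    """Grid points in a hypercubic space-time lattice of a given physical size."""
    points_per_side = round(extent_fm / spacing_fm)
    return points_per_side ** dims

# A 2-femtometer box -- roughly the scale of a small atomic nucleus.
for spacing in (0.2, 0.1, 0.05):   # lattice spacing in femtometers
    print(spacing, lattice_sites(extent_fm=2.0, spacing_fm=spacing))
```

Each refinement of the grid multiplies the site count sixteenfold, and the actual simulation cost climbs faster still, which is why the biggest manageable volumes remain so small.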
When Wilson invented lattice QCD in the early 1970s, he immediately scored a major success. Whereas physicists suspected that the strong force prevents quarks from roaming around alone—a restriction known as confinement—they could neither prove nor explain it.
Wilson, now at Ohio State University in Columbus, used a pencil-and-paper approach to lattice calculations in 1974 to investigate the then-nascent QCD theory, which had been invented by his former advisor, Murray Gell-Mann of the California Institute of Technology in Pasadena. Wilson’s mathematical results suggested that a cord of gluons tethered together quarks. In one stroke, his new lattice approach demonstrated that quark confinement arises naturally from QCD, a result that later computer simulations supported.
“I had no idea something like that would come out of it,” Wilson told Science News at a Fermilab meeting in June of some 300 lattice-QCD theorists.
Apparently, that valuable result was beginner’s luck. Over the next 30 years, few significant insights emerged from lattice QCD, although Creutz and his coworkers at Brookhaven had figured out by 1979 how to adapt Wilson’s approach to computers, giving birth to the simulations that define the field.
But computers just weren’t up to the job. So, theorists using lattice QCD made compromises, such as portraying the properties of particles in deliberately incorrect ways that were more computationally friendly. At first, they ignored the quarks and antiquarks that popped out of nowhere. “That’s a brutal mutilation that we had to do just in order to do anything,” says Matthew B. Wingate of the University of Washington in Seattle.
As computer power improved, researchers reintroduced those spontaneously generated quarks, but with artificially heavy masses. That way, the spontaneous quarks would roam around less than light ones would and have fewer encounters with other particles, Creutz explains. This also slashed the computer time required to compute the quarks’ interactions.
This year, lattice-QCD theorists appear to be picking up momentum. By modifying a trick devised in the late 1970s, these researchers can now include in their calculations spontaneous quarks that are almost as light as actual quarks.
In the Jan. 16 Physical Review Letters, Kronfeld, Wingate, and 24 of their colleagues presented the most accurate calculations ever of properties of quark-containing particles. The collaborators came up with values for nine quantities, most of which are related to particle decays and energy states of a variety of mesons and baryons. Those results differed by no more than 3 percent from values determined from experiments at particle accelerators.
At the Fermilab conference, Douglas Toussaint of the University of Arizona in Tucson and Christine T.H. Davies of the University of Glasgow in Scotland, who are coauthors of the January report, added to the list of precisely calculated properties the first mass of a particle containing three quarks. This particle was a short-lived baryon known as the omega-minus.
To include light quarks in their calculations, Davies and her colleagues distributed representations of these quarks over many points on the lattice. Theorists call that trick, devised to speed calculations, “staggering” the quarks. It’s as if the particles lurch from site to site, “like a drunk might stagger,” jokes Wingate.
However, staggering had a grave drawback: Simulation results changed as the lattice spacing changed. That’s a no-no, Davies says, because the laws of physics shouldn’t vary just because an observer looks more or less closely at some process.
In the late 1990s, tipped off by some work by Toussaint, Lepage found the cause of staggering’s sensitivity to lattice spacing. Each of the lattice sites occupied by a staggered quark was exchanging high-speed gluons with the other sites, a faux interaction that Lepage then minimized.
Finally exploiting the full, revved-up computing speed that staggering quarks allows, the researchers could work with quark masses lowered toward more realistic values. That, in turn, dramatically improved the accuracy of their simulations.
In the wake of the new results, Wilson, now only a spectator of lattice QCD, told hundreds of attendees at the June meeting at Fermilab that he had misjudged the field when he wrote it off in 1986. Now, it’s an exciting time for the people in lattice QCD, he says.
The proof is in the particles
If lattice QCD indeed is coming of age, then it’s doing so at a propitious moment. There’s an opportunity for the method to have valuable interactions with experiments that are under way.
Since the late 1990s, physicists have staged an unprecedented, international effort to determine properties of B mesons—those mesons that contain the second-heaviest type of quark, the bottom or b quark. Large particle accelerators in California and Japan are participating in that work (SN: 3/3/01, p. 143: Physicists get B in antimatter studies).
The ultimate goals of those “B factories” are to test aspects of the prevailing theory of particle physics, called the standard model, and to find evidence for new phenomena. Signs of previously unknown families of exotic particles or even extra dimensions of space could emerge from such investigations, notes Sheldon Stone of Syracuse University in New York, a leader of a new B-physics experiment planned for Fermilab.
“The study of B decays is really the study of new physics,” he says.
Ideally, physicists would study b quarks isolated from all other particles because it’s in the details of the disintegrations of those quarks themselves that researchers expect any signs of new phenomena to be most apparent. However, quarks are invariably confined within other particles, such as mesons.
So, physicists must turn to theory to distinguish those features due exclusively to the decay of the b quark from the elaborate dance of quarks, antiquarks, and gluons inside a decaying B meson. Once the scientists have mathematically isolated the breakdown of the quark, they might then discern new physical phenomena that influence the quark’s decay. Unfortunately, there has never been a computational technique that could do that job.
This past year’s precise lattice calculations indicate that such a method could be in the offing for B meson studies. Physicists could use lattice-QCD techniques to characterize the particle riot going on inside of the meson. They would then compare those values with values derived from actual measurements of the decay rates of B mesons at accelerator facilities. With these experimental and theoretical values in hand, physicists could then tease out a measure representing b-quark decay alone. This high-precision value could then be compared with predictions from the standard model.
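The logic of that division of labor can be sketched in a few lines. Everything below is schematic—the numbers and variable names are invented for illustration—but the structure is the point: the measured decay rate mixes the quark-level coupling that physicists want with a hadronic factor that lattice QCD can now supply.

```python
import math

# Schematic only: a meson decay rate is proportional to the square of the
# quark-level coupling times the square of a hadronic "form factor" that
# characterizes the particle riot inside the meson.
measured_rate = 4.0e-2      # hypothetical accelerator measurement (arbitrary units)
lattice_form_factor = 0.8   # hypothetical lattice-QCD calculation

# Divide out the hadronic factor to isolate the quark-level coupling,
# which can then be compared with the standard model's prediction.
coupling = math.sqrt(measured_rate) / lattice_form_factor
print(round(coupling, 3))
```

If the extracted coupling disagrees with the standard model's value, that mismatch is exactly the kind of signal described in the next paragraph.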
If the numbers don’t match, there would be reason to suspect that the accelerator measurements were reflecting some phenomenon that lies beyond the known world of subatomic particles.
Tantalizing as that plan might sound, it will take some doing to convince many physicists that this approach has merit. Several objections have been raised to the precision-lattice-QCD approach.
For example, says Thomas A. DeGrand of the University of Colorado in Boulder, in modifying the staggering trick of lattice QCD, the technique’s developers rely on a mathematical operation that may violate the local nature of space and time. If so, quark properties at one point in the lattice may depend on properties at other locations as well. “To measure the temperature at one point in a room, you place a thermometer at that point. You don’t need to look at other points,” DeGrand notes.
Davies asks for patience. Although “we haven’t quite answered all the questions, I’m quite confident that they will be [answered] in time,” she says.
Other researchers would rather rely on methods that would achieve precision without this controversial modification, DeGrand adds. However, unless other shortcuts are found, those calculations must wait for a hundredfold boost in computing speed.
What’s more, the recent apparent success of lattice QCD lacks punch, Creutz says, because the collaboration that generated it “chose the easiest stuff. There are a lot of hard problems out there that they didn’t try.”
“Eventually [lattice-QCD calculations] will be really useful,” says Eric S. Swanson of the University of Pittsburgh, a QCD theorist who doesn’t work with lattice simulations. “They [just] haven’t told us anything new yet,” he says.
In the meantime, lattice QCD has to restore its image. In the early 1980s, notes Kronfeld, some lattice-QCD researchers made predictions that didn’t pan out, leaving experimentalists with lingering suspicions regarding the method. “Some of us who are a little older didn’t believe we’d ever get an answer from these guys that we could believe,” says Stone, an experimentalist.
“For a long time, it was right to call lattice QCD the black sheep of high-energy physics,” adds Kronfeld.
That stigma may be removed by a new series of accelerator experiments slated to begin this fall at Cornell University. The effort is especially well suited for trying out the new lattice-QCD approach. Instead of B mesons, the Cornell physicists are investigating the properties of D mesons, which contain the charm, or c, quark.
Physicists have already pieced together a fairly accurate picture of crucial details of c-quark decays that remain obscure for b quarks. As a result, experimental physicists will use the Cornell results and theorists will use lattice-QCD techniques to independently determine the contributions from the particle commotion inside D mesons.
“There’s a wonderful opportunity here,” Shipsey says. If the theorists’ and experimentalists’ results end up agreeing, “we [will] have really cracked the code of the strong [force] interactions,” he adds.
Because the results of the experiments should be available by next summer, some lattice-QCD theorists are rushing to apply their new methods to D mesons and to publish predictions of decay rates and other parameters that the Cornell-based team will measure. “QCD has never been in a race with experiment before,” notes Lepage, “but now it is.”