There’s a new way to quantify structure and complexity

Analyzing information sharing among the parts of a system can help explain its behaviors on different scales

In searching for sense in the complexities of nature, science has often found success by identifying common aspects of diverse phenomena.

When one principle explains how different things behave, nature becomes more comprehensible, and more manageable. Modern science took flight when Newton showed how one idea – universal gravitation – explained both the motions of celestial bodies and apples falling on Earth. Most important, it didn’t matter whether the apple was red or green, or even if it was an apple. Newton’s law described how everything else fell, from bricks to bullets.

But Newton’s gravity, and his laws of motion, and the rest of science built on that foundation had limits. Newton’s science couldn’t cope with really strong gravity, extremely fast motion or supertiny particles. Relativity theory and quantum physics helped with that. But there remains a realm where standard science has struggled to find unifying principles among different behaviors. That would be the kingdom of complexity, the universe of systems that defy simplification.

Such complex systems are everywhere, of course. Some are physical — the electric power grid, for instance. Many are biological — brains, bodies, ecosystems. And others are social — financial markets, interlocking corporate directorates, and yes, for God’s sake, Twitter.

It’s hard to find simple scientific principles from which to deduce all the multifaceted things that such complex systems do. But there is one thing they all have in common: They all have structure. And it’s by quantifying structure, three scientists suggest in an intriguing new paper, that complexity can be tamed.

“Systems from different domains and contexts can share key structural properties, causing them to behave in similar ways,” write Benjamin Allen, Blake Stacey and Yaneer Bar-Yam.

Those structural properties depend on how a system’s parts are put together, and how some parts influence the actions of others. Describing all those relationships mathematically is the key to the complexity kingdom. But that is just the problem: Complex systems are too complex for the usual mathematical approaches.

“The set of system components for real-world physical, biological or social entities can be expected to be so intricate that, for all practical purposes, precise enumeration is ultimately intractable,” the scientists write. “Achieving a fundamental solution to this problem is critical for our ability to empower theoretical physics as a general approach to complex systems, and a practical solution is critical for our ability to address many real-world challenges.”

Many attempts at describing complexity mathematically have been made. Some have been moderately successful. Network math, for example, illuminates some systems by identifying links between the system’s parts (followers on Twitter, for instance, or being in a movie with Kevin Bacon). Cellular automata, shifting grids of white and black pixels, can reproduce many complex patterns found in nature by growing according to simple rules. Even the standard equations of calculus are useful in certain complex contexts. Statistical physics, for instance, applies standard math to averages of behavior in complex systems. Often this approach provides excellent insights, such as into how the changing relationships between parts of a material can suddenly alter its properties, say when a metal switches from magnetic to nonmagnetic or vice versa (an example of a phase transition).
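To give a flavor of how a cellular automaton works, here is a minimal sketch in Python (my own illustration, not anything from the new paper), using one of the simplest possible rules: a cell in a row of black and white cells turns black on the next step if exactly one of its two neighbors is black. From that one-line rule, an intricate nested-triangle pattern grows.

```python
# A minimal sketch of an elementary cellular automaton (Wolfram's rule 90).
# Each cell's next state is the XOR of its two neighbors: a one-line local rule
# that grows an intricate, nested Sierpinski-triangle pattern.

WIDTH, STEPS = 63, 32
row = [0] * WIDTH
row[WIDTH // 2] = 1  # start from a single black cell

for _ in range(STEPS):
    print("".join("#" if cell else " " for cell in row))
    row = [row[(i - 1) % WIDTH] ^ row[(i + 1) % WIDTH] for i in range(WIDTH)]
```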

But these approaches don’t always work. They’re useful for some systems but not others. And all methods so far suffer from an inability to quantify both complexity and structure meaningfully. By most measures, a box containing gas molecules bouncing around at random would be highly complex. But such a system totally lacks structure. A crystal, on the other hand, has a rigid, well-defined structure, but it is utterly simple in the sense that if you know the locations of a few atoms, you know the arrangement of the entire object.

In principle, information theory provides the math for quantifying relationships among the parts of a system. “An information measure indicates how many questions one needs answered to remove uncertainty about the system components under consideration,” Allen (of Harvard), Stacey (Brandeis University) and Bar-Yam (New England Complex Systems Institute) point out. But traditionally, information measures have ignored the scale at which a system’s behavior operates. Systems exhibit different behaviors on different scales. Only by incorporating scale can structure and complexity be properly accounted for, Allen, Stacey and Bar-Yam aver. An effective approach “requires an understanding of information theory in a multiscale context, a context that has not been developed in information theory nor in the statistical physics of phase transitions.”
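To make that “number of questions” idea concrete, here is a small Python sketch (my own illustration of standard Shannon information, not code from the paper): the entropy of a variable, measured in bits, is the average number of yes-or-no questions needed to pin down its value.

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy: the average number of yes/no questions needed
    to identify the outcome of a random variable with these probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))    # fair coin: 1.0 bit, i.e. one question
print(entropy_bits([1/8] * 8))     # eight equally likely outcomes: 3.0 bits
print(entropy_bits([0.99, 0.01]))  # a nearly certain outcome: ~0.08 bits
```

A box of randomly bouncing gas molecules maximizes a measure like this, while a nearly perfectly predictable crystal minimizes it, which is why a scale-blind information measure alone can’t capture structure.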

Scale considerations are also often absent in the network approach, which emphasizes pairwise relationships (links) between two parts of a system. Network math can be tweaked to accommodate how pairwise links influence behaviors at higher scales, but it misses relationships that are intrinsically large-scale to begin with.

Working out the math to take scale more fully into account is at the core of Allen, Stacey and Bar-Yam’s new approach to coping with structure.

They define structure as “the totality of relationships among all sets of components … in the system.” Those relationships determine whether information about one part of a system is pertinent to other parts of the system. In other words, structure reflects the sharing of information among a system’s parts. “Sharing” information just means that what one part is doing depends on what another is doing. It’s the scope of those dependencies that affects the scale of a system’s various behaviors.

When the parts of a system behave independently, they share little or no information, and behavior at small scales provides little information about the system as a whole. When the behavior of each of a system’s parts depends on its other parts, the system exhibits large-scale behavior, indicating a lot of shared information among those parts.

Given all these considerations, Allen, Stacey and Bar-Yam composed foundational axioms and then derived two quantities to summarize the structure of a system. One, called the complexity profile, tracks how much information is needed to describe a system at each scale: a system of mostly independent parts has lots of fine-scale information that drops off sharply at larger scales, while a highly interdependent system carries its information all the way up to the largest scales. The other, called marginal utility of information, quantifies how well a system can be described with a limited amount of information. Together these two measures avoid many of the problems and paradoxes that have afflicted previous approaches to quantifying complexity and structure.
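For the smallest possible case, a system of just two parts, those ideas can be sketched with familiar information-theory quantities. The toy Python code below is my simplified illustration, not the authors’ general formalism (which handles any number of parts): here I write C(1) for the total information needed to describe the pair at the finest scale, and C(2) for the information the two parts share, which is what survives at the largest scale.

```python
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

def toy_profile(joint):
    """Toy two-part complexity profile from a joint probability table joint[a][b]:
    C(1) = joint entropy H(A,B), the information needed to describe both parts;
    C(2) = mutual information I(A;B), the information shared across the whole pair."""
    pa = [sum(row) for row in joint]                 # marginal distribution of part A
    pb = [sum(col) for col in zip(*joint)]           # marginal distribution of part B
    c1 = entropy([p for row in joint for p in row])  # H(A,B)
    c2 = entropy(pa) + entropy(pb) - c1              # I(A;B)
    return c1, c2

independent = [[0.25, 0.25], [0.25, 0.25]]  # two fair coins flipped separately
locked      = [[0.5, 0.0], [0.0, 0.5]]      # two coins that always agree

print(toy_profile(independent))  # (2.0, 0.0): lots of fine-scale info, none shared
print(toy_profile(locked))       # (1.0, 1.0): less info overall, but it spans both parts
```

The independent pair has the most total information but nothing at the large scale; the locked pair has less to say overall, but what it says applies to the system as a whole — the trade-off the complexity profile is built to capture.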

“Already, we have found that our framework resolves key conceptual puzzles,” the scientists write.

They illustrate the usefulness of their approach by citing various real-world systems where their method can be applied. Genetic regulatory systems, for instance, exhibit lots of interdependence because the activity of one gene can produce a molecule that influences (either enhances or inhibits) the activity of another gene. Similarly, nerve cells in nervous systems fire signals that can encourage or inhibit the firing of other nerve cells. Financial markets can be viewed as systems of investors; in some cases the investors act mostly independently, at a small scale, but sometimes they engage in large-scale herd behavior. Allen, Stacey and Bar-Yam’s new measures can describe all of these scenarios. In all these cases, the key is structure — the relationships among the parts, without regard to any particular features of the parts themselves. 

It’s hard to say, of course, whether this new approach will go viral in the complexity world. But at first glance, at least, it seems a more sophisticated approach to at least one older issue in science: the problem of reductionism. Many scientists (and critics of science) have long clamored for a way to describe nature other than just by cutting it to pieces and studying how those pieces work in isolation.

“Over the past century, science has made enormous strides in understanding the fundamental building blocks of physics and biology,” Allen, Stacey and Bar-Yam write. “However, it is increasingly clear that understanding the behaviors of physical, biological and social systems requires more than a characterization of their constituent parts.”

It just may be that understanding how those parts compose structures by sharing information on different scales really will be the key to unlocking the mysteries of how those complex systems behave.

Follow me on Twitter: @tom_siegfried

Tom Siegfried is a contributing correspondent. He was editor in chief of Science News from 2007 to 2012 and managing editor from 2014 to 2017.
