Randomness, Risk, and Financial Markets

Pi, the ratio of a circle’s circumference to its diameter, is known as an irrational number because it can’t be exactly expressed as a ratio of whole numbers. It would take an infinite number of digits to write it out in full as a decimal or, in binary form, as a string of 1s and 0s. The square root of 2, the square root of 3, and the constant e (the base of the natural logarithms) fall into the same category.

The known digits of these numbers appear patternless. According to one novel method of assessing the randomness of a sequence of numbers, however, the digits of pi turn out to be somewhat more irregular than the digits of the other irrational numbers.

The measure used to determine the irregularity, or degree of disorder (entropy), of these sequences is called approximate entropy. Invented by Steve Pincus of Guilford, Conn., and developed in cooperation with Burton H. Singer of Princeton University, it assigns a grade of randomness to any sequence of numbers.

Suppose the data are expressed as a string of binary digits. The idea is to determine how often each of eight blocks of three consecutive digits—000, 001, 010, 011, 100, 101, 110, and 111—comes up in a given string.
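To make the counting step concrete, here is a minimal Python sketch (not from the article); the short `bits` string is a stand-in, since the 280,000-digit expansions themselves aren't reproduced here:

```python
from collections import Counter

def block_counts(bits, block_len=3):
    """Tally the overlapping blocks of `block_len` consecutive
    digits in a binary string: '000', '001', ..., '111'."""
    counts = Counter(bits[i:i + block_len]
                     for i in range(len(bits) - block_len + 1))
    labels = [format(b, f"0{block_len}b") for b in range(2 ** block_len)]
    # Report every possible block, including any that never occurs.
    return {label: counts.get(label, 0) for label in labels}

# A short stand-in string; the actual comparison used the first
# 280,000 binary digits of pi.
print(block_counts("0110101000100111"))
```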

Given the first 280,000 binary digits of pi, the most frequently occurring block is 000, which appears 35,035 times, and the least common block is 111, which appears 34,944 times. The maximum possible irregularity occurs when all eight blocks appear equally often.

For the square root of 3, the block 000 occurs most often (35,374 times) and 010 (34,615) least often. The greater divergence from exactly 35,000 occurrences means that the first 280,000 digits of root 3 are farther from maximum irregularity than the digits of pi.

The formula for approximate entropy developed by Pincus takes such data about a sequence of numbers, whatever its source, and assigns a single number to the sequence. Larger values correspond to greater apparent serial randomness or irregularity, and smaller values correspond to more instances of recognizable features in the data. Overall, approximate entropy grades a continuum that ranges from totally ordered to maximally irregular (or completely random).
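The article doesn't spell out the formula, but Pincus's published definition compares how often patterns of length m recur with how often patterns of length m + 1 do: the more the longer patterns fail to recur as reliably as the shorter ones, the higher the entropy. Here is a deliberately simple (and slow) Python sketch of that definition; the parameter choices are illustrative, and for binary digits any tolerance r below 1 amounts to exact matching:

```python
import math

def apen(series, m=2, r=0.5):
    """Approximate entropy ApEn(m, r), following the standard
    published definition: ApEn = Phi(m) - Phi(m + 1), where Phi(k)
    is the average log-frequency with which length-k patterns recur."""
    n = len(series)

    def phi(k):
        # All overlapping templates of length k.
        templates = [series[i:i + k] for i in range(n - k + 1)]
        total = 0.0
        for a in templates:
            # Fraction of templates within tolerance r of `a` in the
            # max norm; each template matches itself, so this is never 0.
            matches = sum(1 for b in templates
                          if max(abs(x - y) for x, y in zip(a, b)) <= r)
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

# For binary digits, any tolerance r < 1 amounts to exact block
# matching, so ApEn in effect compares the 2-block and 3-block counts.
digits = [0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 1]
print(apen(digits, m=2, r=0.5))
```

This pairwise version is quadratic in the sequence length; a run over 280,000 digits would in practice use the block-counting shortcut sketched earlier.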

Putting the four irrationals in order, starting with the most irregular, gives pi, root 2, e, and root 3. That’s a curious, unexpected result. Irrational numbers such as root 2 and root 3 are known as algebraic numbers because they are solutions of polynomial equations with integer coefficients. Others, such as pi and e, which satisfy no such equation, are known as nonalgebraic, or transcendental, numbers. Mathematicians had regarded algebraic numbers as, in some sense, simpler than transcendental numbers. But, according to approximate entropy, this distinction doesn’t show up in the irregularity of the digits. Whether such quirks in the irregularity of irrationals have any implications for number theory remains an open question for mathematicians.

Because the approximate entropy method does not depend on any assumptions about the process involved in generating a sequence of numbers, it can be applied to biological, medical, or financial data and to physical measurements, such as the number of alpha particles emitted by a radioactive element in specified time intervals, as readily as to the digits of irrational numbers.

For example, Pincus has looked at stock market performance, as measured by Standard & Poor’s index of 500 stocks. His calculations show that fluctuations in the index’s value are generally quite far from being completely irregular, or random.

One striking exception occurred during the 2-week period immediately preceding the stock market crash of 1987, when the approximate entropy indicated nearly complete irregularity. That change flagged the incipient collapse.
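The article gives neither Pincus’s window length nor his data, but the flagging idea can be sketched: code each trading day as, say, 1 for a rise and 0 for a fall, slide a window along the series, and watch for a jump in approximate entropy. Here is a hypothetical sketch reusing the `apen` function above; the up/down coding and the window size are assumptions for illustration, not Pincus’s actual procedure:

```python
def rolling_apen(values, window=12, m=2, r=0.5):
    """ApEn over a sliding window: one way to watch a series for a
    sudden move toward full irregularity."""
    return [apen(values[i:i + window], m=m, r=r)
            for i in range(len(values) - window + 1)]

# Hypothetical toy series: 1 = the index rose that day, 0 = it fell.
# The regular early stretch should grade as more ordered than the tail.
moves = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0,
         1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0]
print(rolling_apen(moves, window=12))
```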

Now, Pincus and Rudolf E. Kalman of the Swiss Federal Institute of Technology in Zurich have applied approximate entropy to the analysis of a wide range of other financial data. They describe their findings in the Sept. 21 Proceedings of the National Academy of Sciences.

Approximate entropy “appears to be a potentially useful marker of system stability, with rapid increases possibly foreshadowing significant changes in a financial variable,” Pincus and Kalman contend.

To provide another example of such foreshadowing, Pincus and Kalman examined fluctuations in Hong Kong’s Hang Seng index from 1992 to 1998. In this case, the approximate entropy value rose sharply to its highest observed value immediately before this market crashed in October 1997.

Pincus and Kalman also show the usefulness of approximate entropy in characterizing volatility. Volatility is normally understood as the size of asset price fluctuations. A market with large swings in price is generally considered highly volatile and, hence, unpredictable. Pincus and Kalman argue that large fluctuations are not necessarily the same thing as unpredictability.

“The point is that the extent of variation is generally not feared; rather, unpredictability is the concern,” Pincus and Kalman say. “Recast, if an investor were assured that future prices would follow a precise sinusoidal pattern, even with large amplitude, this perfectly smooth roller coaster ride would not be frightening.”

Standard deviation remains the appropriate tool for characterizing deviations from centrality, the researchers say, and approximate entropy might well be the appropriate tool for grading the extent of irregularity (and unpredictability).
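That division of labor is easy to demonstrate: a large-amplitude sine wave has a big standard deviation but, being perfectly predictable, a low approximate entropy, while small random jitter shows the reverse. A quick check, again reusing the `apen` sketch above (setting the tolerance to a fraction of each series’ own standard deviation is a common convention, not a choice reported in the paper):

```python
import math
import random
import statistics

random.seed(1)
n = 120
# Large but perfectly predictable swings: the "smooth roller coaster."
sine = [10.0 * math.sin(2 * math.pi * i / 20) for i in range(n)]
# Small but unpredictable jitter.
noise = [random.uniform(-1.0, 1.0) for _ in range(n)]

for name, series in [("sine", sine), ("noise", noise)]:
    sd = statistics.pstdev(series)
    # Tolerance scaled to the series' own spread, so the two signals
    # are graded on comparable terms despite their different sizes.
    print(name, "std dev:", round(sd, 2),
          "ApEn:", round(apen(series, m=2, r=0.2 * sd), 3))
```

The sine wave scores the larger standard deviation; the noise scores the larger approximate entropy.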

Use of approximate entropy to characterize disorder in financial time series data also suggests that random walks and related models don’t generally fit the actual behavior of markets. There’s often more order or structure in the data than such models allow.

“Independent of whether one chooses technical analysis, fundamental analysis, or model building, a technology to directly quantify subtle changes in serial structure has considerable real-world utility, allowing an edge to be gained,” Pincus and Kalman conclude. “And this applies whether the market is driven by earnings or by perceptions, for both short- and long-term investments.”

Puzzle of the Week

A jug contains 4 quarts of milk. The milk must be divided equally between two friends. But the only containers available are two empty bottles, one of which holds 2 1/2 quarts and the other 1 1/2 quarts. Using the jug and both bottles, how can the milk be divided equally between the two friends?

For the answer, go to http://www.sciencenewsforkids.org/articles/20031105/PuzzleZone.asp.
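Readers who would rather search for the answer than peek can treat the puzzle as a path-finding problem over states (jug, large bottle, small bottle). Here is a minimal breadth-first-search sketch in Python, working in half-quarts so the arithmetic stays exact:

```python
from collections import deque

# Capacities in half-quarts: jug = 4 qt, bottles = 2 1/2 qt and 1 1/2 qt.
CAPS = (8, 5, 3)
START = (8, 0, 0)            # all the milk begins in the jug

def solved(state):
    # Two quarts (4 half-quarts) in each of two containers.
    return sorted(state) == [0, 4, 4]

def pours(state):
    # Every legal pour from one container into another: pour until
    # the source is empty or the destination is full.
    for src in range(3):
        for dst in range(3):
            if src == dst:
                continue
            amount = min(state[src], CAPS[dst] - state[dst])
            if amount > 0:
                nxt = list(state)
                nxt[src] -= amount
                nxt[dst] += amount
                yield tuple(nxt)

def solve():
    # Breadth-first search guarantees a shortest pouring sequence.
    queue = deque([(START, [START])])
    seen = {START}
    while queue:
        state, path = queue.popleft()
        if solved(state):
            return path
        for nxt in pours(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))

for state in solve():
    print(tuple(x / 2 for x in state))   # print each step in quarts
```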
