BOOK REVIEW: Turing’s Cathedral: The Origins of the Digital Universe by George Dyson

Review by Tom Siegfried

Computers are mathematically pretty powerful, considering the only digits they use are 0 and 1. That power, of course, stems from binary digital logic, dimly foreseen by Francis Bacon four centuries ago and articulated more clearly by Leibniz several decades later. But the modern computer’s ability to exploit that power grew from the mathematical imagination of Alan Turing (SN: 6/30/12, p. 26) in work appearing a few years before World War II.

Dyson’s book dives deeply into the postwar development of Turing’s ideas under the direction of John von Neumann at the Institute for Advanced Study in Princeton, N.J. The institute’s computer, MANIAC, was not the first all-purpose digital electronic computer (that was ENIAC, at the University of Pennsylvania), but in Dyson’s telling it was the most influential. MANIAC stored binary numbers representing data in the same memory as those representing instructions (the program’s order codes), a design that persists to the present day. “The entire digital universe, from an iPhone to the Internet, can be viewed as an attempt to maintain everything, from the point of view of the order codes, exactly as it was when they first came into existence, in 1951,” in Princeton, Dyson writes.
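
For readers who want to see that stored-program idea in miniature, here is a brief Python sketch. It is purely illustrative: the opcodes, word format and single-accumulator design are invented for the example and are not MANIAC’s actual order codes. What it shows is the essential point the review describes: instructions and data sit in one memory as indistinguishable numbers, interpreted one way or the other only when the machine reads them.

    def run(mem):
        """Toy one-accumulator machine; instructions and data share `mem`."""
        acc, pc = 0, 0
        while True:
            op, addr = divmod(mem[pc], 1000)  # one word holds opcode and address
            pc += 1
            if op == 0:            # HALT: stop and return the accumulator
                return acc
            elif op == 1:          # LOAD: acc = mem[addr]
                acc = mem[addr]
            elif op == 2:          # ADD: acc += mem[addr]
                acc += mem[addr]
            elif op == 3:          # STORE: mem[addr] = acc
                mem[addr] = acc

    # Cells 0-3 hold the program, cells 4-5 the data, cell 6 the result;
    # all are just numbers until the machine interprets them.
    memory = [1004,  # LOAD  4
              2005,  # ADD   5
              3006,  # STORE 6
              0,     # HALT
              2, 3,  # data: the two addends
              0]     # result written here
    run(memory)
    print(memory[6])  # prints 5

Because the program lives in the same memory it operates on, a program can in principle modify its own instructions, the property that makes the design so flexible.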

His book is full of insights into the evolution of computing in the modern era, as well as historical detail that sometimes seems excessive (it’s hard to say why you’d need to know the starting salaries of all the engineers who worked on the project, for instance). But for those interested in the history of technology placed in a broader social context — from the use of the new computers to design hydrogen bombs to the computer’s implications for understanding the nature of life — Dyson’s book provides ample helpings of substance.

Pantheon, 2012, 401 p., $29.95
