Science past and present

Tom Siegfried



Happy Birthday to Boole, with 11001000 binary candles

The widespread use of the binary digits 1 and 0 in computer information processing is one of the legacies of investigations into the mathematical representation of logic by the English mathematician George Boole, who was born 200 years ago, on November 2.

It’s almost time to celebrate George Boole’s 200th birthday. Or maybe we should call it Birthday No. 11001000.

You might have a hard time converting that binary number to decimal in your head. But it would be a snap for computers, which store numbers and manipulate them exclusively as 1s and 0s. Of course, everybody knows that. But not so many know that you have Boole to blame. Or thank.
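
About that conversion: each binary digit stands for a power of 2, so the sum is easy to check, even by hand. A few lines of Python (offered purely as an illustration) do the arithmetic:

```python
# Boole's 200th birthday in binary: each digit stands for a power of 2.
# Reading left to right: 1*128 + 1*64 + 0*32 + 0*16 + 1*8 + 0*4 + 0*2 + 0*1 = 200
birthday = "11001000"
decimal = sum(int(bit) * 2**i for i, bit in enumerate(reversed(birthday)))
print(decimal)           # 200
print(int(birthday, 2))  # Python's built-in base-2 conversion agrees: 200
```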

Boole was the mathematician with a vision for transforming logical thinking into mathematical formulas. His book An Investigation of the Laws of Thought, published in 1854, established him as one of the most creative mathematical thinkers of his century. The mathematician-philosopher Bertrand Russell, in fact, considered Boole’s book to be “the work in which pure mathematics was discovered.”

Boole started young. Born on November 2, 1815, in Lincoln, England, he grew up with little formal education but an abundance of intelligence. During his youth he spent a lot of time in libraries, mastering everything he could about math. Before he turned 20 he was widely known in Lincolnshire as an expert on Newton. Soon Boole began publishing papers in math journals; at age 34, he was appointed professor of mathematics at Queen’s College, Cork, in Ireland, even though he had never earned a degree in math, or in anything else.

Boole’s most creative idea was to express true and false propositions by numbers, and then crunch the numbers algebraically to distinguish correct from incorrect deductions. In his famous book he worked out the details of his ambition to merge logic with mathematics.

Boole’s book is not light reading. He explains his reasoning in rather tedious detail. But that reasoning was powerful enough to reveal the role that binary numbers — numbers expressed using only the two digits 0 and 1 — could play in representing human thought algebraically.

He began with the notion that various concepts could be represented mathematically by algebraic symbols. For instance, x could represent “all men.” Then y could represent “all white things.” To refer to all nonwhite men, then, you could use the expression x minus y.
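
In modern terms, Boole’s symbols behave much like sets, and his subtraction behaves like set difference. A toy sketch in Python (the set members here are invented for illustration, not drawn from Boole’s book):

```python
# Boole's "x minus y" rendered as set difference (the set members are invented).
men = {"George", "Alan", "Bertrand"}
white_things = {"snow", "milk", "George"}

nonwhite_men = men - white_things  # x minus y: the men who are not white things
print(nonwhite_men)                # {'Alan', 'Bertrand'}
```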

Various expressions representing other concepts could be mathematically combined to reach irrefutable logical conclusions, Boole believed.

He expended considerable intellectual effort to formulate the rules for making such computations. One rule, for instance, defined the meaning of multiplication: multiplying two symbols yielded all the members of the group that met the definitions of both symbols.

So assigning x to all men and y to white things implies that x times y equals all white men. That seems reasonable enough. But Boole was astute enough to realize that his multiplication rule could pose problems. If you multiplied x times x, for instance — all men times all men — the answer would be “all men.” In other words, x times x, or x squared, equaled x. At first glance, that appears to be a logical inconsistency, which is unsatisfactory when your whole aim is a foolproof mathematical system for doing logic. But Boole thought more deeply and saw that sometimes x squared does equal x. In fact, x squared equals x for exactly two numbers: 1 and 0. Voilà! Computing logical relationships algebraically — the sort of thing computers do today — works just fine if you use the binary numbers 1 and 0 in the calculations.
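
In the same toy set reading (again an illustration, not Boole’s own notation), multiplication becomes set intersection, and the curious x-times-x behavior drops right out:

```python
# Boole's multiplication rendered as set intersection (same invented sets).
men = {"George", "Alan", "Bertrand"}
white_things = {"snow", "milk", "George"}

print(men & white_things)  # x times y: the white men -> {'George'}
print((men & men) == men)  # True: x times x gives back x, just as Boole saw
```

As ordinary algebra, his observation is a one-line derivation: x squared = x rearranges to x(x − 1) = 0, and only x = 0 and x = 1 satisfy that.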

Boole developed from this insight the system of “Boolean logic” that (with the embellishments of many mathematicians who followed him) underlies the logic gates used in modern computer chips, relying on the “truth values” of 1 for true and 0 for false.
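
Those truth values still do arithmetic under the hood. Here is a rough sketch of the logic gates as Boolean arithmetic, with AND as multiplication and NOT as subtraction from 1 (the inclusive-OR formula shown is one of those later embellishments rather than Boole’s original addition):

```python
# Logic gates as arithmetic on the truth values 1 (true) and 0 (false).
def AND(x, y): return x * y          # 1 only when both inputs are 1
def OR(x, y):  return x + y - x * y  # inclusive or, one of the later embellishments
def NOT(x):    return 1 - x          # flips 1 to 0 and 0 to 1

for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y}  AND={AND(x, y)}  OR={OR(x, y)}  NOT x={NOT(x)}")
```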

Besides his work on logic, Boole wrote extensively on the (not unrelated) mathematics of probability. He eventually wanted to combine his insights into a more fully developed philosophy based on a deep understanding of how math and logic captured human thought. 

Boole’s most ambitious goals were never completely realized, of course. Computers now are sometimes portrayed as “thinking machines” that embody Boole’s logical principles. But even today’s best computers can’t outthink the best humans (although computers are faster, and certainly beat the hell out of the worst humans).

Besides that, there is the problem that logic, too rigidly applied, can lead to tragedy. Boole, for instance, died young, at age 49, after walking two miles through a rainstorm and then lecturing in his wet clothes. Pneumonia followed. Perhaps the bedridden Boole might have survived. But his wife, applying the logical principle that a remedy resembles its cause, doused his bed with bucketfuls of water. (At least that’s the legend.) Consequently, Boole did not live to see 110010.

Follow me on Twitter: @tom_siegfried

Science & Society, Numbers
