Science past and present

Tom Siegfried



Happy Birthday to Boole, with 11001000 binary candles

The widespread use of the binary digits 1 and 0 in computer information processing is one of the legacies of investigations into the mathematical representation of logic by the English mathematician George Boole, who was born 200 years ago November 2.


It’s almost time to celebrate George Boole’s 200th birthday. Or maybe we should call it Birthday No. 11001000.

You might have a hard time converting that binary number to decimal in your head. But it would be a snap for computers, which store numbers and manipulate them exclusively as 1s and 0s. Of course, everybody knows that. But not so many know that you have Boole to blame. Or thank.
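For the record, here's how that conversion goes. This is a minimal sketch in Python (my choice of illustration, obviously not anything from Boole's era) that tallies the bits of 11001000:

```python
# Convert the binary numeral 11001000 to decimal, one bit at a time.
binary = "11001000"

total = 0
for digit in binary:
    total = total * 2 + int(digit)  # shift the running value left, add the new bit

print(total)               # 200: Boole's birthday number in decimal
print(int("11001000", 2))  # Python's built-in conversion agrees
```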

Boole was the mathematician with a vision for transforming logical thinking into mathematical formulas. His book An Investigation of the Laws of Thought, published in 1854, established him as one of the most creative mathematical thinkers of his century. The mathematician-philosopher Bertrand Russell, in fact, considered Boole’s book to be “the work in which pure mathematics was discovered.”

Boole started young. Born on November 2, 1815, in Lincoln, England, he grew up with little formal education but an abundance of intelligence. During his youth he spent a lot of time in libraries, mastering everything he could about math. Before he turned 20 he was widely known in Lincolnshire as an expert on Newton. Soon Boole began publishing papers in math journals; at age 34, he got a job as math professor at Queen’s College, Cork, in Ireland, even though he had never earned a degree in math, or in anything else.

Boole’s most creative idea was to express true and false propositions by numbers, and then crunch the numbers algebraically to distinguish correct from incorrect deductions. In his famous book he worked out the details of his ambition to merge logic with mathematics.

Boole’s book is not light reading. He explains his reasoning in rather tedious detail. But that reasoning led him to perceive the role that binary numbers — numbers expressed using only the two digits 0 and 1 — could play in representing human thought algebraically.

He began with the notion that various concepts could be represented mathematically by algebraic symbols. For instance, x could represent “all men.” Then y could represent “all white men.” To refer to all nonwhite men, then, you could use the expression x minus y.

Various expressions representing other concepts could be mathematically combined to reach irrefutable logical conclusions, Boole believed.

He expended considerable intellectual effort to formulate the rules for making such computations. One rule, for instance, designated the meaning of multiplication: multiplying two symbols yielded the class of things that fit the definitions of both symbols.
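In modern terms, that multiplication rule is just set intersection. Here is a toy sketch in Python (the particular names in the sets are my own invention, purely for illustration):

```python
# Boole's "multiplication" of class symbols amounts to set intersection.
men = {"Alfred", "Bertrand", "Charles"}          # x: the class of men
white_things = {"Alfred", "Bertrand", "chalk"}   # y: the class of white things

white_men = men & white_things  # x times y: everything in both classes
print(white_men)                # {'Alfred', 'Bertrand'} (order may vary)
```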

So assigning x to all men and y to all white things implies that x times y equals all white men. That seems reasonable enough. But Boole was astute enough to realize that his multiplication rule could pose problems. If you multiplied x times x, for instance — all men times all men — the answer would be “all men.” In other words, x times x, or x squared, equaled x. At first glance, that appears to be a logical inconsistency, an unsatisfying result when your whole ambition is a foolproof mathematical system for doing logic. But Boole thought more deeply and saw that sometimes x squared really does equal x. In fact, x squared equals x for exactly two numbers: 1 and 0. Voilà! Computing logical relationships algebraically — the sort of thing computers do today — works just fine if you use the binary digits 1 and 0 in the calculations.
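You can check that insight directly. A quick sketch, again in Python:

```python
# Boole's "law of duality": x squared equals x has exactly two solutions.
for x in range(-3, 4):
    print(x, x * x, x * x == x)  # True only for 0 and 1

# Rewriting x*x = x as x*(x - 1) = 0 shows why: a product is zero
# only when one of its factors is, so x must be 0 or 1.

# And restricted to {0, 1}, multiplication behaves exactly like logical AND:
for x in (0, 1):
    for y in (0, 1):
        assert (x * y) == (1 if (x == 1 and y == 1) else 0)
```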

Boole developed from this insight the system of “Boolean logic” that (with the embellishments of many mathematicians who followed him) underlies the logic gates used in modern computer chips, relying on the “truth values” of 1 for true and 0 for false.
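One way to see the connection is to write the basic gates as arithmetic on the truth values 1 and 0. This is a sketch of the idea, not how any particular chip implements it:

```python
# Logic gates expressed as ordinary arithmetic on the truth values 0 and 1.
def NOT(x):    return 1 - x          # flips true to false and back
def AND(x, y): return x * y          # Boole's multiplication
def OR(x, y):  return x + y - x * y  # 1 whenever either input is 1

# Print the truth table; every output stays inside {0, 1}.
print("x y | AND OR NOT(x)")
for x in (0, 1):
    for y in (0, 1):
        print(x, y, "|", AND(x, y), " ", OR(x, y), " ", NOT(x))
```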

Besides his work on logic, Boole wrote extensively on the (not unrelated) mathematics of probability. He eventually wanted to combine his insights into a more fully developed philosophy based on a deep understanding of how math and logic captured human thought. 

Boole’s most ambitious goals were never completely realized, of course. Computers now are sometimes portrayed as “thinking machines” that embody Boole’s logical principles. But even today’s best computers can’t outthink the best humans (although computers are faster, and certainly beat the hell out of the worst humans).

Besides that, there is the problem that logic, too rigidly applied, can lead to tragedy. Boole, for instance, died young, at age 49, after walking two miles through a rainstorm and then lecturing while still soaking wet. Pneumonia followed. The bedridden Boole might still have survived. But his wife, applying the logical principle that a remedy resembles its cause, doused his bed with bucketfuls of water. (At least that’s the legend.) Consequently, Boole did not live to see 110010.

Follow me on Twitter: @tom_siegfried
