Things Have History


Boole's algebra of thought


On a wet November afternoon in 1864, George Boole walked two miles from his home in Blackrock to deliver a lecture at Queen’s College, Cork. He arrived soaking, taught the class anyway, and came home feverish. His wife Mary, who believed that cures should resemble their causes, responded to his pneumonia by wrapping him in wet sheets and pouring buckets of cold water over him. He died nine days later. He was 49. The irony is peculiar: the man who had spent his working life arguing that reason operates by fixed, mechanical rules died in part because someone applied a rule too faithfully.

The Booles lived in Cork because in 1849, Queen’s College had done something unusual: it appointed a provincial schoolmaster — no university degree, no title, no institutional pedigree — as its first Professor of Mathematics. He had earned the post on the strength of two journal papers and the growing conviction among people like Augustus De Morgan that Boole was simply the most interesting mathematician working in England. He was 34. His father had been a cobbler in Lincoln; Boole himself had started teaching at 16, when the family ran out of money.

Five years after arriving in Cork, Boole published the work he had been building toward since his twenties: An Investigation of the Laws of Thought, 1854. The title sounds grandiose. The book earns it. Boole’s argument was this: deductive thought is a set of operations on classes — collections to which any given thing either belongs or does not — and those operations obey algebraic rules. Write x for white things, y for sheep, and xy means white sheep. Write 1 − x for everything that is not white. The symbols add, multiply, and cancel like ordinary algebra. What the Aristotelian tradition had catalogued as nineteen valid syllogism forms, Boole collapsed into a single rule-set capable of expressing any logical argument, however tangled.
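
The mechanics are easy to see in miniature. Below is a minimal sketch in modern Python, not anything Boole wrote: sets stand in for his classes, intersection for multiplication, set difference for 1 − x. The toy universe and its members are invented for illustration; only the white things and the sheep come from the passage above. The last line checks the law Boole singled out as fundamental, xx = x, which ordinary numbers satisfy only at 0 and 1.

    # A sketch of Boole's class algebra using Python sets.
    # The universe and its members are illustrative inventions;
    # only "white things" (x) and "sheep" (y) come from Boole.
    universe = {"white sheep", "black sheep", "white horse", "black horse"}
    x = {e for e in universe if e.startswith("white")}  # x: the white things
    y = {e for e in universe if e.endswith("sheep")}    # y: the sheep

    xy = x & y               # multiplication = intersection: the white sheep
    not_x = universe - x     # 1 - x: everything that is not white

    assert xy == {"white sheep"}
    assert x & x == x        # Boole's law xx = x, true of ordinary
                             # numbers only when x is 0 or 1

Any of Boole’s equations can be read this way: multiplication as intersection, 1 − x as complement, and the algebra’s identities become identities about classes.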

The larger claim was less about mathematics than about minds. Traditional logic said: here are the valid inference patterns — memorize them. Boole said: here is one algebra — apply it, and correct reasoning follows automatically. He was not describing thought as something mysterious or intuitive. He was describing it as a procedure.

His sister preserved a memory from his boyhood: Boole had always believed that logic could be made mathematical. When the method crystallized for him in 1847, she wrote, it hit him “literally like a man dazzled with excess of light” — so urgently that he published his first pamphlet, The Mathematical Analysis of Logic, in haste, almost accidentally, prompted by a public dispute between De Morgan and Sir William Hamilton. The 1854 book was the one he had always meant to write. He called it, in a letter, “the most valuable contribution I have made” (MacTutor History of Mathematics).

The payoff arrived ninety years after the dazzle. In 1937, a 21-year-old MIT graduate student named Claude Shannon noticed that Boole’s two values — 0 and 1, false and true — mapped exactly onto the open and closed states of an electrical relay. Every digital circuit in every computer built since runs on Boolean operations: AND, OR, NOT. The algebra Boole wrote to describe the human mind became the language in which machines were built to simulate it.
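
The correspondence can be sketched in a few lines of Python. This is an illustration of the mapping, not Shannon’s own notation: here 1 stands for a closed relay and 0 for an open one (the polarity is a convention), series wiring multiplies, parallel wiring adds and caps at 1, and Boole’s 1 − x flips a contact.

    # A sketch of the relay mapping: 1 = closed switch, 0 = open switch.
    def AND(a, b):            # two switches in series
        return a * b
    def OR(a, b):             # two switches in parallel
        return a + b - a * b
    def NOT(a):               # Boole's 1 - x
        return 1 - a

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))

Run over all four input pairs, the loop prints the familiar truth tables.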

What Boole called the laws of thought, we now call a processor.

Sources