Life in the real world is complicated. It’s much simpler on the computer. As the Game of Life begins, the screen is filled with a vast latticework of squares, only a few of them filled in. The magic is in the algorithms, which determine, on the basis of the current pattern, what will happen next. As time ticks on, whether any given square will be vacant or occupied, dead or alive, depends on its present state, as well as the states of its nearest neighbors, and possibly of their neighbors, and of their neighbors twice or three times removed. Change the first pattern, rewrite, delete, or add an algorithmic rule, and the pattern may grow unbounded and crenellated or recede to a tiny, moving archipelago, or evolve only to cycle back to its initial configuration, so that it can start over again. The wheel spins on.
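Conway’s rule is simple enough to state in a breath: a live square with two or three live neighbors survives to the next tick; a dead square with exactly three live neighbors comes alive; every other square dies or stays empty. Here is a minimal sketch in Python, assuming the board is tracked as a set of occupied squares:

```python
from collections import Counter

def step(live):
    # `live` is the set of occupied (row, col) squares.
    # Count, for every square on or next to the pattern,
    # how many live neighbors it has.
    counts = Counter(
        (r + dr, c + dc)
        for r, c in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # A live square with two or three live neighbors survives;
    # an empty square with exactly three is born; all else dies.
    return {sq for sq, n in counts.items() if n == 3 or (n == 2 and sq in live)}

# The "glider," one of the game's famous movers: after four ticks,
# the same five-square shape reappears one square down and one to the right.
glider = {(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)}
for _ in range(4):
    glider = step(glider)
print(sorted(glider))  # [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]
```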
The “moral” that excited everyone when this computer game was invented, in 1970, was that simplicity could beget complexity. It provided users with a computational version of the molecular primordial soup, with its ramifications intact but time collapsed. These days, of course, it’s hard not to see our real lives in the game’s simple, abstract terms. Variations on Life are used to model epidemics like the one we’re experiencing. The coronavirus is among the simplest life forms, living only to reproduce, occupying each host only to find the next, through some form of basic contact transmission. We are the squares, some of us occupied and some vacant, all of us doing what we can to avoid being a nearest, or even second-nearest, neighbor.
The inventor of the Game of Life, John Conway, is among those we’ve lost to the coronavirus this year. (He died, in April, of COVID-19.) I sometimes wonder if, given his ironic and dark sense of humor, he would’ve appreciated the symmetry. He was an extraordinarily creative mathematician, one who needed to see a problem as a puzzle or a game in order for it to seize his interest.
Early on, Conway made his name by solving a complicated puzzle about symmetries in twenty-four dimensions. For most of us, it’s easier to start with two. Imagine that you have a pile of identical Frisbees. You want to lay as many of them down on the floor as possible—no stacking allowed! Try it, and you’ll probably find quite quickly that setting them out in rows that are shifted just a bit, so that the boundary of one Frisbee dips into the cleavage between the two below, is the best that you can do. Nestle a few rows together in this way, and you’ll find that any given Frisbee is surrounded by six others. At this point, a mathematician might place an imaginary peg in the center of each of the six Frisbees surrounding a given one, then connect the pegs with imaginary lines. Do this, and you get a perfect hexagon, a shape that’s symmetrical in a number of ways: you can flip it across various axes or rotate it around its center in steps of sixty degrees, and, except for which corners are where, it remains unchanged.
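That group of symmetries is small enough to enumerate by machine. A short sketch in Python, treating each symmetry as nothing more than a reshuffling of the hexagon’s six corner labels:

```python
labels = tuple(range(6))  # the hexagon's corners, numbered counterclockwise

def rotate(corners, steps):
    # Turn the hexagon by `steps` sixty-degree increments.
    return tuple(corners[(i + steps) % 6] for i in range(6))

def flip(corners):
    # Reflect across the axis through corner 0 and the center.
    return tuple(corners[-i % 6] for i in range(6))

symmetries = set()
for steps in range(6):
    symmetries.add(rotate(labels, steps))        # six rotations
    symmetries.add(flip(rotate(labels, steps)))  # six reflections
print(len(symmetries))  # 12 distinct ways the hexagon maps onto itself
```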
In mathematical lingo, we’ve taken a few interesting steps. We started with a “packing problem”; by solving it, we uncovered a symmetrical shape; that shape, in turn, contains its own “group” of symmetries. We could take the same steps in three dimensions. Suppose that you replace the Frisbees with perfectly spherical, identical oranges. (A mathematician’s oranges are always perfect spheres.) Now we’re contemplating the so-called greengrocer’s problem—the question of the best way to stack fruit in a market. In the usual greengrocer’s arrangement, layers of oranges are stacked such that each orange touches twelve others. The three-dimensional polyhedron created when we connect the centers of the neighboring oranges also has its own, much larger, group of symmetries.
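A sketch in Python confirms the count of twelve, assuming the standard coordinate description of the greengrocer’s stacking: oranges centered at every point with whole-number coordinates summing to an even number (the “face-centered-cubic” lattice), where the oranges closest to the one at the origin sit at squared distance two.

```python
from itertools import product

# Every lattice point at squared distance 2 from the origin marks an
# orange that touches the orange centered there.
touching = [
    p for p in product(range(-2, 3), repeat=3)
    if sum(p) % 2 == 0 and sum(c * c for c in p) == 2
]
print(len(touching))  # 12: each orange touches twelve others
```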
The mathematics of symmetry is called “group theory,” and its modern origins are generally traced back to the nineteenth-century mathematician Évariste Galois, who—in one of the most romantic of mathematical legends—is said to have feverishly organized his definitive manuscripts the night before a duel in which he died. Galois wasn’t interested in symmetries in space, but in symmetries among and within solutions to equations. A given solution to an equation might have a mirror solution, differing only in its sign: √2 and -√2, the two solutions of x² = 2, for instance. Galois realized that the complexity involved in solving an equation was intimately related to the complexity of the “group” of its solutions’ symmetries. His discovery initiated over a century’s worth of work aimed at ferreting out groups of symmetries hidden in ever-more-complicated mathematical and geometric structures. By the late nineteen-sixties, mathematicians were racing to fill out a complete catalog of the basic building blocks of symmetry, the finite “simple” groups.
Conway, who had been hunting around for a good problem, had followed the work of the British mathematician John Leech, who had explored the packing of spheres in twenty-four-dimensional space. Leech had found that, in this fantastical grocery store, each sphere simultaneously touches 196,560 others. But the complete symmetries of the gemlike object obtained by connecting the centers of those neighbors were still unknown, and Conway decided to take a crack at finding them. He set out a deliberate schedule of work, anticipating that it would go on for weeks, but then blazed to a solution in a single, Galois-worthy night of intellectual frenzy: he found that his twenty-four-dimensional crystal was symmetrical in 8,315,553,613,086,720,000 distinct ways. This work would ultimately find applications in the creation of codes useful for communication between satellites and Earth. The so-called Conway Group, meanwhile, paved the way for the uncovering of an even larger group, which mathematicians call the Monster—a group of symmetries as large as the number of kilograms of matter in the observable universe, which lives in a space of over a hundred and ninety thousand dimensions. The Monster has helped mathematicians to understand prime numbers, and given physicists new insights into quantum gravity.
Conway was also a showman and a show-off and an intellectual competitor. A favorite parlor trick of his was to tell you the day of the week for any given date, something he could do faster than anyone else. At Princeton, he could usually be found not in his office—which resembled a mathematical apothecary shop hit by a tornado—but in the large and somewhat soulless common room of Fine Hall, the massive looming tower, on the edge of the Princeton campus, that is the home of the mathematics department. When I was an undergraduate math major at Princeton, in the early nineteen-eighties, the common room would come to life only in the mid-to-late afternoon, just as things were revving up for the daily “tea,” a small box-cookie reception roughly marking the time when most classes had ended and a few seminars were about to start. Conway would often hold court there, hard to miss, a cross between Rasputin and a medieval minstrel, loudly talking philosophy and mathematics, playing the board game Go, or engaging in some other kind of mathematical competition, surrounded by adoring and admiring students, faculty, and visitors. I was a shy and unhappy undergraduate, not a game player, and I would watch the small whirlwind of activity from a distance, eat my cookies, drink the terrible coffee, and then disappear back to the bowels of the mathematics library to work on my problem sets.
After graduating from college, I don’t think I saw Conway again until five years later, in 1989. I was in the audience at a conference at M.I.T., where Conway gave a lecture, titled “Computers and Frivolity,” to a packed house. I remember little about the content—something about the way in which a spirit of curiosity and fun, mixed with a little computing, could be a pathway to some deep mathematics. What I do remember quite clearly was that Conway gave the talk using an overhead projector with a single transparency; each time he filled the transparency, he picked it up and then, to the horrified delight of the audience, licked it clean, then resumed writing. To Conway, mathematics was a game—so much so that, later in his career, he discovered, or invented, a new class of numbers that can be infinitely large and infinitely small: the “surreal numbers,” which include the real ones.
Conway had both a disciplined and an undisciplined mind. He was childlike in many ways, and he took advantage of the kind of leeway that we grant to geniuses. His method was to make mathematics out of whatever caught his fancy, but to do so with laser focus. You and I might notice that brickwork often has a pattern to it; only Conway could turn that into a deep exploration of symmetry. (After I heard him speak on this subject, walks through Central Park were never the same.) Some say that his contributions to mathematics peaked with his discovery of the Monster. That’s a little like saying that, after “Anna Karenina,” it was all downhill for Tolstoy; still, Conway himself often fretted about losing his mathematical powers. In the early two-thousands, Conway was among the mathematicians my co-producers and I interviewed for a documentary, “The Math Life.” We had a broad and fascinating conversation, ranging from deep mathematics to word origins (“numb” and “number” are very likely connected!) and wordplay. He reflected on a life of intellectual privilege, the joys of teaching, and the wild highs and dark lows of perpetual thinking. Our talk of his achievements was tinged with melancholy as he reminisced about the “white hot” creative moments now in his rearview mirror. While Conway’s life was full of honors and mathematical achievement, it was also just a life, and a complicated one, as messy as his office: several marriages, bouts of depression, and even a suicide attempt. Mathematics was both a passion and an escape. “You know the saying ‘Euclid alone has looked on beauty bare’?” he asked me. (It’s a line from Edna St. Vincent Millay.) “Well, what does that mean? I think it means that, you know, in Euclidean geometry, because it’s stripped—stripped of cats and twigs and palaver—there’s just something pure, and clean, and simple, and exact, and precise.”
In Conway’s Game of Life, chaos emerges from order. In reality, it’s generally the other way around: life is lived forward and understood backward, which is to say that the search for order in randomness is a very human endeavor. In that sense, we’re all mathematicians—pattern seekers and pattern creators at heart, from the search for meaning in our terrestrial wanderings to the imposition of constellational structures as we scan the night sky. In the vast heavens over a pitch-black town in ancient Greece, how could your eye not connect a line of stars into Orion’s Belt, or a group of them into the Big and Little Dippers? There are mathematical manifestations of this impulse. The game of connect-the-dots that our forebears played with the stars is a precursor to a zone of inquiry that mathematicians call Ramsey theory—named for the British mathematician Frank Ramsey—which explores the conditions under which preordained structures inevitably appear in collections of random dots.
Social events can sometimes feel like settings populated by random people (or random points, to a mathematician). Here’s a question: how many people do you need to invite to a party to guarantee that three of them will be either mutual friends or mutual strangers? You might visualize the gathering as a network, drawing a line of one color between any two friends and a line of another color between any two strangers; a trio of mutual friends or mutual strangers then shows up as a triangle of a single color. The “Ramsey number” associated with this scenario tells us the minimum gathering size needed to guarantee such a triangle. In this case, five people is too few, but six will do the trick—so the Ramsey number for this odd social-engineering task is six.
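The numbers here are small enough for a computer to check outright. A sketch in Python, trying every possible friends-and-strangers arrangement among five guests and then among six:

```python
from itertools import combinations, product

def has_uniform_trio(n, relation):
    # Is some trio of guests all friends or all strangers?
    return any(
        relation[(a, b)] == relation[(a, c)] == relation[(b, c)]
        for a, b, c in combinations(range(n), 3)
    )

def trio_forced(n):
    # Check every friends/strangers arrangement among n guests.
    pairs = list(combinations(range(n), 2))
    return all(
        has_uniform_trio(n, dict(zip(pairs, bits)))
        for bits in product((True, False), repeat=len(pairs))
    )

print(trio_forced(5))  # False: some arrangement of five avoids a uniform trio
print(trio_forced(6))  # True: all 32,768 arrangements of six contain one
```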
For more complex scenarios, Ramsey numbers are notoriously difficult to calculate. They seem to require the listing out, for each guest, of the other guests they do and don’t know—an enumeration that quickly becomes an unmanageable task. Instead of making lists, mathematicians have tended to reframe the question in terms of an upper bound: we might conclude that the Ramsey number, whatever it is, is no higher than a certain other number. Finding these bounds can quickly take us into the numerical stratosphere. It was through such a quest that Ron Graham, who also died this year, arrived at Graham’s number, once called “the largest number ever to have a use.”
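The reframing has a classic form: a bound, due to Erdős and Szekeres, caps each Ramsey number by the sum of two smaller ones, so that estimates can be stacked up recursively:

$$R(s, t) \le R(s-1, t) + R(s, t-1)$$

For the party problem, it gives R(3,3) ≤ R(2,3) + R(3,2) = 3 + 3 = 6, which the brute-force check above shows is exact.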
Graham’s early life was one of great peregrination in which a love of mathematics was a steady and portable source of comfort. A few years of school here and a few there led to entry into the University of Chicago at fifteen, through a program for precocious teen-agers; he studied philosophy and literature in the school’s “great books” program—Carl Sagan was a classmate—then left, a few credits shy of a degree, to study mathematics at Berkeley. He left Berkeley early, too, to enlist in the Air Force—“The brochures looked great!” he told me, when I interviewed him for “The Math Life”—and, while stationed in Fairbanks, Alaska, earned a degree in physics as a part-time student. Later, he would return to Berkeley to finish his doctorate in math, becoming one of the great “combinatorialists” of our time. Many parlor-room questions are combinatorial: “How many ways can we seat these people at this table so that no one is sitting next to someone she knows?” But there are less familiar questions, and the beautiful formulas that answer them inform probability theory and computer science.
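At parlor scale, such questions yield to brute force. A sketch in Python, with a hypothetical guest list of six and three acquainted pairs, counts the seatings around a round table in which no acquainted pair sits side by side:

```python
from itertools import permutations

def seatings_avoiding(n, acquainted):
    # Count circular seatings in which no two acquainted guests are
    # adjacent. Guest 0 is pinned to one chair so that rotations of
    # the whole table aren't counted as different seatings.
    count = 0
    for rest in permutations(range(1, n)):
        seats = (0,) + rest
        if all(
            frozenset((seats[i], seats[(i + 1) % n])) not in acquainted
            for i in range(n)
        ):
            count += 1
    return count

# A hypothetical party: guests 0 through 5, three acquainted pairs.
acquainted = {frozenset(p) for p in [(0, 1), (2, 3), (4, 5)]}
print(seatings_avoiding(6, acquainted))  # 32 of the 120 distinct seatings
```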
Graham’s number mixes party planning with geometry. Imagine a party held on a jungle gym with eight guests; each guest sits on a corner of a cube. By slicing the cube through any two parallel edges, it’s possible to isolate a four-person “table”—a plane on which four guests sit. Six of these “four-tops” are made by the sides of the cube; six more are made by diagonal slices through it. You might ask yourself whether, at such a party, you’re guaranteed to find that the guests at any of these four-tops will either all know one another or all be strangers. The answer is no: with eight guests at a cubical party, such a social arrangement isn’t guaranteed.
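A sketch in Python makes the cubical party concrete. It assumes coordinates for the corners, finds the twelve four-tops by checking which sets of four corners share a plane, and then searches at random for an arrangement that defeats all of them; an exhaustive check of all 268,435,456 possible arrangements would work, too, just more slowly.

```python
import random
from itertools import combinations

# The eight guests sit at the corners of a unit cube.
corners = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]

def coplanar(p, q, r, s):
    # Four points share a plane exactly when the box spanned by the
    # vectors from p to the other three has zero volume.
    u, v, w = (tuple(b[i] - p[i] for i in range(3)) for b in (q, r, s))
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0])) == 0

# The twelve "four-tops": six faces plus six diagonal rectangles.
tops = [quad for quad in combinations(corners, 4) if coplanar(*quad)]
assert len(tops) == 12

def has_uniform_top(relation):
    # Does some four-top seat guests who are all friends or all strangers?
    return any(
        len({relation[frozenset(pair)] for pair in combinations(top, 2)}) == 1
        for top in tops
    )

pairs = [frozenset(pair) for pair in combinations(corners, 2)]
random.seed(1)
tries = 0
while True:
    tries += 1
    relation = {pair: random.random() < 0.5 for pair in pairs}
    if not has_uniform_top(relation):
        print(f"no uniform four-top after {tries} random tries")
        break
```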