The Great Way Is Simplicity — von Neumann’s The Computer and the Brain

John von Neumann is often called the father of the computer. That title is, of course, an honor of the highest order, especially now that computers have become indispensable tools, perhaps even the key to reaching the future. In that sense, the title carries even greater weight today. But it does not begin to encompass the full scope of von Neumann's achievements. At his core, he was a mathematician, with major contributions to functional analysis, ergodic theory, geometry, topology, and numerical analysis, among many other fields. Beyond computing, he was also an important participant in the American atomic bomb and hydrogen bomb projects, one of the key founders of quantum mechanics, and one of the creators of game theory. It is no exaggeration to say that in several of the most important directions of modern science, von Neumann left his mark.
As a Jewish intellectual born in Hungary, von Neumann also benefited from a certain degree of historical luck in the course of his brilliant career. Before the outbreak of the Second World War, he was fortunate enough to conduct research in Germany, then a center of intense academic activity, and on the eve of the war he moved to the United States, which would come to represent the future. After the success of the atomic bomb project, he was one of only four scientists who supported the hydrogen bomb program, and I suspect that this may have had something to do with his Jewish identity. He also proposed the use of atomic weapons in a “preventive war” to stop the Soviet Union from developing its own bomb. That advocacy of war has often been criticized, yet the Soviet Union did in fact build an atomic bomb just a few years later, and for a long time afterward the world was held in the bipolar tension of the Cold War. Like many of his contemporaries who had lived through the First World War, he concluded as early as 1935 that another war was inevitable, and that France, despite the Maginot Line, would prove unable to withstand it. Because of the policy of appeasement, the war began a year later than he had predicted, but France did indeed perform disastrously in the Second World War. With that level of foresight, it is hardly surprising that after the war he was hired as an adviser by the U.S. government and by multiple think tanks.
Von Neumann carried many titles: mathematician, physicist, computer scientist, and more. As with most scientists, the bulk of his achievements is embodied in papers and practical results. Yet when such a scientist occasionally writes a short work not packed with formulas and proofs, it can be all the more astonishing to the lay reader. This slim volume, The Computer and the Brain, makes one sigh in admiration at the truth that the great way is simplicity. Only a true master can use plain, unadorned language to express ideas that are at once profound and vivid. The book inevitably brings to mind Richard Feynman's The Pleasure of Finding Things Out; the two seem to share a certain spirit.
As the title suggests, this book is primarily concerned with the similarities and differences between the computer and the human brain. Through analysis and comparison, von Neumann tries to deepen our understanding of how each of them works. The serial, digital computer discussed in the book has since become the standard model of computing. But after finishing the book, one cannot help wondering: while quantum computing continues to advance, might there also be unexpected gains in revisiting the analog and parallel computers that von Neumann mentioned? Unfortunately, the author does not go into much architectural or practical detail about these latter forms of computation. Perhaps that is one reason later generations have not known where to begin.
Scientific progress requires the imagination and creativity of genius, but no one can say when genius will appear. Today, expectations for the future often fall into two opposite camps: optimism and pessimism. Optimists argue that humanity’s achievements over the past two centuries surpass the accumulated gains of thousands of years of civilization, and that the progress of recent decades exceeds even that of the previous two hundred years. Human development, they believe, seems to follow something like Moore’s Law, doubling rapidly as computers do, and will soon lead toward a world of universal harmony. Pessimists, by contrast, argue that traditional scientific progress has depended largely on the sudden appearance of extraordinary geniuses, and the emergence and distribution of such genius is fundamentally random. In their view, the technological progress of recent decades is still benefiting from the dividends of the last great wave of scientific development—the foundational breakthroughs made by scientists of the generation of Einstein and von Neumann. But those dividends are now diminishing rapidly. Without support in energy, scientific research, and other key areas, the future development of human society is not merely uncertain, but full of danger.
Perhaps, at a moment like this—when the basic sciences have not seen major breakthroughs—it is still too early to make confident judgments about the future. And if genius does not arrive as we hope, then for the broader community of researchers, perhaps reading more of the works left behind by those earlier geniuses may help. From their brilliance and their unbounded imagination, one may yet draw a measure of inspiration.


