Genius: The Life and Science of Richard Feynman
What a strange and bewildering literature grew up around the term genius—defining it, analyzing it, categorizing it, rationalizing and reifying it. Commentators have contrasted it with such qualities as (mere) talent, intellect, imagination, originality, industriousness, sweep of mind and elegance of style; or have shown how genius is composed of these in various combinations. Psychologists and philosophers, musicologists and art critics, historians of science and scientists themselves have all stepped into this quagmire, a capacious one. Their several centuries of labor have produced no consensus on any of the necessary questions. Is there such a quality? If so, where does it come from? (A glial surplus in Brodmann area 39? A doting, faintly unsuccessful father who channels his intellectual ambition into his son? A frightful early encounter with the unknown, such as the death of a sibling?) When otherwise sober scientists speak of the genius as magician, wizard, or superhuman, are they merely indulging in a flight of literary fancy? When people speak of the borderline between genius and madness, why is it so evident what they mean? And a question that has barely been asked (the where-are-the-.400-hitters question): Why, as the pool of available humans has risen from one hundred million to one billion to five billion, has the production of geniuses—Shakespeares, Newtons, Mozarts, Einsteins—seemingly choked off to nothing, genius itself coming to seem like the property of the past?
“Enlightened, penetrating, and capacious minds,” as William Duff chose to put it two hundred years ago, speaking of such exemplars as Homer, Quintilian, and Michelangelo in one of a string of influential essays by mid-eighteenth-century Englishmen that gave birth to the modern meaning of the word genius. Earlier, it had meant spirit, the magical spirit of a jinni or more often the spirit of a nation. Duff and his contemporaries wished to identify genius with the godlike power of invention, of creation, of making what never was before, and to do so they had to create a psychology of imagination: imagination with a “RAMBLING and VOLATILE power”; imagination “perpetually attempting to soar” and “apt to deviate into the mazes of error.”
Imagination is that faculty whereby the mind not only reflects on its own operations, but which assembles the various ideas conveyed to the understanding by the canal of sensation, and treasured up in the repository of the memory, compounding or disjoining them at pleasure; and which, by its plastic power of inventing new associations of ideas, and of combining them with infinite variety, is enabled to present a creation of its own, and to exhibit scenes and objects which never existed in nature.
These were qualities that remained two centuries later at the center of cognitive scientists’ efforts to understand creativity: the mind’s capacity for self-reflection, self-reference, self-comprehension; the dynamical and fluid creation of concepts and associations. The early essayists on genius, writing with a proper earnestness, attempting to reduce and regularize a phenomenon with (they admitted) an odor of the inexplicable, nevertheless saw that genius allowed a certain recklessness, even a lack of craftsmanship. Genius seemed natural, unlearned, uncultivated. Shakespeare was—“in point of genius,” Alexander Gerard wrote in 1774—Milton’s superior, despite a “defective” handling of poetic details. The torrent of analyses and polemics on genius that appeared in those years introduced a rhetoric of ranking and comparing that became a standard method of the literature. Homer versus Virgil, Milton versus Virgil, Shakespeare versus Milton. The results—a sort of tennis ladder for the genius league—did not always wear well with the passage of time. Newton versus Bacon? In Gerard’s view Newton’s discoveries amounted to a filling in of a framework developed with more profound originality by Bacon—“who, without any assistance, sketched out the whole design.” Still, there were those bits of Newtonian mathematics to consider. On reflection Gerard chose to leave for posterity “a question of very difficult solution, which of the two had the greatest genius.”
He and his contemporary essayists had a purpose. By understanding genius, rationalizing it, celebrating it, and teasing out its mechanisms, perhaps they could make the process of discovery and invention less accidental. In later times that motivation has not disappeared. More overtly than ever, the nature of genius—genius as the engine of scientific discovery—has become an issue bound up with the economic fortunes of nations. Amid the vast modern network of universities, corporate laboratories, and national science foundations has arisen an awareness that the best financed and best organized of research enterprises have not learned to engender, perhaps not even to recognize, world-turning originality.
Genius, Gerard summed up in 1774, “is confessed to be a subject of capital importance, without the knowledge of which a regular method of invention cannot be established, and useful discoveries must continue to be made, as they have generally been made hitherto, merely by chance.” Hitherto, as well. In our time he continues to be echoed by historians of science frustrated by the sheer ineffability of it all. But they keep trying to replace awe with understanding. J. D. Bernal said in 1939:
It is one of the hopes of the science of science that, by careful analysis of past discovery, we shall find a way of separating the effects of good organization from those of pure luck, and enabling us to operate on calculated risks rather than blind chance.
Yet how could anyone rationalize a quality as fleeting and accident-prone as a genius’s inspiration: Archimedes and his bath, Newton and his apple? People love stories about geniuses as alien heroes, possessing a quality beyond human understanding, and scientists may be the world’s happiest consumers of such stories. A modern example:
A physicist studying quantum field theory with Murray Gell-Mann at the California Institute of Technology in the 1950s, before standard texts have become available, discovers unpublished lecture notes by Richard Feynman, circulating samizdat style. He asks Gell-Mann about them. Gell-Mann says no, Dick’s methods are not the same as the methods used here. The student asks, well, what are Feynman’s methods? Gell-Mann leans coyly against the blackboard and says, Dick’s method is this. You write down the problem. You think very hard. (He shuts his eyes and presses his knuckles parodically to his forehead.) Then you write down the answer.
The same story appeared over and over again. It was an old genre. From an 1851 tract titled Genius and Industry:
(A professor from the University of Cambridge calls upon a genius of mathematics working in Manchester as a lowly clerk.) “… from Geometry to Logarithms, and to the Differential and Integral Calculus; and thence again to questions the most foreign and profound: at last, a question was proposed to the poor clerk—a question which weeks had been required to solve. Upon a simple slip of paper it was answered immediately. ‘But how,’ said the Professor, ‘do you work this? show me the rule! … The answer is correct but you have reached it by a different way.’
“‘I have worked it,’ said the clerk, ‘from a rule in my own mind. I cannot show you the law—I never saw it myself; the law is in my mind.’
“‘Ah!’ said the Professor, ‘if you talk of a law within your mind, I have done; I cannot follow you there.’”
Magicians again. As Mark Kac said: “… The working of their minds is for all intents and purposes incomprehensible. Even after we understand what they have done, the process by which they have done it is completely dark.” The notion places a few individuals at the margin of their community—the impractical margin, since the stock in trade of the scientist is the method that can be transferred from one practitioner to the next.
If the most distinguished physicists and mathematicians believe in the genius as magician, it is partly for psychological protection. A merely excellent scientist could suffer an unpleasant shock when he discussed his work with Feynman. It happened again and again: physicists would wait for an opportunity to get Feynman’s judgment of a result on which they had staked weeks or months of their career. Typically Feynman would refuse to allow them to give a full explanation. He said it spoiled his fun. He would let them describe just the outline of the problem before he would jump up and say, Oh, I know that … and scrawl on the blackboard not his visitor’s result, A, but a harder, more general theorem, X. So A (about to be mailed, perhaps, to the Physical Review) was merely a special case. This could cause pain. Sometimes it was not clear whether Feynman’s lightning answers came from instantaneous calculation or from a storehouse of previously worked-out—and unpublished—knowledge. The astrophysicist Willy Fowler proposed at a Caltech seminar in the 1960s that quasars—mysterious blazing radiation sources lately discovered in the distant sky—were supermassive stars, and Feynman immediately rose, astonishingly, to say that such objects would be gravitationally unstable. Furthermore, he said that the instability followed from general relativity. The claim required a calculation of the subtle countervailing effects of stellar forces and relativistic gravity. Fowler thought Feynman was talking through his hat. A colleague later discovered that Feynman had done a hundred pages of work on the problem years before. The Chicago astrophysicist Subrahmanyan Chandrasekhar independently produced Feynman’s result—it was part of the work for which he won a Nobel Prize twenty years later. Feynman himself never bothered to publish. Someone with a new idea always risked finding, as one colleague said, “that Feynman had signed the guest book and already left.”
A great physicist who accumulated knowledge without taking the trouble to publish could be a genuine danger to his colleagues. At best it was unnerving to learn that one’s potentially career-advancing discovery had been, to Feynman, below the threshold of publishability. At worst it undermined one’s confidence in the landscape of the known and not known. There was an uneasy subtext to the genre of story prompted by this habit. It was said of Lars Onsager, for example, that a visitor would ask him about a new result; sitting in his office chair he would say, I believe that is correct; then he would bend forward diffidently to open a file drawer, glance sidelong at a long-buried page of notes, and say, Yes, I thought so; that is correct. This was not always precisely what the visitor had hoped to hear.
A person with a mysterious storehouse of unwritten knowledge was a wizard. So was a person with the power to tease from nature its hidden secrets—a scientist, that is. The modern scientist’s view of his quest harked back to something ancient and cabalistic: laws, rules, symmetries hidden just beneath the visible surface. Sometimes this view of the search for knowledge became overwhelming, even oppressive. John Maynard Keynes, facing a small audience in a darkened room at Cambridge a few years before his death, spoke of Newton as “this strange spirit, who was tempted by the Devil to believe … that he could reach all the secrets of God and Nature by the pure power of mind—Copernicus and Faustus in one.”
Why do I call him a magician? Because he looked on the whole universe and all that is in it as a riddle, as a secret which could be read by applying pure thought to certain evidence, certain mystic clues which God had laid about the world to allow a sort of philosopher’s treasure hunt to this esoteric brotherhood… . He did read the riddle of the heavens. And he believed that by the same powers of his introspective imagination he would read the riddle of the Godhead, the riddle of past and future events divinely foreordained, the riddle of the elements and their constitution… .
In his audience, intently absorbing these words, aware of the cold and the gloom and the seeming exhaustion of the speaker, was the young Freeman Dyson. Dyson came to accept much of Keynes’s view of genius, winnowing away the seeming mysticism. He made the case for magicians in the calmest, most rational way. No “magical mumbo-jumbo,” he wrote. “I am suggesting that anyone who is transcendentally great as a scientist is likely also to have personal qualities that ordinary people would consider in some sense superhuman.” The greatest scientists are deliverers and destroyers, he said. Those are myths, of course, but myths are part of the reality of the scientific enterprise.
When Keynes, in that Cambridge gloom, described Newton as a wizard, he was actually pressing back to a moderate view of genius—for after the eighteenth century’s sober tracts had come a wild turning. Where the first writers on genius had noticed in Homer and Shakespeare a forgivable disregard for the niceties of prosody, the romantics of the nineteenth century saw powerful, liberating heroes, throwing off shackles, defying God and convention. They also saw a bent of mind that could turn fully pathological. Genius was linked with insanity—was insanity. That feeling of divine inspiration, the breath of revelation seemingly from without, actually came from within, where melancholy and madness twisted the brain. The roots of this idea were old. “Oh! how near are genius and madness!” Denis Diderot had written. “… Men imprison them and chain them, or raise statues to them.” It was a side effect of the change in focus from God-centeredness to human-centeredness. The very notion of revelation, in the absence of a Revealer, became disturbing, particularly to those who experienced it: “… something profoundly convulsive and disturbing suddenly becomes visible and audible with indescribable definiteness and exactness,” Friedrich Nietzsche wrote. “One hears—one does not seek; one takes—one does not ask who gives: a thought flashes out like lightning… .” Genius now suggested Charles-Pierre Baudelaire or Ludwig van Beethoven, flying off the tracks of normality. Crooked roads, William Blake had said: “Improvement makes strait roads; but the crooked roads without Improvement are roads of Genius.”
An 1891 treatise on genius by Cesare Lombroso listed some associated symptoms. Degeneration. Rickets. Pallor. Emaciation. Left-handedness. A sense of the mind as a cauldron in tumult was emerging in European culture, along with an often contradictory hodgepodge of psychic terminology, all awaiting the genius of Freud to provide a structure and a coherent jargon. In the meantime: Misoneism. Vagabondage. Unconsciousness. More presumed clues to genius. Hyperesthesia. Amnesia. Originality. Fondness for special words. “Between the physiology of the man of genius, therefore, and the pathology of the insane,” Lombroso concluded, “there are many points of coincidence… .” The genius, disturbed as he is, makes errors and wrong turns that the ordinary person avoids. Still, these madmen, “despising and overcoming obstacles which would have dismayed the cool and deliberate mind—hasten by whole centuries the unfolding of truth.”
The notion never vanished; in fact it softened into a cliché. Geniuses display an undeniable obsessiveness resembling, at times, monomania. Geniuses of certain kinds—mathematicians, chess players, computer programmers—seem, if not mad, at least lacking in the social skills most easily identified with sanity. Nevertheless, the lunatic-genius-wizard did not play as well in America, notwithstanding the relatively unbuttoned examples of writers like Whitman and Melville. There was a reason. American genius as the nineteenth century neared its end was not busy making culture, playing with words, creating music and art, or otherwise impressing the academy. It was busy sending its output to the patent office. Alexander Graham Bell was a genius. Eli Whitney and Samuel Morse were geniuses. Let European romantics celebrate the genius as erotic hero (Don Juan) or the genius as martyr (Werther). Let them bend their definitions to accommodate the genius composers who succeeded Mozart, with their increasingly direct pipelines to the emotions. In America what newspapers already called the machine age was under way. The consummate genius, the genius who defined the word for the next generation, was Thomas Alva Edison.
By his own description he was no wizard, this Wizard of Menlo Park. Anyone who knew anything about Edison knew that his genius was ninety-nine percent perspiration. The stories that defined his style were not about inspiration in the mode of the Newtonian apple. They spoke of exhaustive, laborious trial and error: every conceivable lamp filament, from human hair to bamboo fiber. “I speak without exaggeration,” Edison declared (certainly exaggerating), “when I say that I have constructed three thousand different theories in connection with the electric light, each one of them reasonable and apparently likely to be true.” He added that he had methodically disproved 2,998 of them by experiment. He claimed to have carried out fifty thousand individual experiments on a particular type of battery. He had a classic American education: three months in a Michigan public school. The essential creativity that led him to the phonograph, the electric light, and more than a thousand other patented inventions was deliberately played down by those who built and those who absorbed his legend. Perhaps understandably so—for after centuries in which a rationalizing science had systematically drained magic from the world, the machine-shop inventions of Edison and other heroes were now loosing a magic with a frightening, transforming power. This magic buried itself in the walls of houses or beamed itself invisibly through the air.
“Mr. Edison is not a wizard,” reported a 1917 biography.
Like all people who have prodigiously assisted civilization, his processes are clear, logical and normal.
Wizardry is the expression of superhuman gifts and, as such, is an impossible thing… .
And yet, Mr. Edison can bid the voices of the dead to speak, and command men in their tombs to pass before our eyes.
“Edison was not a wizard,” announced a 1933 magazine article. “If he had what seems suspiciously like a magic touch, it was because he was markedly in harmony with his environment… .” And there the explication of Edisonian genius came more or less to an end. All that remained was to ask—but few did—one of those impossible late-night what if questions: What if Edison had never lived? What if this self-schooled, indefatigable mind with its knack for conceiving images of new devices, methods, processes had not been there when the flood began to break? The question answers itself, for it was a flood that Edison rode. Electricity had burst upon a world nearing the limits of merely mechanical ingenuity. The ability to understand and control currents of electrons had suddenly made possible a vast taxonomy of new machines—telegraphs, dynamos, lights, telephones, motors, heaters, devices to sew, grind, saw, toast, iron, and suck up dirt, all waiting at the misty edge of potentiality. No sooner had Hans Christian Oersted noticed, in 1820, that a current could move a compass needle than inventors—not just Samuel Morse but André-Marie Ampère and a half-dozen others—conceived of telegraphy. Even more people invented generators, and by the time enough pieces of technology had accumulated to make television possible, no one inventor could plausibly serve as its Edison.