Mere observations, however, are not knowledge. In order to understand the universe, we need to connect observations into comprehensive theories. Earlier traditions usually formulated their theories in terms of stories. Modern science uses mathematics.
There are very few equations, graphs and calculations in the Bible, the Qur’an, the Vedas or the Confucian classics. When traditional mythologies and scriptures laid down general laws, these were presented in narrative rather than mathematical form. Thus a fundamental principle of Manichaean religion asserted that the world is a battleground between good and evil. An evil force created matter, while a good force created spirit. Humans are caught between these two forces, and should choose good over evil. Yet the prophet Mani made no attempt to offer a mathematical formula that could be used to predict human choices by quantifying the respective strength of these two forces. He never calculated that ‘the force acting on a man is equal to the acceleration of his spirit divided by the mass of his body’.
This is exactly what scientists seek to accomplish. In 1687, Isaac Newton published The Mathematical Principles of Natural Philosophy, arguably the most important book in modern history. Newton presented a general theory of movement and change. The greatness of Newton’s theory was its ability to explain and predict the movements of all bodies in the universe, from falling apples to shooting stars, using three very simple mathematical laws:
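1. A body remains at rest, or continues to move in a straight line at constant velocity, unless acted upon by an external force.
2. The acceleration of a body is proportional to the net force acting on it and inversely proportional to its mass (in modern notation, F = ma).
3. For every action there is an equal and opposite reaction.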
Henceforth, anyone who wished to understand and predict the movement of a cannonball or a planet simply had to make measurements of the object’s mass, direction and acceleration, and the forces acting on it. By inserting these numbers into Newton’s equations, the future position of the object could be predicted. It worked like magic. Only around the end of the nineteenth century did scientists come across a few observations that did not fit well with Newton’s laws, and these led to the next revolutions in physics – the theory of relativity and quantum mechanics.
Newton showed that the book of nature is written in the language of mathematics. Some chapters boil down to a clear-cut equation; but scholars who attempted to reduce biology, economics and psychology to neat Newtonian equations have discovered that these fields have a level of complexity that makes such an aspiration futile. This did not mean, however, that they gave up on mathematics. A new branch of mathematics was developed over the last 200 years to deal with the more complex aspects of reality: statistics.
In 1744, two Presbyterian clergymen in Scotland, Alexander Webster and Robert Wallace, decided to set up a life-insurance fund that would provide pensions for the widows and orphans of dead clergymen. They proposed that each of their church’s ministers would pay a small portion of his income into the fund, which would invest the money. If a minister died, his widow would receive dividends on the fund’s profits. This would allow her to live comfortably for the rest of her life. But to determine how much the ministers had to pay in so that the fund would have enough money to live up to its obligations, Webster and Wallace had to be able to predict how many ministers would die each year, how many widows and orphans they would leave behind, and by how many years the widows would outlive their husbands.
Take note of what the two churchmen did not do. They did not pray to God to reveal the answer. Nor did they search for an answer in the Holy Scriptures or among the works of ancient theologians. Nor did they enter into an abstract philosophical disputation. Being Scots, they were practical types. So they contacted a professor of mathematics from the University of Edinburgh, Colin Maclaurin. The three of them collected data on the ages at which people died and used these to calculate how many ministers were likely to pass away in any given year.
Their work was founded on several recent breakthroughs in the fields of statistics and probability. One of these was Jacob Bernoulli’s Law of Large Numbers. Bernoulli had codified the principle that while it might be difficult to predict with certainty a single event, such as the death of a particular person, it was possible to predict with great accuracy the average outcome of many similar events. That is, while Maclaurin could not use maths to predict whether Webster and Wallace would die next year, he could, given enough data, tell Webster and Wallace how many Presbyterian ministers in Scotland would almost certainly die next year. Fortunately, they had ready-made data that they could use. Actuarial tables published fifty years previously by Edmond Halley proved particularly useful. Halley had analysed records of 1,238 births and 1,174 deaths that he obtained from the Silesian city of Breslau (modern Wrocław). Halley’s tables made it possible to see that, for example, a twenty-year-old person has a 1:100 chance of dying in a given year, but a fifty-year-old person has a 1:39 chance.
Processing these numbers, Webster and Wallace concluded that, on average, there would be 930 living Scottish Presbyterian ministers at any given moment, and an average of twenty-seven ministers would die each year, eighteen of whom would be survived by widows. Five of those who did not leave widows would leave orphaned children, and two of those survived by widows would also be outlived by children from previous marriages who had not yet reached the age of sixteen. They further computed how much time was likely to go by before the widows’ death or remarriage (in both these eventualities, payment of the pension would cease). These figures enabled Webster and Wallace to determine how much money the ministers who joined their fund had to pay in order to provide for their loved ones. By contributing £2 12s. 2d. a year, a minister could guarantee that his widowed wife would receive at least £10 a year – a hefty sum in those days. If he thought that was not enough he could choose to pay in more, up to a level of £6 11s. 3d. a year – which would guarantee his widow the even more handsome sum of £25 a year.
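To make the logic of the two calculations concrete, here is a minimal sketch in Python. Only the 930 ministers, the eighteen new widows a year, the £10 annuity and Halley’s 1:100 and 1:39 odds come from the text; the age breakdown, the elderly cohort’s mortality rate and the widows’ average years on the pension are illustrative assumptions, and the sketch ignores the fund’s investment income.

```python
# A rough sketch of Webster and Wallace's two calculations.
# Figures marked "assumed" are illustrative stand-ins, not historical data.

def expected_deaths(cohorts):
    """Law of Large Numbers in practice: summing count * annual mortality
    rate over age cohorts gives a reliable expected number of deaths,
    even though no individual death can be predicted."""
    return sum(count * rate for count, rate in cohorts)

# The 930 ministers split into age cohorts (the split and the last rate
# are assumed; 1:100 and 1:39 are Halley's odds quoted in the text).
roster = [
    (300, 1 / 100),  # younger ministers
    (400, 1 / 39),   # ministers around fifty
    (230, 1 / 15),   # elderly ministers (assumed rate)
]

print(f"Expected deaths per year: {expected_deaths(roster):.0f}")
# ~29 here, in the neighbourhood of the 27 Webster and Wallace computed.

# Pricing the pension: premiums from all living ministers must cover
# annuities to the steady-state stock of widows on the books.
ministers = 930           # from the text
new_widows_per_year = 18  # from the text
years_on_pension = 20     # assumed average years until death or remarriage
annuity = 10              # £10 a year per widow, from the text

widow_stock = new_widows_per_year * years_on_pension
premium = widow_stock * annuity / ministers
print(f"Required annual premium per minister: £{premium:.2f}")
# ~£3.87; the historical £2 12s. 2d. was lower because the fund
# invested the contributions rather than holding them idle.
```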
According to their calculations, by the year 1765 the Fund for a Provision for the Widows and Children of the Ministers of the Church of Scotland would have capital totalling £58,348. Their calculations proved amazingly accurate. When that year arrived, the fund’s capital stood at £58,347 – just £1 less than the prediction! This was even better than the prophecies of Habakkuk, Jeremiah or St John. Today, Webster and Wallace’s fund, known simply as Scottish Widows, is one of the largest pension and insurance companies in the world. With assets worth £100 billion, it insures not only Scottish widows, but anyone willing to buy its policies.7
Probability calculations such as those used by the two Scottish ministers became the foundation not merely of actuarial science, which is central to the pension and insurance business, but also of the science of demography (founded by another clergyman, the Anglican Robert Malthus). Demography in its turn was the cornerstone on which Charles Darwin (who almost became an Anglican pastor) built his theory of evolution. While there are no equations that predict what kind of organism will evolve under a specific set of conditions, geneticists use probability calculations to compute the likelihood that a particular mutation will spread in a given population. Similar probabilistic models have become central to economics, sociology, psychology, political science and the other social and natural sciences. Even physics eventually supplemented Newton’s classical equations with the probability clouds of quantum mechanics.
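A one-function sketch may help show what such a genetic probability calculation looks like. It uses Kimura’s classic diffusion approximation for the chance that a new mutation spreads through a whole population; the formula is my choice of illustration, not one cited in the text.

```python
import math

def fixation_probability(N, s):
    """Kimura's diffusion approximation: the probability that a single new
    mutation with selective advantage s eventually spreads through ("fixes
    in") a diploid population of effective size N. A neutral mutation
    (s = 0) fixes with probability 1 / (2N)."""
    if s == 0:
        return 1 / (2 * N)
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-4 * N * s))

print(fixation_probability(10_000, 0.0))   # neutral: 0.00005
print(fixation_probability(10_000, 0.01))  # 1% advantage: ~0.0198, roughly 2s
```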
We need merely look at the history of education to realise how far this process has taken us. Throughout most of history, mathematics was an esoteric field that even educated people rarely studied seriously. In medieval Europe, logic, grammar and rhetoric formed the educational core, while the teaching of mathematics seldom went beyond simple arithmetic and geometry. Nobody studied statistics. The undisputed monarch of all sciences was theology.
Today few students study rhetoric; logic is restricted to philosophy departments, and theology to seminaries. But more and more students are motivated – or forced – to study mathematics. There is an irresistible drift towards the exact sciences – defined as ‘exact’ by their use of mathematical tools. Even fields of study that were traditionally part of the humanities, such as the study of human language (linguistics) and the human psyche (psychology), rely increasingly on mathematics and seek to present themselves as exact sciences. Statistics courses are now part of the basic requirements not just in physics and biology, but also in psychology, sociology, economics and political science.
In the course catalogue of the psychology department at my own university, the first required course in the curriculum is ‘Introduction to Statistics and Methodology in Psychological Research’. Second-year psychology students must take ‘Statistical Methods in Psychological Research’. Confucius, Buddha, Jesus and Muhammad would have been bewildered if you told them that in order to understand the human mind and cure its illnesses you must first study statistics.
Knowledge is Power
Most people have a hard time digesting modern science because its mathematical language is difficult for our minds to grasp, and its findings often contradict common sense. Out of the 7 billion people in the world, how many really understand quantum mechanics, cell biology or macroeconomics? Science nevertheless enjoys immense prestige because of the new powers it gives us. Presidents and generals may not understand nuclear physics, but they have a good grasp of what nuclear bombs can do.
In 1620 Francis Bacon published a scientific manifesto titled The New Instrument (better known by its Latin name, Novum Organum). In it he argued that ‘knowledge is power’. The real test of ‘knowledge’ is not whether it is true, but whether it empowers us. Scientists usually assume that no theory is 100 per cent correct. Consequently, truth is a poor test for knowledge. The real test is utility. A theory that enables us to do new things constitutes knowledge.
Over the centuries, science has offered us many new tools. Some are mental tools, such as those used to predict death rates and economic growth. Even more important are technological tools. The connection forged between science and technology is so strong that today people tend to confuse the two. We often think that it is impossible to develop new technologies without scientific research, and that there is little point in research if it does not result in new technologies.
In fact, the relationship between science and technology is a very recent phenomenon. Prior to 1500, science and technology were totally separate fields. When Bacon connected the two in the early seventeenth century, it was a revolutionary idea. During the seventeenth and eighteenth centuries this relationship tightened, but the knot was tied only in the nineteenth century. Even in 1800, most rulers who wanted a strong army, and most business magnates who wanted a successful business, did not bother to finance research in physics, biology or economics.
I don’t mean to claim that there is no exception to this rule. A good historian can find precedent for everything. But an even better historian knows when these precedents are but curiosities that cloud the big picture. Generally speaking, most premodern rulers and business people did not finance research about the nature of the universe in order to develop new technologies, and most thinkers did not try to translate their findings into technological gadgets. Rulers financed educational institutions whose mandate was to spread traditional knowledge for the purpose of buttressing the existing order.
Here and there people did develop new technologies, but these were usually created by uneducated craftsmen using trial and error, not by scholars pursuing systematic scientific research. Cart manufacturers built the same carts from the same materials year in year out. They did not set aside a percentage of their annual profits in order to research and develop new cart models. Cart design occasionally improved, but it was usually thanks to the ingenuity of some local carpenter who never set foot in a university and did not even know how to read.
This was true of the public as well as the private sector. Whereas modern states call in their scientists to provide solutions in almost every area of national policy, from energy to health to waste disposal, ancient kingdoms seldom did so. The contrast between then and now is most pronounced in weaponry. When outgoing President Dwight Eisenhower warned in 1961 of the growing power of the military-industrial complex, he left out a part of the equation. He should have alerted his country to the military-industrial-scientific complex, because today’s wars are scientific productions. The world’s military forces initiate, fund and steer a large part of humanity’s scientific research and technological development.
When World War One bogged down into interminable trench warfare, both sides called in the scientists to break the deadlock and save the nation. The men in white answered the call, and out of the laboratories rolled a constant stream of new wonder-weapons: combat aircraft, poison gas, tanks, submarines and ever more efficient machine guns, artillery pieces, rifles and bombs.
33. German V-2 rocket ready to launch. It didn’t defeat the Allies, but it kept the Germans hoping for a technological miracle until the very last days of the war. (© Ria Novosti/Science Photo Library.)
Science played an even larger role in World War Two. By late 1944 Germany was losing the war and defeat was imminent. A year earlier, the Germans’ allies, the Italians, had toppled Mussolini and surrendered to the Allies. But Germany kept fighting on, even though the British, American and Soviet armies were closing in. One reason German soldiers and civilians thought not all was lost was that they believed German scientists were about to turn the tide with so-called miracle weapons such as the V-2 rocket and jet-powered aircraft.
While the Germans were working on rockets and jets, the American Manhattan Project successfully developed atomic bombs. By the time the bomb was ready, in early August 1945, Germany had already surrendered, but Japan was fighting on. American forces were poised to invade its home islands. The Japanese vowed to resist the invasion and fight to the death, and there was every reason to believe that it was no idle threat. American generals told President Harry S. Truman that an invasion of Japan would cost the lives of a million American soldiers and would extend the war well into 1946. Truman decided to use the new bomb. Two weeks and two atom bombs later, Japan surrendered unconditionally and the war was over.
But science is not just about offensive weapons. It plays a major role in our defences as well. Today many Americans believe that the solution to terrorism is technological rather than political. Just give millions more to the nanotechnology industry, they believe, and the United States could send bionic spy-flies into every Afghan cave, Yemenite redoubt and North African encampment. Once that’s done, Osama bin Laden’s heirs will not be able to make a cup of coffee without a CIA spy-fly passing this vital information back to headquarters in Langley. Allocate millions more to brain research, and every airport could be equipped with ultra-sophisticated fMRI scanners that could immediately recognise angry and hateful thoughts in people’s brains. Will it really work? Who knows. Is it wise to develop bionic flies and thought-reading scanners? Not necessarily. Be that as it may, as you read these lines, the US Department of Defense is transferring millions of dollars to nanotechnology and brain laboratories for work on these and other such ideas.
This obsession with military technology – from tanks to atom bombs to spy-flies – is a surprisingly recent phenomenon. Up until the nineteenth century, the vast majority of military revolutions were the product of organisational rather than technological changes. When alien civilisations met for the first time, technological gaps sometimes played an important role. But even in such cases, few thought of deliberately creating or enlarging such gaps. Most empires did not rise thanks to technological wizardry, and their rulers did not give much thought to technological improvement. The Arabs did not defeat the Sassanid Empire thanks to superior bows or swords, the Seljuks had no technological advantage over the Byzantines, and the Mongols did not conquer China with the help of some ingenious new weapon. In fact, in all these cases the vanquished enjoyed superior military and civilian technology.
The Roman army is a particularly good example. It was the best army of its day, yet technologically speaking, Rome had no edge over Carthage, Macedonia or the Seleucid Empire. Its advantage rested on efficient organisation, iron discipline and huge manpower reserves. The Roman army never set up a research and development department, and its weapons remained more or less the same for centuries on end. If the legions of Scipio Aemilianus – the general who levelled Carthage and defeated the Numantians in the second century BC – had suddenly popped up 500 years later in the age of Constantine the Great, Scipio would have had a fair chance of beating Constantine. Now imagine what would happen to a general from a few centuries back – say Napoleon – if he led his troops against a modern armoured brigade. Napoleon was a brilliant tactician, and his men were crack professionals, but their skills would be useless in the face of modern weaponry.
As in Rome, so also in ancient China: most generals and philosophers did not think it their duty to develop new weapons. The most important military invention in the history of China was gunpowder. Yet to the best of our knowledge, gunpowder was invented accidentally, by Daoist alchemists searching for the elixir of life. Gunpowder’s subsequent career is even more telling. One might have thought that the Daoist alchemists would have made China master of the world. In fact, the Chinese used the new compound mainly for firecrackers. Even as the Song Empire collapsed in the face of a Mongol invasion, no emperor set up a medieval Manhattan Project to save the empire by inventing a doomsday weapon. Only in the fifteenth century – about 600 years after the invention of gunpowder – did cannons become a decisive factor on Afro-Asian battlefields. Why did it take so long for the deadly potential of this substance to be put to military use? Because it appeared at a time when neither kings, scholars, nor merchants thought that new military technology could save them or make them rich.