The Blind Giant: Being Human in a Digital World
Except that, in a way, it’s not an invasion at all. The hearth, once a very simple, solid thing with discrete boundaries, has been extended into the world. The telephone allows us to reach out; the television allows us to see out; the computer allows us to search, to send messages and so on. We have positioned these things within the compass of the private space, and extended its reach. At the same time, we have made it somewhat porous. We have extended our personal space into the digital, storing images and parts of our history, interacting online. We make common digital spaces with family members overseas, with friends in other locations. We even shop from our living rooms, allowing a limited amount of the commercial world to enter our homes. We have extended the hearth to meet other hearths, and to connect with the aspects of government, media and commerce which are designed to face the private space.
The benefits of the extension are profound; the openness they require is, if not a vacuum, at least an area of low pressure, into which capitalism and administration have naturally flowed. Much of what digital technology does, good and bad, is achieved by a kind of blurring of the lines. We have blurred the boundaries of our most important spaces, and done so deliberately if not knowingly. We now have to learn to control the tide, to push back against the inward pressure. The boundary between the hearth world of relaxing baths, partners and children, and dogs, and the world of work has become blurred. Distraction shatters focus, and, once gone, it’s hard to re-create the fragile entity that is a mood of peace and tranquillity; but even more, to reconstruct the sense of private place and safety. Before ever there was Myspace – the first of the big social networks, which aptly recognized the importance of a bounded personal area online – there was the hearth space, and physical or not, outmoded or not, we need it.
In response to the external pressure, some people simply shut down or refuse to engage. A lawyer who worked in my wife’s office when she was starting out refused to hold any meetings by phone at all, and did everything either in person or by letter. These days I suspect that is simply impossible, unless you are sufficiently powerful in your own arena that you are able to define the rules, but how you would get to that point these days without using digital communications I’m not sure. All the same, many people of my parents’ generation own mobile phones but carry them switched off, getting them out only to make calls and immediately putting them back to sleep as soon as they have finished. My father in particular has a tendency to do this, often leaving me messages to which I therefore cannot hope to reply. I’ve likened this to the old children’s game of banging on someone’s door and then running away, but he remains Luddishly and somewhat joyfully unrepentant.
The interesting aspect of this solution is that while it protects the hearth space by blocking the channel, it doesn’t seem to help with the feeling of pressure: in the case of people I know who have adopted this or more extreme strategies (one woman threw her mobile phone into the Thames), the results are the opposite of what is intended. The phone, inactive, becomes an accusation or a harbinger of doom. ‘What if someone’s really trying to reach me in an emergency?’ I turn my mobile off overnight, on the basis that anyone who really needs me at 4 a.m. has my home number. The landline phone is next to the bed. As in most things, I’m midway between the Always-On generation (young Americans overwhelmingly go to bed with a mobile turned on and close to hand or even under their pillows, which according to a 2008 study might have a negative effect on sleep, leading ultimately to ADHD-like symptoms, although a larger experiment a year earlier found ‘no support for the notion that the aversive symptoms attributed to mobile phone signals by hypersensitive individuals are caused by exposure to such signals’)1 and the previous one.
But the spectacle of people on buses, in the street, with their children, alone on park benches, bending over their handsets and tapping at the keys – and the awareness that in one’s own pocket there’s a device very much the same which is even now bringing in messages from friends, colleagues, bosses and commercial entities, 24/7, every week of the year – can create a feeling of oppression. The intrusion of work into the hearth space, of responsibility into the place where professional responsibility is supposed to be shelved in favour of home life, is particularly hard to handle. It’s worth noting that the idea that work messages must be answered immediately has more to do with the nature of our relationship with capitalism than it does with our understanding of technology, but, in a sense, that’s a dodge: capitalist enterprise will expand to fill the available space, and in any case, the technology and the culture have grown up together. And on the subject of noise, the nay-sayers certainly have a point. Fifty million tweets and 200 billion (yes, billion) emails are sent every day. We are generating more communication now every few days than we did from the dawn of human history until 2003. And much of it – as science fiction novelist Theodore Sturgeon could have told us – is crap.
Somehow, we need to come to terms with the influx; and switching off doesn’t appear to be the answer. Refusing to connect is like refusing to open your post: it doesn’t solve the problem, it just leaves you ignorant of what’s happening, and gradually the letters pile up on the mat.
Alongside the sense of intrusion is a gnawing fear that the modern world quite simply contains too much we ought to know, or need to know. It’s not obviously a digital issue; rather, it’s a consequence of the ethos of factual inquiry which comes from the scientific and technological current in our society. Issues of how we feel are not clear-cut or always entirely logical, but it seems to me that the blame for this aspect of information overload is cast on technology for its place as a part of the scientific family.
And inquiry certainly does yield complexity, because we inhabit a world which is complex. In academic study and practical research we have pushed back the boundaries of ignorance as the Enlightenment promised that we would. In consequence there is so much more to learn in every sphere of life that we either become hyper-specialized or, in choosing a broader spectrum of knowledge, accept that we cannot know everything which is to be known about our subjects. When I was a child, I was told on a museum visit that Sir Isaac Newton was the last man to know the entire field of mathematics as it stood in his time. After Newton, the story went, it was simply impossible to absorb it all. Since then I’ve heard the same thing proposed about Carl Friedrich Gauss, Gottfried Leibniz, and a half-dozen others. It doesn’t really matter which of them – if any – genuinely deserves the title. The point is that no one now can claim it, or anything like it. In 1957 Colin Cherry wrote in On Human Communication:
Up to the last years of the eighteenth century our greatest mentors were able not only to compass the whole science of their day, perhaps together with mastery of several languages, but to absorb a broad culture as well. But as the fruits of scientific labor have increasingly been applied to our material betterment, fields of specialized interest have come to be cultivated, and the activities of an ever-increasing body of scientific workers have diverged. Today we are most of us content to carry out an intense cultivation of our own little scientific garden (to continue the metaphor), deriving occasional pleasure from chat with our neighbors over the fence, while with them we discuss, criticize, and exhibit our produce.
If it was true then that knowledge had outstripped our capacity to retain and process it, it’s vastly more so now. Universities complain that they cannot bridge in three or four years the gap between the end of the school syllabus and the place where new work is being done, either in the commercial sector or in Academe. The quantity of information and theory available is boggling, so that on any given topic there may be multiple schools of conflicting thought, each of them large enough to be a lifetime’s study by itself. The situation of any project with a broad scope is analogous to that of an artist painting the Alps: she tries to capture the scale of the peaks, the colour of the sky, the appalling drop to the valley floor, but has no hope of accurately rendering the village in the distance or the great swathe of landscape directly behind her back. Moreover, the picture will reproduce only the visual scene, not the scent, the sound, the taste of the air or the texture of the rock. The other senses can only be suggested.
The most egregious example of a glut of complex issues all bound to one another, though, is probably government – by which, inevitably, I also mean politics. Any claim by one party will be furiously rejected by another, and both claim and counter-claim will be couched in terms that are either incomprehensible on the face of it or ostensibly clear-cut but somehow freely interpretable. Worse yet – the final part of the information overload problem – no issue occurs in isolation. Issues which are themselves complex and require complex solutions are connected to others which appear to pull in the opposite direction: political programmes inevitably have to be paid for, creating what appears to be a budgetary zero-sum game in which a positive must be measured against a corresponding negative – the hope being that the consequences of the first will leverage the consequences of the second and we can all go up a level. More often they seem to drag one another down.
At the same time, some or all programmes will have unforeseen and unforeseeable consequences, good or bad. In Freakonomics, economist Steven D. Levitt and author Stephen J. Dubner trace the unexpected consequences of incentives and apparently unrelated social policies. The paths they follow are convoluted, but the lesson is that everything is connected – according to Freakonomics, the failure of the ‘urban superpredator’ to appear and make the streets of American cities unsafe in the 1990s can be traced not to programmes of education or tougher juvenile sentencing, but to the legalization of abortion in the 1970s – and while the connections are often unexpected they are powerful and close. The human world is not a loose-knit bundle of strands from which one can be plucked out, but a snarl of cross-connected threads woven together by centuries. Our social systems, after all, are not created by a design team but evolved to cope with changing conditions and forever struggling to catch up.
The only way through the maze might seem to be to go back to the source and try to build your comprehension from scratch, but that’s almost impossible; quite often you’d need years even to understand the questions, let alone acquire a full understanding of the opposing positions. And yet without that understanding, how can you decide whether you believe in – for example – proportional representation voting systems? The pros and cons of a flat tax, the national need for a nuclear deterrent, or membership of the European Union? The stakes are so high, and yet the answers seem to be utterly mired in complexity. The broadcast television news was bad enough, but now every social networking site includes feeds and miniature party political broadcasts, debates and opinions about issues local and global which seem to have a direct connection to our lives – indeed, they seem to propose our personal complicity in decisions of which we greatly disapprove. There’s an obligation upon us, surely, if the information is there, to inform ourselves about our moral liabilities and act.
And yet, you could spend a lifetime doing so. From fish to cocoa to cars to wood, everything has a narrative, and not all of those narratives are happy ones. That was fine while we could imagine ourselves isolated from ill-doings far away, but nowhere is far away any more. The chain of connection from our homes to the war zones of the Middle East and Africa is horribly short. Once more, the hearth is touched by things which belong in other spaces. The television news brought the Ethiopian famine of the 1980s to the living rooms of Britain. Images of starvation in another hearth space came home to the fireplaces and buttered crumpets of the UK, and the response was huge. But now it seems everywhere is broken, and it’s too much to take in, because the net of connection implicates the lifestyles of the industrialized nations in the suffering of others. The hearth itself, which is supposed to be a place of refuge from the world, seems to be purchased at the cost of pain in the world. Every decision – even what fruit to buy, what brand of tea, or whether to eat beef or chicken, what it means to buy from a given supermarket – is part of every other, and all of them seem to have disproportionate knock-on effects in unpredictable places. The simplest questions have acquired nuance, controversy and multiple interpretations. We are at sea.
And the ship has a hole in it. Writers such as Dan Ariely have shown us that we can’t even trust our own basic rationality. In Predictably Irrational, Ariely discusses how we make irrational choices about pricing in predictable ways: having seen an ‘anchor’ number, we judge everything against it – even if the anchor is nothing more than a vehicle registration. An absurdly high anchor will cause us to think of even a substantial lower price as acceptable and a normal one as a bargain (many restaurant menus these days have one super-expensive item because it makes the next price down look acceptable).
More generally, some of us desperately seek to block out facts that unsettle us – the same instinct which inspired my friend to lob her trusty Nokia off Westminster Bridge – hence the global and increasingly absurd market for climate change denial. Heralded as heroes are such curious characters as Australian geologist Ian Plimer, whose book Heaven and Earth relied for its ability to ‘debunk’ the idea of global warming on a theory that insists that the sun is largely made of iron. Michael Ashley, Professor of Astrophysics at New South Wales, lamented ‘the depth of scientific ignorance’ in Plimer’s book, ‘comparable to a biologist claiming that plants obtain energy from magnetism rather than photosynthesis’. And yet the appetite for such unlikely claims remains unaffected. Huge numbers of us are apparently anchored to an idea of the world the way we want it to be. Recognizing the truth is painful, so we don’t. Of course, if you’re sticking your fingers in your ears and shouting ‘lalalala’ about climate change, any number of topics – from ocean acidification to fish stocks to international development – will abruptly become part of your information overload noise problem.
Everything in our world is in doubt, bringing on a sort of lifestyle dissonance – the extended hearth encompasses uncomfortable and inconvenient truths. We are increasingly aware that the food we eat is bad for us; that the money we earn and spend feeds into and comes out of a banking system whose goal is not stewardship but lottery win success, and whose excesses can create and then abruptly annihilate enormous sums; that the planet itself is not a fixed point but a collection of vital systems we are woefully overtaxing; that our national wealth derives not (or not only) from virtue but also from a privileged post-colonial position which we have adeptly exploited, but which now brings us danger and violence; that our governments sell or facilitate the sale of guns, execution drugs and manacles to states whose actions we publicly oppose in exchange for the oil we need to continue the cycle.
Every action of our lives carries a tacit burden of complexity; and digital technology possesses the ability to bring it to the forefront: to report it live, to bring to our notice obscure but poignant crises, to connect us to matters far away and make the problems of people we do not know seem close. Objects can be tagged (virtually or physically), their narratives made explicit, the stories of those who created them in sweatshops can be hovering at your shoulder as you buy. Your neighbours are no longer the people who live next to you but whoever you talk to online; people who share your interests and dreams may as well be in Karachi as in Paris, London or New York. The evictions of the Occupy Wall Street protests have made global villains of heavy-handed police officers who a few years ago would barely have been remembered six months later by those they arrested. If you had been following a Pakistani IT contractor named Sohaib Athar on Twitter at around 9:30 pm GMT on 1st May 2011, you would have read his grouchy discontent about loud helicopters over his house. Athar didn’t know it at the time, but he was tweeting live coverage of the US raid on Osama bin Laden’s Abbottabad compound.2 What would have been, as recently as 2005, a military action in a foreign land, became something happening just down someone’s street.
In a digital world, nothing is simple any more. The world is slippery and hard to pin down. Everything is out of control.
This feeling that we’re not in the driving seat of our own lives – of discomfort with a world that is rushing past us, with the pace of change and with the impossibility of affecting its course – is not new. Anthony Giddens wrote in The Consequences of Modernity in 1990 – safely before the digital avalanche began – that many of us have a sense ‘of being caught up in a universe of events we do not fully understand, and which seems in large part outside of our control’. A decade earlier, American doctor Larry Dossey coined the term ‘time-sickness’ to describe the effects of the increasingly rapid pace of life in the 1980s. Dossey’s work isn’t viewed with unalloyed approval by the medical community, but that doesn’t change the fact that he identified a perception that ‘time is getting away, that there isn’t enough of it, and that you must pedal faster and faster to keep up’ which resonated – and continues to resonate – with a lot of people.
Ten years before Dossey, Alvin Toffler wrote about it as ‘future shock’, which he defined as ‘the shattering stress and disorientation that we induce in individuals by subjecting them to too much change in too short a time’. This phenomenon has existed as long as I’ve been alive, and I suspect you can find it in every decade. I remember, when I was at school, seeing a tablet from an ancient civilization (I think it was Sumer) lamenting that things were better in the old days when children really respected their parents. Perhaps it’s inevitable, an artefact not of changes in society but in the life of commentators: we start out young and rich in leisure time, then form stable relationships and get proper jobs, then have children, and gradually have less and less time devoted to ourselves and more to external – if beloved – things.
Of course, if events are increasingly complex, or if their complexity is increasingly apparent to us through better access to information, then understanding them becomes increasingly time-consuming and difficult – and so, too, does gaining control of them.