“Over the past few years,” Carr wrote, “I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going – so far as I can tell – but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.”
The title of the essay is misleading, because Carr’s target was not really the world’s leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.
Carr’s article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he’s an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking. Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it’s needed again? The web has become, in a way, a global prosthesis for our collective memory.
It’s easy to dismiss Carr’s concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, whose Socrates argued in the Phaedrus that the technology of writing would destroy the art of remembering.
But just because fears recur doesn’t mean that they aren’t valid. There’s no doubt that communications technologies shape and reshape society – just look at the impact that printing and the broadcast media have had on our world. The question that we couldn’t answer before now was whether these technologies could also reshape us. Carr argues that modern neuroscience, which has revealed the “plasticity” of the human brain, shows that our habitual practices can actually change our neuronal structures. The brains of illiterate people, for example, are structurally different from those of people who can read. So if the technology of printing – and its concomitant requirement to learn to read – could shape human brains, then surely it’s logical to assume that our addiction to networking technology will do something similar?
Not all neuroscientists agree with Carr and some psychologists are sceptical. Harvard’s Steven Pinker, for example, is openly dismissive. But many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but positively enthusiastic about them. When the Pew Research Center’s Internet & American Life Project asked its panel of more than 370 internet experts for their reaction, 81% agreed with the proposition that “people’s use of the internet has enhanced human intelligence”.
Others argue that the increasing complexity of our environment means that we need the net as “power steering for the mind”. We may be losing some of the capacity for contemplative concentration that was fostered by a print culture, they say, but we’re gaining new and essential ways of working. “The trouble isn’t that we have too much information at our fingertips,” says the futurologist Jamais Cascio, “but that our tools for managing it are still in their infancy. Worries about ‘information overload’ predate the rise of the web… and many of the technologies that Carr worries about were developed precisely to help us get some control over a flood of data and ideas. Google isn’t the problem – it’s the beginning of a solution.”
Sarah Churchwell, academic and critic
Is the internet changing our brains? It seems unlikely to me, but I’ll leave that question to evolutionary biologists. As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn’t the same thing as changing our brains. The brain is like any other muscle – if you don’t stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.
In one sense, the internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantages, but used properly the free weights work you harder. Weight machines are directive and enabling: they encourage you to think you’ve worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it’s at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass that of the most committed environmentalist.
In 10 years, I’ve seen students’ thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.
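Churchwell’s point about popularity is, in outline, how Google’s original PageRank algorithm worked: a page scores highly when many pages link to it, weighted by those pages’ own scores – a measure of popularity, not of accuracy. Here is a minimal sketch in Python; the toy link graph and page names are invented for illustration, and real Google ranking combines many more signals:

```python
# links[page] lists the pages that page links to (a made-up toy web)
links = {
    "blog-a": ["popular-summary"],
    "blog-b": ["popular-summary"],
    "forum": ["popular-summary"],
    "news": ["popular-summary", "rigorous-paper"],
    "popular-summary": [],
    "rigorous-paper": [],  # well sourced, rarely linked
}

# Invert the graph: which pages link TO each page
incoming = {p: [q for q, outs in links.items() if p in outs] for p in links}

damping = 0.85
rank = {p: 1 / len(links) for p in links}  # start with a uniform score

for _ in range(50):  # power iteration; dangling-page mass is ignored for simplicity
    rank = {
        p: (1 - damping) / len(links)
           + damping * sum(rank[q] / len(links[q]) for q in incoming[p])
        for p in links
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:16s} {score:.3f}")
```

On this toy graph the heavily linked summary outranks the rarely cited paper, whatever their relative accuracy – which is precisely the distinction between popularity and wisdom that Churchwell draws.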
But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as these institutions increasingly digitise their holdings. We’ve all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.
In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn’t discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet’s power for good, or for evil? The jury is very much out. The internet itself is disinterested; what we use it for is not.
Sarah Churchwell is a senior lecturer in American literature and culture at the University of East Anglia
Naomi Alderman, novelist
If I were a cow, nothing much would change my brain. I might learn new locations for feeding, but I wouldn’t be able to read an essay and decide to change the way I lived my life. But I’m not a cow, I’m a person, and therefore pretty much everything I come into contact with can change my brain.
It’s both a strength and a weakness. We can choose to seek out brilliant thinking and be challenged and inspired by it. Or we can find our energy sapped by an evening with a “poor me” friend, or become faintly disgusted by our own thinking if we’ve read too many romance novels in one go. As our bodies are shaped by the food we eat, our brains are shaped by what we put into them.
So of course the internet is changing our brains. How could it not? It’s not surprising that we’re now more accustomed to reading short-form pieces, to accepting a Wikipedia summary, rather than reading a whole book. The claim that we’re now thinking less well is much more suspect. If we’ve lost something by not reading 10 books on one subject, we’ve probably gained as much by being able to link together ideas easily from 10 different disciplines.
But since we’re not going to dismantle the world wide web any time soon, the more important question is: how should we respond? I suspect the answer is as simple as making time for reading. No single medium will ever give our brains all possible forms of nourishment. We may be dazzled by the flashing lights of the web, but we can still just step away. Read a book. Sink into the world of a single person’s concentrated thoughts.
Time was when we didn’t need to be reminded to read. Well, time was when we didn’t need to be encouraged to cook. That time’s gone. None the less, cook. And read. We can decide to change our own brains – that’s the most astonishing thing of all.
Ed Bullmore, psychiatrist
Whether or not the internet has made a difference to how we use our brains, it has certainly begun to make a difference to how we think about our brains. The internet is a vast and complex network of interconnected computers, hosting an equally complex network – the web – of images, documents and data. The rapid growth of this huge, manmade, information-processing system has been a major factor stimulating scientists to take a fresh look at the organisation of biological information-processing systems like the brain.
It turns out that the human brain and the internet have quite a lot in common. They are both highly non-random networks with a “small world” architecture, meaning that there is both dense clustering of connections between neighbouring nodes and enough long-range short cuts to facilitate communication between distant nodes. Both the internet and the brain have a wiring diagram dominated by a relatively few, very highly connected nodes or hubs; and both can be subdivided into a number of functionally specialised families or modules of nodes. Given the obvious differences between the internet and the brain, it may seem remarkable that they share so many high-level design features. Why should this be?
One possibility is that the brain and the internet have evolved to satisfy the same general fitness criteria. They may both have been selected for high efficiency of information transfer, economical wiring cost, rapid adaptivity or evolvability of function and robustness to physical damage. Networks that grow or evolve to satisfy some or all of these conditions tend to end up looking the same.
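Bullmore’s comparison can be made concrete. Below is a minimal sketch, assuming Python with the networkx library, that contrasts a Watts-Strogatz graph (a ring lattice with a few edges rewired into long-range short cuts) against a random graph with the same number of edges. The Watts-Strogatz model is a standard stand-in for the general small-world idea he describes, not a model of his actual brain-network measurements:

```python
import networkx as nx

n, k, p = 1000, 10, 0.1  # nodes, neighbours per node, rewiring probability

# Ring lattice with a fraction p of edges rewired into long-range short cuts
small_world = nx.connected_watts_strogatz_graph(n, k, p)

# Random benchmark with the same number of nodes and edges
random_net = nx.gnm_random_graph(n, small_world.number_of_edges())

for name, g in [("small-world", small_world), ("random", random_net)]:
    if not nx.is_connected(g):
        # path length is undefined on disconnected graphs; use the giant component
        g = g.subgraph(max(nx.connected_components(g), key=len))
    print(f"{name:12s} clustering = {nx.average_clustering(g):.3f}, "
          f"mean path length = {nx.average_shortest_path_length(g):.2f}")
```

Run with these parameters, the small-world graph should show clustering roughly an order of magnitude above the random benchmark while matching its short average path length – the combination of dense local connections and efficient long-range communication that, on Bullmore’s account, brain and internet share.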
Although there is much still to understand about the brain, the impact of the internet has helped us to learn new ways of measuring its organisation as a network. It has also begun to show us that the human brain probably does not represent some unique pinnacle of complexity, but may have more in common with many other information-processing networks than we might have guessed.
Ed Bullmore is professor of psychiatry at the University of Cambridge
Geoff Dyer, writer
Sometimes I think my ability to concentrate is being nibbled away by the internet; other times I think it’s being gulped down in huge, Jaws-shaped chunks. In those quaint days before the internet, once you made it to your desk there wasn’t much to distract you. You could sit there working or you could just sit there. Now you sit down and there’s a universe of possibilities – many of them obscurely relevant to the work you should be getting on with – to tempt you. To think that I can be sitting here, trying to write something about Ingmar Bergman and, a moment later, on the merest whim, can be watching a clip from a Swedish documentary about Don Cherry – that is a miracle (albeit one with a very potent side-effect, namely that it’s unlikely I’ll ever have the patience to sit through an entire Bergman film again).
Then there’s the outsourcing of memory. From the age of 16, I got into the habit of memorising passages of poetry and compiling detailed indexes in the backs of books of prose. So if there was a passage I couldn’t remember, I would spend hours going through my books, seeking it out. Now, in what TS Eliot, with great prescience, called “this twittering world”, I just google the key phrase of the half-remembered quote. Which is great, but it’s drained some of the purpose from my life.
Exactly the same thing has happened now that it’s possible to get hold of out-of-print books instantly on the web. That’s great too. But one of the side incentives to travel was the hope that, in a bookstore in Oregon, I might finally track down a book I’d been wanting for years. All of this searching and tracking down was immensely time-consuming – but only in the way that being alive is time-consuming.
Colin Blakemore, neurobiologist
It’s curious that some of the most vociferous critics of the internet – those who predict that it will produce generations of couch potatoes, with minds of mush – are the very sorts of people who are benefiting most from this wonderful, liberating, organic extension of the human mind. They are academics, scientists, scholars and writers, who fear that the extraordinary technology that they use every day is a danger to the unsophisticated.
They underestimate the capacity of the human mind – or rather the brain that makes the mind – to capture and capitalise on new ways of storing and transmitting information. When I was at school I learned by heart great swathes of poetry and chunks of the Bible, not to mention page after page of science textbooks. And I spent years at a desk learning how to do long division in pounds, shillings and pence. What a waste of my neurons, all clogged up with knowledge and rules that I can now obtain with the click of a mouse.
I have little doubt that the printing press changed the way that humans used their memories. It must have put out of business thousands of masters of oral history and storytelling. But our brains are remarkably adept at putting unused neurons and virgin synaptic connections to other uses. The basic genetic make-up of Homo sapiens has been essentially unchanged for a quarter of a million years. Yet 5,000 years ago humans discovered how to write and read; 3,000 years ago they discovered logic; 500 years ago, science. These revolutionary advances in the capacity of the human mind occurred without genetic change. They were products of the “plastic” potential of human brains to learn from their experience and reinvent themselves.
At its best, the internet is no threat to our minds. It is another liberating extension of them, as significant as books, the abacus, the pocket calculator or the Sinclair ZX80.
Just as each of those leaps of technology could be (and was) put to bad use, so we should be concerned about the potentially addictive, corrupting and radicalising influence of the internet. But let’s not burn our PCs or stomp on our iPads. Let’s not throw away the liberating baby with the bathwater of censorship.
Colin Blakemore is professor of neuroscience at the University of Oxford
Ian Goodyer, psychiatrist
The key contextual point here is that the brain is a social organ and is responsive to the environment. All environments are processed by the brain, whether it’s the internet or the weather – it doesn’t matter. Do these environments change the brain? Well, they could and probably do in evolutionary time.
The internet is just one of a whole range of environmental influences that could change the brain, and it would do so by altering the speed of learning. But the evidence that the internet has a deleterious effect on the brain is zero. In fact, looking at the way human beings learn in general, you would probably argue the opposite. If anything, the opportunity to draw on multiple sources of information provides a very efficient way of learning, certainly as successful as learning through other means.
It is being argued that the information coming into the brain from the internet is the wrong kind of information. It’s too short, it doesn’t have enough depth, so there is a qualitative loss. It’s an interesting point, but the only way you could argue it is to say that people are misusing the internet. It’s a bit like handing a car to someone who has never seen one before and saying: “Why don’t you take it for a drive and find out?” If you seek information on the internet like that, there’s a good chance you’ll have a crash. But that’s because your experience has yet to teach you what a car is. I don’t think you can argue that those latent processes are going to produce brain pathology.
I think the internet is a fantastic tool and one of the great wonders of the world, if not the greatest. Homo sapiens must just learn to use it properly.
Ian Goodyer is professor of psychiatry at the University of Cambridge
Maryanne Wolf, cognitive neuroscientist
I am an apologist for the reading brain. It represents a miracle that springs from the brain’s unique capacity to rearrange itself to learn something new. No one, however, knows what this reading brain will look like in one more generation.
No one today fully knows what is happening in the brains of children as they learn to read while immersed in digitally dominated media for a minimum of six to seven hours a day (Kaiser report, 2010). The present reading brain’s circuitry is a masterpiece of connections linking the most basic perceptual areas to the most complex linguistic and cognitive functions, like critical analysis, inference and novel thought (ie, “deep reading processes”). But this brain is only one variation of the many that are possible. Therein lies the cerebral beauty and the cerebral rub of plasticity.
Understanding the design principles of the plastic reading brain highlights the dilemma we face with our children. It begins with the simple fact that we human beings were never born to read. Depending on several factors, the brain rearranges critical areas in vision, language and cognition in order to read. Which circuit parts are used depends on factors like the writing system (eg English v Chinese); the formation (eg how well the child is taught); and the medium (eg a sign, a book, the internet). For example, the Chinese reading brain requires more cortical areas involved in visual memory than the English reader’s, because of its thousands of characters. In its formation, the circuit utilises fairly basic processes to decode and, with time and cognitive effort, learns to incorporate “deep reading processes” into the expert reading circuit.
The problem is that, because there is no single template for the reading brain, the present expert reading circuit never has to develop: nothing in our biology guarantees it. With far less effort, the reading brain can be “short-circuited” in its formation, with little time and attention (whether in milliseconds or years) given to the deep reading processes that contribute to the individual reader’s cognitive development.
The problem of a less potentiated reading brain becomes more urgent in the discussion about technology. The characteristics of each reading medium reinforce the use of some cognitive components and potentially reduce reliance on others. Whatever any medium favours (eg, slow, deep reading v rapid information-gathering) will influence how the reader’s circuit develops over time. In essence, we human beings are not just the product of what we read, but of how we read.
For me, the essential question has become: how well will we preserve the critical capacities of the present expert reading brain as we move to the digital reading brain of the next generation? Will the youngest members of our species develop their capacities for the deepest forms of thought while reading or will they become a culture of very different readers – with some children so inured to a surfeit of information that they have neither the time nor the motivation to go beyond superficial decoding? In our rapid transition into a digital culture, we need to figure out how to provide a full repertoire of cognitive skills that can be used across every medium by our children and, indeed, by ourselves.
Maryanne Wolf is the author of Proust and the Squid: The Story and Science of the Reading Brain, Icon Books, 2008
Bidisha, writer and critic
The internet is definitely affecting the way I think, for the worse. I fantasise about an entire month away from it, with no news headlines, email inboxes, idle googling or instant messages, the same way retirees contemplate a month in the Bahamas. The internet means that we can never get away from ourselves, our temptations and obsessions. There’s something depressing about knowing I can literally and metaphorically log on to the same homepage, wherever I am in the world.
My internet use and corresponding brain activity follow a distinct pattern of efficiency. There’s the early morning log-on, the quick and accurate scan of the day’s news, the brisk queries and scheduling, the exchange of scripts of articles or edited book extracts.
After all this good stuff, there’s what I call the comet trail: the subsequent hours-long, bitty, unsatisfying sessions of utter timewasting. I find myself looking up absolute nonsense only tangentially related to my work, fuelled by obsessions and whims and characterised by topic-hopping, bad spelling, squinting, forum lurking and comically wide-ranging search terms. I end up having created nothing myself, feeling isolated, twitchy and unable to sleep, with a headache and painful eyes, not having left the house once.
The internet enables you to look up anything you want and get it slightly wrong. It’s like a never-ending, trashy magazine sucking all time, space and logic into its bottomless maw. And, like all trashy magazines, it has its own tone, slang and lexicon. I was tempted to construct this piece in textspeak, Tweet abbreviations or increasingly abusive one-liners to demonstrate the level of wit the internet has facilitated – one that is frighteningly easy to mimic and perpetuate. What we need to counteract the slipshod syntax, off-putting abusiveness, unruly topic-roaming and frenetic, unreal “social networking” is good, old-fashioned discipline. We are the species with the genius to create something as wondrous as the internet in the first place. Surely we have enough self-control to stay away from Facebook.