Two thousand years ago, the Roman philosopher Seneca used a charming metaphor to describe the way memory shapes intellect. “We should imitate bees,” he wrote; “we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state.” As his metaphor makes clear, Seneca viewed memory not as a mere container but as a crucible. Memory was more than the sum of things remembered. It was something newly made, the essence, even, of a singular self.
When we talk of memory today, we tend to use a much less flowery metaphor. We don’t talk of bees and nectars. We talk of databases and search engines. Indeed, as we’ve grown more dependent on the vast stores of data inside our computers and out on the Internet, we’ve begun to blur the distinction between computer memory and biological memory. We’ve come to view the Web as an “outboard brain,” to borrow a phrase from the writer Clive Thompson, which not only extends the scope of our personal memory but can actually replace it. Why rely on the imperfect memory inside our heads when we can Google the precise bit of information we need at the precise moment we need it?
In embracing this new metaphor, we may be changing more than just the way we talk; we may be changing the way we behave. Last year, Science published an intriguing paper about the Internet’s influence on human thought. Called “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips,” the paper reported on the results of a study by a group of research psychologists, led by Betsy Sparrow of Columbia University, that sought to answer a critical question: Does our awareness of our ability to use search engines to find information alter the way our brains form memories? The answer, they discovered, is yes: “when people expect to have future access to information, they have lower rates of recall of the information.” The findings suggest “that processes of human memory are adapting to the advent of new computing and communication technology.”
In one of the experiments, people read forty statements of obscure facts (e.g., “an ostrich’s eye is bigger than its brain”) and then typed them into a computer. Half the participants were told the computer would save what they typed, and half were told that it wouldn’t. Afterwards, they were asked to write down as many of the statements as they could remember. The experiment revealed that people who believed the statements would be stored in the computer had a weaker memory of the information than those who assumed that the statements would not be stored. The researchers observed that the participants “apparently did not make the effort to remember when they thought they could later look up the trivia statements they had read.” And they offered a broader conclusion: “Since search engines are continually available to us, we may often be in a state of not feeling we need to encode the information internally. When we need it, we will look it up.”
We humans have, of course, always had external, or “transactive,” information stores to supplement our biological memory. These stores can reside in the brains of other people we know (if your friend Julie is an expert on gardening, then you know you can use her knowledge of plant facts to supplement your own memory) or in media technologies such as maps and books and microfilm. But we’ve never had an “external memory” so capacious, so available, and so easily searched as the Web. If, as this study suggests, the way we form (or fail to form) memories is deeply influenced by the mere existence of outside information stores, then we may be entering an era in history in which we will store fewer and fewer memories inside our own brains.
The loss of internal memory wouldn’t much matter if, as our new metaphor suggests, a fact stored externally were the same as a memory of that fact stored in our mind. But the metaphor is flawed. A computer database and biological memory are not the same thing. When we form, or, as brain scientists say, “consolidate,” a personal memory, we also form associations between that memory and the myriad other memories — of facts, experiences, emotions — contained in our minds. These intricate connections, unique to ourselves, are indispensable to the development of deep, conceptual knowledge. The associations, moreover, continue to change with time, as we learn more and experience more. Biological memory is anything but a database. Its richness lies in its contingency.
Seneca’s seemingly quaint metaphor for memory, with its emphasis on the organic, indeterminate process of “mingling,” is, it turns out, remarkably apt. In fact, it seems to be more fitting than our new, fashionably high-tech metaphor, which equates memory with the precisely defined bits of digital data stored in computers. The essence of personal memory is not the discrete facts or experiences we store in our mind but the endless mingling of those facts and experiences. What is the self but the unique pattern that arises from that mingling?
The Internet is a wonderful supplement to memory. But if we see it, and use it, as a substitute for memory, we risk losing or at least diminishing something very important about ourselves. Metaphors can be misleading as well as enlightening.
Nicholas Carr is the author of The Shallows: How the Internet is Changing the Way We Think, Read and Remember. This article draws on material from that book as well as from his blog, Rough Type.