Thu Jan 2 2014 3:00pm

Resistance is Futile: Scientific American Explores How the Internet is Changing Your Brain

Until recently, humans have relied on each other to distribute and share memory, in a world where the human brain was the pinnacle of data storage. But the Internet has radically and rapidly changed our relationship with this transactive memory system. In the December issue of Scientific American, Daniel M. Wegner and Adrian F. Ward explore the phenomenon in “How Google is Changing Your Brain.”

“Human! We used to be exactly like them. Flawed. Weak. Organic. But we evolved to include the synthetic. Now we use both to attain perfection. Your goal should be the same as ours.”

–Borg Queen, Star Trek: First Contact

For those of us who recall the shadowy time before the rise of the Internet and Google: if you had a question, you were promptly sent to the dictionary, encyclopedia, or library (uphill, in the snow, both ways) to try to find the answer. Today, a question barely has time to cross our minds before we are tapping away on our phones or computers to Google the answer. When a proper noun becomes a verb, you know something big has happened.

Though many facets of human life and industry have changed as a result of the Internet, one of the areas that may feel the deepest long-term impact is human memory. Pre-Internet, humans relied on each other for a wide range of information. By spreading out the responsibility of memory among individuals, the entire group benefited: “each member [had] access to knowledge both broader and deeper than could be obtained alone.” Members were responsible for different types of information, and they didn’t just know the information for which they were responsible; they also knew what information each of the other members of the group held.

Wegner and Ward describe the benefits of this distributed memory:

“This divvying up avoids needless duplication of effort and serves to expand the memory capacity of the group as a whole. When we off-load responsibility for specific types of information to others, we free up cognitive resources that otherwise would have been used to remember this information; in exchange, we use some of these resources to increase our depth of knowledge in the areas for which we are responsible. When group members share responsibility for information, each member has access to knowledge both broader and deeper than could be obtained alone.”

It used to be that this distribution happened only human-to-human; then books and other records were integrated into the memory system as conduits. But print research was laborious and time-intensive, especially as the information sought became more esoteric. The Internet, particularly its databases, like Wikipedia, and search engines, such as Google, has revolutionized information, both in accessibility and in speed. Wegner and Ward set out to measure some of the ways this has changed how humans recall information.

In one study, Wegner and Ward asked subjects to type 40 “memorable factoids” into a computer. Half of the participants were told the computer would save the facts; the other half were told the facts would be deleted at the end of the experiment. Half of each group were also specifically asked to remember the information at the start of the experiment. Wegner and Ward found that the participants who were told the computer would save the information were much worse at remembering it; even those in that group who were specifically asked to remember the information performed poorly on the memory test. “People seemed to treat the computer like the transactive memory partners...off-loading information to this cloud mind rather than storing it internally.”

It’s not just the reliance on cloud and/or computer storage that’s changing how we remember: “the immediacy with which a search result pops onto the screen of a smartphone may start to blur the boundaries between our personal memories and the vast digital troves distributed across the Internet.”

Wegner and Ward tested this and found that access to the Internet increases cognitive self-esteem. Essentially, using the Internet to find answers made people feel smarter, even when they were answering incorrectly. According to Wegner and Ward, this is not an unusual experience: “the Internet is taking the place not just of other people as external sources of memory but also of our own cognitive faculties…The advent of the 'information age' seems to have created a generation of people who feel they know more than ever before—when their reliance on the Internet means that they may know ever less about the world around them.”

Of course, there are risks inherent in dependence on a digital memory system—power and server outages, digital espionage and warfare, and—especially in the case of Wikipedia—human error. This is not to imply that our old human-to-human system was perfect either—if caveman Bob found himself on the wrong end of a mammoth tusk, it is unlikely anyone thought to back him up before leaving the campfire that day. Not to mention that human memory can be highly subjective and therefore prone to error.

We’re only beginning to understand the rise of the Internet’s role in human memory. How will it affect early learning and education? As technology shoulders more and more of our memory load, will schools continue to teach to tests, even as rote memorization becomes less meaningful? Perhaps teaching online research techniques, critical thinking, and independent problem solving will become more important—giving children the skills to effectively mine and evaluate the wealth of information at their fingertips. Wegner and Ward also wonder what effects this will have on our social structure. Since distributed memory also served as a way of binding a group, will reliance on digital memory weaken human ties to each other?

Even though, at the moment, it seems as if humans are mostly using this bonus of more information with less personal responsibility for remembering it to doge away the afternoons (much waste. wow.), Wegner and Ward theorize that this off-loading of human memory will eventually free up cognitive capacity, which can be used to achieve loftier goals.

“[P]erhaps as we become parts of the ‘Intermind,’ we will also develop a new intelligence, one that is no longer anchored in the local memories that are housed only in our own brains. As we are freed from the necessity of remembering facts, we may be able as individuals to use our newly available mental resources for ambitious undertakings.”

Until then, we can continue to use the Internet as we always have—for settling arguments, cheating at bar trivia, and looking up that guy who was in that movie with that kid who used to date the girl from Misfits. Whatshisface. You know who I mean…


“I am the beginning. The end. The one who is many. I am the Borg.”

–Borg Queen, Star Trek: First Contact

6 comments
1. twb
The most unrealistic conception of techno-hive minds is, I think, the idea that it would create some kind of single-minded voracious super-organism instead of, say, a cascade of tweets, GIF memes, and listicles.

WE ARE BORG. RESISTANCE IS FUTILE. YOU'LL NEVER BELIEVE WHAT HAPPENS NEXT.
2. curgoth
twb: That is a great line! And my first urge was to go put it on twitter. Yup, we're doomed.
3. Robotech_Master
It's worth pointing out that practically anything you do that involves memory or learning "changes your brain" in some way. Learning to read changes your brain.
4. shellywb
@3 Yes, but what is important is realizing how things change your brain. Repetitive use of google (or any other thinking task) wears paths of sorts into your brain that you continue to follow and strengthen the older you get, and it gets harder to change them. It's what depression does for example, and recognizing this helps a lot of patients jump off the path and form new ones, so to speak. So it's important I think to understand that Google is doing something, and figure out how it's changing the way we process information.
5. Skillian
For me personally... I tend to read half an article then bookmark it for later. Then later never comes. So when the topic of it comes up in conversation, I have to pull a 'wait they worded it so much better' and then skim the rest before my friends think I'm an idiot. I never have to remember anything, just enough to find it again online. Information isn't processed, because it isn't even introduced. Why introduce yourself to it, when it is always readily available for when/if you need it? I think that is the subconscious mindset people are falling into.

I promise I read this whole article before commenting. I just thought I'd ramble what it made me think of. *nervous laugh*
6. leeski
Has the human brain reached its growth limit? Using search and the cloud could cause the brain to shrink due to disuse of areas related to memory and problem solving. Are we destined to become less intelligent as we rely on the ability to solve problems with a simple internet search from the ubiquitous, soon-to-be-omnipresent smartphone or similar device?
