An old proverb notes that a pessimist will say a glass is half-empty, whereas an optimist will say the glass is half-full. Another proverb observes that “every cloud has a silver lining,” which inspires wags to retort “Yup, and every silver lining has its cloud” or “all silver's destined to tarnish” or something similarly cynical.

Which are all ways of saying that modern life is full of trade-offs, with good and bad aspects to most things. Meanwhile, this whole “Internet/social media/instant worldwide communication for all” business is still brand-new by world-historical standards – as of 2014, the majority of people alive can personally remember life before the Internet – and there's still huge disagreement over whether, on the whole, it's a good thing or a bad thing.

The latest entry in the “maybe bad thing” category is discussed in this MediaPost blog entry titled “Social Media Makes Us Dumb, But Think We're Smart.” It summarizes a study that researchers at the University of Oregon published in a journal of the Royal Society. Super-short version: the more you rely on social connections for problem-solving, the more your own cognitive abilities suffer.

Or so the study results might indicate. Researchers divided 100 test subjects into five 20-member “social networks” with various levels of connectivity. The subjects were then asked to take some rather difficult “cognitive reflection tests.”

It turns out subjects scored much higher on the tests when they were allowed to ask their social-network connections for the answers – the more connected a subject was, the more likely they were to get the right answer. But after using those social connections to help them through the tests, the subjects tended to score more poorly once they had to take similar tests by themselves.

Brain not engaged

Here's how the researchers summarized their results:

“When people make false intuitive conclusions and are exposed to the analytic output of their peers, they recognize and adopt this correct output. But they fail to engage analytical reasoning in similar subsequent tasks. Thus, humans exhibit an ‘unreflective copying bias,’ which limits their social learning to the output, rather than the process, of their peers’ reasoning.”

Interesting. But set that aside for a moment, and check out this September 2013 article from Slate, which asked, “Are search engines and the Internet hurting human memory?” and answered, “Nope. It's much, much weirder than that.” (The “article” in question is actually an excerpt from Clive Thompson's book “Smarter Than You Think: How Technology Is Changing Our Minds for the Better.”)

Here's a stripped-down and somewhat oversimplified summary: the critics and worrywarts who fret, “Oh dear, people are starting to rely on looking up facts online rather than committing them to memory” are absolutely correct — so far as that goes.

Does it matter?

Yet it doesn't really matter, because supplementing our memories with whatever facts we find online is just an expanded, technological version of something people have done for as long as there have been people: rather than trying to store the sum total of human knowledge and ability in our own individual brains, we rely on our social networks (family, friends, neighbors, even civilization writ large) to share that burden with us.

If you are half of an “old married couple”—or know people who are—you've seen or participated in this yourself. Read this bit from Thompson's book and see if it doesn't sound familiar:

Harvard psychologist Daniel Wegner—and his colleagues Ralph Erber and Paula Raymond—first began to systematically explore “transactive memory” back in the ’80s. Wegner noticed that spouses often divide up memory tasks. The husband knows the in-laws' birthdays and where the spare light bulbs are kept; the wife knows the bank account numbers and how to program the TiVo. If you ask the husband for his bank account number, he'll shrug. If you ask the wife for her sister-in-law's birthday, she can never remember it. Together, they know a lot. Separately, less so. ...

The same thing occurs on a bigger scale with colleagues at work.

[Y]ou each begin to subconsciously delegate the task of remembering that stuff to the other, treating one’s partners like a notepad or encyclopedia, and they do the reverse. In many respects, Wegner noted, people are superior to notepads and encyclopedias, because we’re much quicker to query: Just yell a fuzzily phrased question across to the next cubicle (where do we keep the thing that we use for that thing?) and you’ll get an answer in seconds. We share the work of remembering, Wegner argued, because it makes us collectively smarter.

Of course, remembering and retrieving facts — whether by yourself or with others — isn't quite the same thing as applying knowledge, skill or intelligence to solve challenging cognitive puzzles. Yet they do seem to have one trait in common: “You do much better with others than you do by yourself.” That's the glass-half-full interpretation, anyway; you could also say, “I do much worse by myself than when I get help from others.”

