When Nicholas Carr’s article “Is Google Making Us Stupid?” hit newsstands in the July/August 2008 issue of The Atlantic, the reaction was predictably vociferous.
The essay itself – a 4,175-word editorial monolith of the kind The Atlantic does so well – was a thoughtful exploration of the fear that heavy reliance upon the internet is detrimental to certain cognitive faculties, including (but not limited to) concentration, memory and the capacity for quiet reflection.
The article was immediately met with a barrage of responses, ranging from cautious endorsement to suggestions that Carr was espousing a “moronic” Luddism.
It has been four years since that particular tempest in a teacup, but these concerns still resonate.
If nothing else, there is certainly no shortage of evidence in favour of Carr’s observations that the internet is changing our relationship with information in some fairly profound ways.
In a study published in Science in 2011, US scientists claimed the internet has become a form of “external or transactive memory”, with information being stored outside ourselves.
In the face of this transition, the imperative to remember information itself has been replaced by the imperative to remember where that information can be found.
This is what is commonly known as “the Google effect”, and it sits comfortably alongside theories of extended cognition, such as the one advanced by philosophers Andy Clark and David J. Chalmers in their 1998 paper “The Extended Mind”.
Meanwhile, another study, conducted by UCLA Professor of Psychiatry Gary Small, showed that experienced internet users demonstrated greater activity in parts of the prefrontal cortex associated with problem-solving and decision-making than novice users did.
This difference was not apparent when the two groups were asked to read printed text.
In Small’s words, this is evidence that “the current explosion of digital technology not only is changing the way we live and communicate, but is rapidly and profoundly altering our brains.”
I, for my part, am not yet convinced this is a cause for alarm. No doubt the brain responded with similar plasticity to the development of material culture (about 2.6 million years ago), language (somewhere between 50,000 and 200,000 years ago) and writing (circa 4,000 BC).
The observation that our relationship with information changes along with our artefacts seems an unremarkable fact, and by no means a necessary cause for dismay.
That being said, it is my claim that the widespread use of these technologies is necessarily changing what it means to be a successful learner.
When I was but a wee lad, my mother used to regale me with horror stories about her classroom experiences in the early 1960s.
Times tables and chemical formulae were learned by rote, devoid of inferential connection and internal coherence. Information was divorced from praxis, each claim nacreous and glib, like a banal pearl.
Of course, we would like to think contemporary pedagogy has overcome those shortcomings – and indeed it has, to a large extent.