
The Computer/Brain Partnership – Balance Required for Maximum Efficiency

There is increasing awareness that optimal intellectual function requires a balanced relationship between our electronic tools and our brains. Excessive use of, and over-dependence on, the Internet, e-mail, smartphones, and Web-enabled tablets can hinder rather than help progress toward a successful science career.

This is a complex and contentious issue. It was addressed by Nicholas Carr, a respected technology writer, in a 2008 article in The Atlantic titled “Is Google Making Us Stupid?” and expanded upon in his book, The Shallows: What the Internet Is Doing to Our Brains (W. W. Norton, 2010). Carr believes that “we are trading away the seriousness of sustained attention for the web’s frantic superficiality.” As a researcher, advisor, and student mentor, I see this happening in many struggling students and young scientists.

Our computers and electronic linkages offer well-established advantages in education and research, including:

  • Easy, fast access to and storage of a vast range of information, including the very latest.
  • The capacity to connect and communicate with old and new contacts and then to stay connected.

However, these important benefits can interfere with how we use our brains in a number of ways. We can become overloaded with information and lose focus and perspective. Constant interruption means less opportunity for deliberation.

The result is that these potentially valuable tools can make our brains less productive and creative, not more.

These challenges and their consequences are exemplified by two people I knew well during my years of training, a time when computers played a much smaller role in education and science. The first was an outstanding student in my medical school class who was blessed with a near-photographic memory. He graduated knowing more facts and data than anyone in the class and began one of the most prestigious and challenging internships in the country. Three months later, he dropped out of the program. His formidable factual knowledge could not translate into timely diagnoses and prescriptions for proper treatment.

A second friend, whom I met during my pathology fellowship, has a similarly effective memory — but she was able to integrate it with a cognitive thought process to become one of today’s most eminent pathologists, making new connections and associations among reports at meetings and in the literature and combining them with her own observations.

People use the Internet as a personal memory bank — the so-called “Google effect.” In a recent study reported online in Science (July 15, 2011: Vol. 333, no. 6040, p. 277), Betsy Sparrow noted that our brains are better at remembering where information is stored than at remembering the information itself.

For a struggling student, knowing where to find information is not enough. Competency in the classroom, in the laboratory, and on examinations requires a basic core of knowledge in one’s own memory. Here one must rely on good decision making (i.e., “discretion” or “judgment”) in concentrating on the best sources to study and in apportioning one’s time. The decisions to be made include:

  • Which lectures to attend in person or watch as video recordings on the Internet
  • How much time to spend reading notes and selected readings
  • How much additional effort to devote to articles in the literature

Ready access to such huge quantities of information has potentially hazardous effects. The pitfalls to be avoided include:

  • It is easy to become overwhelmed and intimidated by the search for information. Seek help or advice from a mentor or advisor to guide you.
  • Don’t become obsessive in searching for information, as this can become a distraction from solving the problem. It is important to know when you have enough information and can stop your search.
  • Resist the temptation to “cyberloaf” — to engage in recreational Internet surfing and personal messaging — which is very hard to avoid for most people.

The medical or graduate student must have self-discipline, good study habits, and a strong motivation to succeed in order to avoid these pitfalls.

The ubiquity of information technology in our lives has, famously, led to multitasking — that is, to doing several things at once (or, more accurately, in short, sequential bursts). Is this a good thing or a bad thing? This controversial topic is wisely and comprehensively discussed in the book Conquer CyberOverload: Get More Done, Boost Your Creativity, and Reduce Stress (CyberOutlook Press, 2009) by psychologist Joanne Cantor. Cantor offers a tongue-in-cheek definition of multitasking: “the art of distracting yourself from two things you’d rather not be doing by doing them simultaneously.”

Most studies indicate that most activities require conscious attention; the brain cannot fully focus when multitasking. When you multitask, it takes longer to finish what you’re doing, and you make more errors than when the activities are done sequentially. True, young adults can multitask more effectively than older adults because they have a larger working memory. But even young adults would do better if they focused on one task at a time.

The most disturbing side effect of computer overdependence is that it alienates us from what I consider to be higher cognitive processes — not just remembering but interpreting, evaluating, planning, and imagining. These are the key components of problem solving and creativity, and they’re best accomplished away from your computer. A New Yorker article by Jonah Lehrer (“The Eureka Hunt: Why do good ideas come to us when they do?” July 28, 2008, pp. 40-45) notes that loosening tight focus and letting the mind wander — not the same thing as having it pulled from one distraction to another — is essential for insight. And while Internet sites such as Facebook may be useful for maintaining connections, human relationships rarely thrive when mediated by technology.