Evolution to Distraction
Technology has been changing our lives since prehistoric times. With every new invention and the social change that results, there comes a new wave of doomsayers to condemn it. The simple truth is that our world is constantly evolving—and so are we. In his essay “In Defense of Distraction,” writer Sam Anderson illustrates and analyzes the current so-called “crisis of attention” and argues that maybe this shift to distraction is not a crisis at all, but a move to a new normal (504). Given their effects on the brain, the growing prevalence of distraction and multitasking—which some perceive as purely negative impacts of technology—may actually represent an evolution toward a new method of paying attention.
As we “focus” on our cell phones, laptops, and social networking sites, it seems we are constantly switching among things like Facebook, Instagram, Twitter, Tumblr, YouTube, text messages, and that paper we need to write. According to Anderson, some have described the current “poverty of attention” as “chewing its hyperactive way through the very foundations of Western civilization” (504). That is to say, they believe that technological devices and the information age have ruined our traditional practices of learning, reading, studying, and focusing. As Anderson points out, they think we have become “mentally obese,” “bookless,” and “terminally distracted” (Anderson 504). Indeed, the abandonment of traditional learning methods is creating a widespread gap between the teaching methods of schools and the learning methods of their students. With a world of information at our fingertips, we no longer need to read or memorize facts; we can simply look them up on our smartphones in far less time than it would take to find the same information in traditional sources.
These critics’ concerns have merit; however, Anderson responds to their pessimism by arguing that “conservative social critics have been blowing the apocalyptic bugle” at every technology invented since writing (504). Indeed, every major technology-induced societal shift is strongly opposed before it becomes the new norm. An excellent example is electricity: in its early days, many thought electricity was dangerous because of its use in the electric chair, and tried to oppose its adoption into common households. This pattern of opposition to technology implies that the current “crisis of attention” is nothing more than another initial backlash against a societal change brought about by technology (Anderson 504).
However, even if this is the apocalypse for attention, there’s no stopping it now. As Anderson points out, it’s really too late to oppose the change—in essence, it’s already happened and it’s impossible to return to the past (504). So even if this turns out to be a true crisis, it’s something we’re going to have to adapt to, as we are far too dependent on our computers, tablets, and cell phones for legitimate uses—regardless of the tempting distractions they offer—to ever give them up.
In response to the widespread, excessive use of the internet, some of these bugle-blowers have even suggested classifying internet addiction, one of the major ways we are “terminally distracted,” as a mental disorder (Anderson 504). Anderson references an editorial that suggested adding internet addiction to the Diagnostic and Statistical Manual, which would make it “a disorder to be taken as seriously as schizophrenia” (509). To clarify, one possible meaning of this is that internet use is the most common form of distraction, and therefore it is the aspect of distraction we should focus on treating or addressing. Indeed, most of the time we waste being distracted is spent on the internet, on Facebook, YouTube, or other similarly distracting websites. By generalizing internet use to distraction as a whole, Anderson compares distraction itself to a mental disorder, and he further develops this idea when he mentions the use of drugs to help people focus and eliminate distraction—much as people today use drugs to eliminate hallucinations.
The implication that distraction is a mental disorder goes beyond addiction to the internet, however. Anderson points out that drugs used to treat ADD and ADHD are abused by college students and professionals alike to help them focus (509). Some scientists have even proposed making these “neuroenhancers” legal and acceptable (Anderson 509). This proposition reinforces Anderson’s comparison of internet addiction—and distraction—to a mental disorder. However, if distraction is a mental disorder, most of the younger generations have it—their brains are being rewired for a constant state of distraction, as I’ll explain later. Needless to say, it would be illogical to define something so widespread as a mental disorder.
There are vastly different viewpoints on this subject, but no one really knows how to approach distraction—as a problem or not, mental or otherwise. This implies a need for further research into how we focus and get distracted in this modern age. It’s possible we will redefine a constant state of distraction as a mental disorder, or, more logically, we might redefine our concepts of attention and distraction. Distraction is seemingly poised to become the new attention, given how addicted we all seem to be to our internet and app-ridden lifestyles. This will have intriguing implications for society, particularly schools and the workplace. First it’s important to examine what the change in attention entails, the possible reasons behind it, and just how widespread the “poverty of attention” is (Anderson 504).
The shift in attention patterns seems to center on multitasking. Anderson interviews David Meyer, an expert on and researcher of multitasking, who believes we are in a crisis of attention, and that “it’s going to get a lot worse than people expect” (505). Meyer’s research indicates that multitasking, contrary to popular belief, isn’t doing multiple things at once successfully; it’s actually switching between them very frequently and rapidly (Anderson 506). Therefore, our growing ability to multitask is an improved ability to switch between input channels—and there’s a clear reason why we’re getting better at this.
Technology requires us to be able to focus on multiple things at once. A clear example is the ability of your internet browser to open multiple tabs or windows. Anderson remarks that the “digital natives,” or kids who have grown up with computers, cell phones, and the internet, will have multitasking skills that older generations lack (513). Some examples Anderson provides include the ability to conduct multiple conversations simultaneously, or “[switch] between attentional targets in a way that’s been considered impossible” (513). In other words, these youth will have the ability to write an essay while listening to music, texting, instant messaging, trolling Facebook, and checking email—all at the same time.
Not only are we required to multitask, we are being biologically rewired for it. Matt Richtel’s article, “Growing Up Digital, Wired for Distraction,” which appeared in the New York Times in 2010, builds upon Anderson’s concept of the “digital natives” (513). Perhaps most intriguing are the research results he cites, which establish that teenagers’ brains, as they are still developing, are “more easily habituated than adult brains to constantly switching tasks” and, as a result, are “less able to sustain attention” (Richtel). Because teenagers are younger, their brains are more pliable, and therefore they adapt far more easily to new situations. This is why the information age has had such a pronounced effect on the younger generations, and less of one on the older.
Michael Rich, executive director of the Center on Media and Child Health in Boston, argues that teenagers’ brains are more rewarded at a biochemical level for switching between tasks than for focusing on a single one, and therefore that “we’re raising a generation of kids in front of screens whose brains are going to be wired differently” (Richtel). In essence, the youth who grow up in this era of multitasking will have brains better suited for switching between tasks than for focusing on a single one. Furthermore, Anderson quotes a neuroscientist who believes that “the human brain might be changing faster today than it has since the prehistoric discovery of tools,” which implies that technology and the growth of multitasking are actually causing our brains to change on a biological level (513).
Our brain structure is shaped by our environment. Jim Valkenburg, manager of the Tutoring and Learning Center for Delta College, in his essay “Attention, Reflection and Distraction: The Impact of Technology on Learning,” quotes the neuroscientist Michael Merzenich, who explains that our brains are constantly changing as a result of having to learn complex new skills—such as how to use technology (7). As a result, our brains “are vastly different, in fine detail, from the brains of our ancestors,” which implies that our descendants’ brains might be very different from ours (Valkenburg 7). Merzenich points out that “the human brain changes according to the environment in which it functions” as we form new connections between neurons and old connections wither away and are lost (Valkenburg 7). Therefore, since they will function in an even more highly technological environment than we do, the brains of our descendants—and perhaps even our own children—will be wired differently from ours and will be better suited to handle technology and its intrinsic demands.
In essence, Anderson, Richtel, and Valkenburg are all arguing that the generations growing up with modern technology are composed of world-class multitaskers, and because of their highly technological surroundings, their brains are biologically better suited to distraction—to the exclusion of traditional methods of focusing on a single task. It is not outrageous to say that our brains, over generations, are rewiring themselves for multitasking and a constant state of distraction, as this is the environment in which we, and our youth, now live. Since our brains themselves are changing to be better adapted to technology—in essence, they are evolving—this implies we are not experiencing a crisis at all. Rather, our attention spans are simply leaning more toward distraction as we evolve to be better suited to multitasking. Now that we are learning to think in parallel, handling multiple things at once, society is going to have to adapt, specifically in its school systems and workplaces.
Our new wiring for distraction does not fit well with existing modes of instruction. In his article, Richtel acknowledges how teenagers today are spending more and more time plugged into technology—whether via the cell phone, the internet, or video games—and outlines how students’ grades are falling because they can no longer focus enough to do their homework (Richtel). According to Richtel, modern devices, and the “constant stream of stimuli they offer, pose a profound new challenge to focusing and learning” (Richtel). In other words, Richtel believes that computers, cell phones, tablets, and other such devices offer a myriad of entertaining distractions for students that make it hard for them to focus on schoolwork, and as a result their academic achievement is declining. Similarly, Valkenburg addresses the issue of short-term memory, pointing out that “short-term storage is fragile: A break in our attention can sweep its contents from our mind,” which has vast implications for students trying to learn (6). New information can’t be solidified into long-term memory if a student is distracted immediately after learning it. These distractions can come in many forms: the cell phone buzzing with a new text message, a notification popping up on the screen from a new email, or someone tweeting a link to a website with a ten-hour video of a cat. Richtel argues that students are constantly distracted, and Valkenburg argues that constant distractions make learning impossible. Essentially, our newfound inability to focus on a single task is out of line with the traditional methods of teaching and learning in school systems, and this misalignment is causing serious problems across the nation. However, there is something we can do about it.
Since technology is actually changing our brains and there’s a biological basis to the students’ troubles, the best course of action for school systems is to find a way to adopt distraction in the classroom as a method of connecting to students. This isn’t to say that schools need to keep their students’ minds entertained—they simply need to find ways of connecting the material to other, related information so the students’ brains have more information to chew on at one time. The students are no longer able to focus consistently on one concept, so the schools need to cater to the new modes of thinking and focusing.
However, there are mixed feelings about schools adapting to meet the students’ needs. For example, the principal of Woodside High has pushed to implement technology in the classrooms to engage the students (Richtel). In contrast, Mr. Eaton, the Latin teacher at Woodside, says “It’s a catastrophe” and believes technology is the root of the students’ attention problems, and that schools exacerbate the problem when they adopt technology (Richtel). However, it is simply not a question anymore—students now think and operate differently on a biological level, and trying to teach them in sequence—one thing at a time—when they think in parallel—multiple things at a time—is simply not going to work. Students will become increasingly better able to switch between input channels and less able to focus on a single one—thus, teaching practices will have to shift to accommodate this new pattern of attention. As our distracted students grow up and become teachers, this should pose less of a problem, as the teachers will think in the same constantly distracted way their students do. But for now, the break between how students focus and how teachers teach is causing severe problems.
There are also concerns regarding technology-addicted youth entering the workplace. Going by Anderson’s and Richtel’s articles, it is likely we will see more and more acceptance of the use of technology at all times and in all places, even business meetings, as the brains of the new generations are going to be more closely dependent upon technology use. Similarly to the school systems, workplaces will need to change their policies to accommodate the new ways the younger generations think and focus, and therefore work.
A study of technology in the workplace, “Physically Present, Mentally Absent: Technology Use in Face-to-Face Meetings” by Lisa Kleinman, Ph.D., of the University of Texas, may shed some light on how today’s technology-minded students will find technology treated in the workplace. In her study, Kleinman discovered an interesting array of responses to technology use during business meetings; for example, a software engineer who had rules against the use of laptops in business meetings, and a mailman who allowed his employees to answer their phones during such meetings (2503).
There is a wide variety of opinions about technology and its use; some participants mentioned the need to be constantly accessible to others, to look up information, or to supplement the meetings with pictures and prototypes (Kleinman 2503). This goes to show how dependent we are upon our technology—we can use texting or instant messaging to be accessible even in otherwise inaccessible places, we have a wealth of information available at our fingertips, and we can engage the visual side of our brains using imaging technology to help us understand complex concepts. All of these might be considered distractions—but at the same time, they help us do our jobs better. For example, some managers were fine with interruptions as long as they related to work (Kleinman 2505). Being constantly accessible, although it might appear distracting, might actually be necessary to accomplish certain tasks or projects in the workplace that require group participation. Therefore, it seems technology will find an increasingly receptive audience in the workplace, especially as our dependence upon it continues to grow, and as our brains become more closely tied to its use.
The question we need to answer is whether the new wiring of our brains is a bad thing, or just a different thing. Either we’ll force students and workers to abandon technology and try to regain our ability to focus, or we’ll make the school systems and workplace adapt to the changing brains of their patrons. We can either accept distraction as the new mode of how our brains function, or we can label it a disorder and prescribe medications such as neuroenhancers to those who fall victim to their new nature. Technology is not going anywhere. Therefore, we are going to have to adapt to its changes, rather than try to cling to the old ways. As Anderson so aptly puts it, “the virtual horse has already left the digital barn” (504). Once again technology has caused our society to change, and we have to come to terms with how this will impact and change our lives.
Works Cited

Anderson, Sam. “In Defense of Distraction.” Writing Analytically with Readings. Ed. David Rosenwasser and Jill Stephen. 2nd ed. Boston: Wadsworth Cengage Learning, 2012. 503-513. Print.

Kleinman, Lisa. “Physically Present, Mentally Absent: Technology Use in Face-to-Face Meetings.” Association for Computing Machinery, 2007. 2501-2506. ACM Digital Library. PDF. 14 Sept. 2013.

Richtel, Matt. “Growing Up Digital, Wired for Distraction.” New York Times 21 Nov. 2010. Web. 14 Sept. 2013.

Valkenburg, Jim. “Attention, Reflection and Distraction: The Impact of Technology on Learning.” Association for the Tutoring Profession, n.d. PDF. 17 Sept. 2013.