
How is digital technology changing the way kids' brains learn?

Introduction:

The average American kid between 8 and 18 years old spends eight-and-a-half hours a day on a computer, listening to an iPod, watching TV, or otherwise engaged with some form of digital technology. To put that another way, over half of an American child's waking hours are spent plugged in. To YouTube. To Facebook. To their cell phones, you name it. And as kids get older, they spend even more time online.

The people who use the Internet the most are young people currently 15 to 25 years old -- the folks who grew up with it:

internetusebyage.gif

All that digital stimulation is actually changing the way kids' brains biologically evolve -- and that, in turn, is changing the way kids are developing social skills, acquiring language and learning to read.

While neuroscientists and behavioral psychologists don't know exactly what the implications of those changes will be, predictions are wide-ranging. Some say the effects will be minimal, as adult brains have shown they adapt (or retroactively evolve, to put it another way) to the demands of digital media. Others, like UCLA neuroscientist Gary Small, point to a chasm -- a "Brain Gap" -- between kids and adults, whose brains function, socialize and interact in different ways.

1. Modern Kids' Brains Are Actually Evolving Differently Than Other Generations' Brains.

Some research shows that modern kids' brains -- brains that have been steeped in digital media since infancy -- are actually evolving differently than other brains: “As young, malleable brains develop shortcuts to access information, these shortcuts represent new neural pathways being laid down...” - Neuroscientist Gary Small, “iBrain: Surviving the Technological Alteration of the Modern Mind”

ibrain.jpg

Here's the gist: when a child is born, he comes equipped with a brain that has already been hardwired by genetics, innate chemicals and his experience inside the womb. (Here's a neat video in which neuroscientists learn about that "hardwiring" in cats).

The kid then spends the next two years growing his brain: his neurons sprout dendrites and start sending and receiving signals with other parts of the brain at an alarming rate. By the time he’s 2 years old, the number of neural connections inside his brain essentially maxes out. He then spends the next decade pruning them down, depending on which ones he uses, how he uses them, and which ones he doesn’t use at all.

The process of “learning” in a young brain therefore doesn’t mean forming new neural pathways so much as it means figuring out which neural pathways are not needed, and then getting rid of them to make the brain more efficient. By the time we’re teenagers, we’ve whittled our synapses down to about 60 percent of what we had as toddlers.

2. How Are Modern Kids Socializing And Acquiring Language Differently Than Other Generations?

A 2002 Stanford University study indicated that for every hour we spend in front of our computers, we lose a half-hour of traditional face-to-face time – and that’s particularly important for young children who spend the first few years of their lives learning to identify and interpret subtle gestures, like a shoulder shrug or a slightly raised eyebrow.

Some studies indicate that kids are not forming important social skills they need -- that socialization in a digital world is simply not "compatible" with socialization in the old-fashioned, face-to-face world.

“The brains of the younger generation are digitally hardwired from toddlerhood, often at the expense of neural circuitry that controls one-on-one people skills,” writes neuroscientist Gary Small of the University of California, Los Angeles.

Social skills are also integrally tied up with how kids learn language. (Click to see a page about emoticons)

3. How Are Modern Kids Learning To Read?

At this point, the camps are divided. Some teachers, administrators and educational policymakers say that literacy is a fundamentally non-digital process. Language acquisition must be social and involve human-to-human interaction, they say.

That's the opinion of journalist Todd Oppenheimer, author of “The Flickering Mind: Saving Education from the False Promise of Technology.”


flickeringmind.jpg
Todd Oppenheimer researched the role of technology in classrooms from 1997 to 2003 and concluded that “the early and excessive concern about computer literacy [comes] too often at the cost of basic literacy." While Oppenheimer credits some computer programs for being able to reach children with unique learning abilities, he writes that learning from computers “actually damages students, turning out a generation of kids with inferior learning and thinking skills.” Here's an interesting review of Oppenheimer's work.

Other teachers, administrators and policymakers insist that's a Luddite's approach. Why not use interactive computer games to teach literacy? At the very least, packaging phonics in a computer game will keep kids interested, right?

Two U.S. Department of Education-sponsored websites have begun rounding up comprehensive studies on some of the educational software currently available in schools nationwide. Check them out at What Works Clearinghouse and Doing What Works.

The debate over whether digital technology and educational software help kids learn to read is far from settled. Part of the problem is that it's tied up in one of the most heated debates among educators over how to teach literacy: should kids be taught "whole language" or "phonics"?

And part of the problem is that kids from different socioeconomic backgrounds benefit differently from digital technology in the classroom. To oversimplify, kids from richer families tend to be exposed to more language -- both spoken and read -- by the time they enter kindergarten, compared to kids from poorer families. In fact, some research shows that poorer kids hear roughly 30 million fewer words before their fifth birthday than their richer counterparts.


That's what's called the “30 Million Word Gap,” and it directly correlates with a child’s ability to read and comprehend stories read aloud to him once he arrives in kindergarten. Kids from richer families sometimes benefit more from computer-based learning techniques than their poorer counterparts. Silvia A. Bunge of the Helen Wills Neuroscience Institute at the University of California, Berkeley recently found that children who are exposed to a higher level of vocabulary at home actually have a higher aptitude for learning.

4. How Are Modern Schools Affecting The Way Kids Learn?

Modern schools are high-pressure zones. (Click for more info on multitasking and its negative impacts)

It's not only the amount of time that kids spend "plugged-in" to digital technology that's affecting the way they think. It's the entire culture of the modern, multitasking world that's in play.

Many children – even as young as four and five years old – now go to school for 8 hours a day. The amount of “instructional time” in the classroom has steadily increased over the years, often meaning that kids get less time to play.

recess.gif


And even in the classroom, the modern American kid is constantly multitasking, which some say has profoundly negative implications for learning.

5. So, What's It All Mean?

Or, in other words, should we encourage putting more technology in schools? Unfortunately, it's not a cut-and-dried question.

On the one hand, computer literacy is a primary skill that will help students become functioning members of society. On the other hand, the mental demands the digital world makes on our brains are stressful and quite literally "mind-expanding" -- but they don't help us think deeply or critically about issues. We're getting good at processing data very quickly, Levy writes, but we're not spending the time thinking creatively.

If modern young people are only “learning” via the short-term, habit-forming parts of their brains, what are they missing? What thoughts are they not having? And if new technologies – mobile internet browsers and email access, digital podcasts, to name a few – continue to ensure that we’re never “wasting time,” are we as a society, young and old both, also accidentally ensuring that we no longer process creative thoughts? And what are the long-term effects of re-wiring our brains in that manner?

And then there's the question of money. American public schools spent $70 billion on computer technology in the '90s alone. The federal "E-Rate" program spends another $2.25 billion each year on Internet networks for poor schools. The use of computer-based curricula is rapidly increasing, and -- according to journalist Todd Oppenheimer -- hardware manufacturers continue to court school district business as assiduously as they do the Fortune 500. (This education blogger is also very upset about it)

The bottom line is that digital media are not a panacea. If we're going to pay to use them, let's make sure they help our kids learn to read, write, speak, interact and think better. Computers ought to be used in the classroom only as a means to encourage in-depth inquiry, not as a substitute for instruction. By providing students with the tools to actively engage in – rather than be passive recipients of – new media, teachers can empower them, to paraphrase Elizabeth Thoman of the Center for Media Literacy in Los Angeles at the 2005 National Media Education Conference.
