Executive Times
2005 Book Reviews

Everything Bad Is Good For You by Steven Johnson
Rating: ••• (Recommended)
It won’t take too many pages for the B.S. detector of most readers to start
clicking loudly while reading Steven Johnson’s new book, Everything
Bad Is Good For You. Despite the generality of the title, Johnson’s
premise is narrower: things like playing video games and watching television
exercise our brains and help us develop our cognitive skills. Before you let
the kids watch unlimited television and play video games with abandon, read
this book and decide where you agree or disagree with Johnson. Depending on
your personal experiences with video games, computer games, and television,
you’re likely to find plenty of places to both agree and disagree.

Here’s an excerpt, from part II, pp. 179-184:

Pop culture’s race
  to the top over the past decades forces us to rethink our assumptions about
  the base tendencies of mass society: the Brave New World scenario,
  where we’re fed a series of stupefying narcotics by media conglomerates
  interested solely in their lavish profits with no concern for the mental
  improvement of their consumers. As we’ve seen, the Sleeper Curve isn’t the
  result of media titans doing charitable work; there’s an economic incentive
  in producing more challenging culture, thanks to the technologies of
  repetition and meta-commentary. But the end result is the same: left to its
  own devices, following its own profit motives, the media ecosystem has been
  churning out popular culture that has grown steadily more complex over time.
  Imagine a version of Brave New World where soma and the feelies make you smarter, and you get the
idea.

If the Sleeper Curve turns
  the conventional wisdom about mass culture on its head, it does something
  comparable to our own heads—and the truisms we like to spread about them.
  Almost every Chicken Little story about the declining standards of pop
  culture contains a buried blame-the-victim message: Junk culture thrives
  because people are naturally drawn to simple, childish pleasures. Children
  zone out in front of their TV shows or their video games because the mind
  seeks out mindlessness. This is the Slacker theory of brain function: the
  human brain desires above all else that the external world
  refrain from making it do too much work. Given their druthers, our
  brains would prefer to luxuriate among idle fantasies and mild amusements.
  And so, never being one to refuse a base appetite, the culture industry
  obliges. The result is a society where maturity, in Andrew Solomon’s words,
is a “process of mental atrophy.”

These are common enough
  sentiments, but they contain a bizarre set of assumptions if you think about
  them from a distance. Set aside for the time being the historical question
  of why IQs are climbing at an accelerating rate while half the population
  wastes away in mental atrophy. Start instead with the more basic question of
  why our brains would actively seek out atrophy in the first place. The Brave New World critics
  like to talk a big game about the evils of media conglomerates, but their
  world-view also contains a strikingly pessimistic vision of the human mind. I
  think that dark assumption about our innate cravings for junk culture has it
  exactly backward. We know from neuroscience that the brain has dedicated
  systems that respond to—and seek out—new challenges and experiences. We are a
  problem-solving species, and when we confront situations where information
  needs to be filled in, or where a puzzle needs to be untangled, our minds
  compulsively ruminate on the problem until we’ve figured it out. When we
  encounter novel circumstances, when our environment changes in a surprising
  way, our brains lock in on the change and try to put it in context or
decipher its underlying logic.

Parents can sometimes be
  appalled at the hypnotic effect that television has on toddlers; they see
  their otherwise vibrant and active children gazing silently, mouth agape at
  the screen, and they assume the worst: the television is turning their child
  into a zombie. The same feeling arrives a few years later, when they see
  their grade-schoolers navigating through a video game world, oblivious to the
  reality that surrounds them. But these expressions are not signs of mental
  atrophy. They’re signs of focus. The toddler’s brain is constantly
  scouring the world for novel stimuli, precisely because exploring and
  understanding new things and experiences is what learning is all about. In a
  house where most of the objects haven’t moved since yesterday,
  and no new people have appeared on the scene, the puppet show on the television
  screen is the most surprising thing in the child’s environment, the stimuli
  most in need of scrutiny and explanation. And so the child locks in. If you
  suddenly plunked down a real puppet show in the middle of the living room, no
  doubt the child would prefer to make sense of that. But in most ordinary
  household environments, the stimuli onscreen offer the most diversity and
surprise. The child’s brain locks into those images for good reason.

Think about it this way:
  if our brain really desired to atrophy in front of mindless entertainment,
  then the story of the last thirty years of video games—from Pong to The
  Sims—would be a story of games
  that grew increasingly simple over time. You’d never need a guidebook or a
  walk-through; you’d just fly through the world, a demigod untroubled by
  challenge and complexity. Game designers would furiously compete to come out
  with the simplest titles; every virtual space would usher you to the path of
  least resistance. Of course, exactly the opposite has occurred. The games
have gotten more challenging at an astounding rate: from Pac-Man’s
  single page of patterns to Grand Theft Auto III’s
  53,000-word walk-through in a mere two decades. The games are growing
  more challenging because there’s an economic incentive to make them more
  challenging—and that economic incentive exists because our brains like to
be challenged.

If our mental appetites
  draw us toward more complexity and not less, why do so many studies show that
  we’re reading fewer books than we used to? Even if we accept the premise that
  television and games can offer genuine cognitive challenges, surely we have
  to admit that books challenge different, but equally important, faculties of
  the mind. And yet we’re drifting away from the printed page at a steady rate.
Isn’t that a sign of our brains gravitating to lesser forms?

I believe the answer is
  no, for two related reasons. First, most studies of reading ignore the huge
  explosion of reading (not to mention writing) that has happened thanks to
  the rise of the Internet. Millions of people spend much of their day staring
  at words on a screen: browsing the Web, reading e-mail, chatting with
  friends, posting a new entry to one of those 8 million blogs.
  E-mail conversations or Web-based analyses of The Apprentice are not
  the same as literary novels, of course, but they are equally text-driven.
  While they suffer from a lack of narrative depth compared to novels, many
  online interactions do have the benefit of being genuinely two-way
  conversations: you’re putting words together yourself, and not just
  digesting someone else’s. Part of the compensation for reading less is the
fact that we’re writing more.

The fact that we are
  spending so much time online gets to the other, more crucial, objection: yes,
  we’re spending less time reading literary fiction, but that’s because we’re
  spending less time doing everything we used to do before. In fact, the
  downward trend that strikes the most fear in the hearts of Madison Avenue and
  their clients is not the decline of literary reading—it’s the decline of
  television watching. The most highly sought demographic in the country—
  twenty-something males—watches almost one-fifth less television than they
  did only five years ago. We’re buying fewer CDs; we’re going out to the
  movies less regularly. We’re doing all these old activities less because
  about a dozen new activities have become bona fide mainstream pursuits in the
  past ten years: the Web, e-mail, games, DVDs, cable on-demand, text chat.
  We’re reading less because there are only so many hours in the day, and we
  have all these new options to digest and explore. If reading were the only
  cultural pursuit to show declining numbers, there might be cause for alarm.
  But that decline is shared by all the old media forms across the board. As
  long as reading books remains part of our cultural diet, and as long
  as the new popular forms continue to offer their own cognitive rewards,
we’re not likely to descend into a culture of mental atrophy anytime soon.

Throughout Everything Bad Is Good For You, Johnson throws out something
likely to be controversial, uses some neuroscience evidence to lend support,
and then moderates his more extreme claims a bit. While readers may often
react along the lines of “B.S.” or “that can’t possibly be true,” Johnson
will make you think about your judgments of what is good or bad for you and
others.

Steve Hopkins, July 25, 2005
© 2005 Hopkins and Company, LLC

The recommendation rating for this book appeared in the August 2005 issue of Executive Times.

URL for this review: http://www.hopkinsandcompany.com/Books/Everything Bad Is Good For You.htm

For Reprint Permission, Contact: Hopkins & Company, LLC • E-mail: books@hopkinsandcompany.com