Our Gadgets Are Changing Our Brains, and Not for the Better

06/08/2010 11:27 AM

[Image: Homo habilis. Caption: The author, checking his email.]

This Times profile—of an incredibly gadget-connected, screen-obsessed California family—is frigging scary. So scary, in fact, in its depiction of a family dynamic dominated by screens leading everyone elsewhere, that it scared me into pontificating about THE WAY WE LIVE NOW.

Ok, before you roll your eyes and assume this is yet another “technology is bad” screed, I will concede that the ubiquity of technology is irrevocable and, by and large, value-neutral. I will also concede that the brains of the first wholly internet-immersed generation will be different from ours (by “ours” I’m speaking of people born before 1987), and that it is impossible to tell if that “different” will be better or worse. However, even though I will concede that technology improves our lives in innumerable ways, I will not concede to being happy about what it’s doing to my brain. And here’s where I get scared.

For a long time people thought that the brain basically built itself—its neural pathways, its chemical balances—in the early stages of life, coming up with a finished brain just in time for junior year abroad. Well, it turned out that wasn’t exactly the case, that the brain, in all its gray, spongy glory, was constantly reconfiguring itself based on stimuli, right into late middle age. This was good news for cognitive therapists, alcoholics and just about anybody with half a brain: the adaptive, regenerative properties of the organ meant that if you worked hard you could train your brain.

Which brings me to the “transitional” generation, those of us who grew up off the Internet, off personal gadgets, but who are now heavy users. Our brains built themselves around fairly specific types of stimuli rooted in linear focus, the storage of data and rote procedural understanding (I’m referring here to your average middle-school history, chemistry and math classes, circa 1990). Our brains developed based on a particular regimen of mental exercise that we’ve now replaced: facts are no longer to be stored in our heads and the answer to any problem can be crowd-sourced in 30 seconds. We no longer have to remember anything but rather synthesize and juxtapose what we find in the collective memory of the Internet. Now, as I conceded above, this might lead to some incredible intellectual breakthroughs for the generation that never had to sit through endless days of rote memory drills, whose brains will exist comfortably within the cloud and who will have so much more room (and time) to synthesize and innovate.

But that’s not me. My brain grew up in books, in one-thing-at-a-time. Granted, I was a very late adopter (I graduated from college in 2000 and refused to use the Internet to obtain my degree), but obviously now, in my professional life (as a fucking blogger), I’m asking my brain to bounce all over the place in search of tiny little pay-offs, scraps of information that, as the Times article discusses, will provide me with a small, addictive dopamine reward. Honestly, just trying to sit here and write what seems like an incredibly long blog post, I’ve repeatedly checked email, Twitter, RSS feeds, responded to instant messages and even looked at L Mag site analytics.

And I don’t think it’s making me very happy. I feel like a middle-distance runner trying to become a race-car driver late in life. Sure, I understand the argument that the Internet and technology are just another set of tools and that you can’t blame the tools for your own inability to control them, but jesus, we’re talking about the brain rewiring itself based on stimuli. So do you blame your own brain or do you blame the stimuli?

Shit. In the absence of any grand conclusion here I will simply say we’re fucked, I have emails to send, and this is the kicker from the aforementioned Times piece, from a scientist at Stanford:

The way we become more human is by paying attention to each other. It shows how much you care. We are at an inflection point: a significant fraction of people’s experiences are now fragmented.

8 Comments

  • Jonny – There’s another way to look at all this. I blogged about it yesterday – it’s doubtful that our brains are being “changed.” http://tinyurl.com/2fwz5za

  • @TBilich
    I like your angle in that piece, but I think you’re placing too much emphasis on evolution as opposed to local adaptation.

    I’m not talking about hundreds of generations of evolved hardwiring, I’m talking about the kind of short-term, local neural changes that occur over weeks and months… For example, the way that too much stress over time will alter the default neurochemical balance in the brain and will retard short-term recall. The brain (as it has evolved to its present “hardwired” template) is still remarkably flexible, and changes in fairly dramatic ways based on recurring stimuli.

    So as the younger generation uses its brain in a specific way, and is rewarded for using it in that way, what we think of when we think about thinking will become pretty different… I can’t honestly say if it will be “better” or “worse,” but it will be different.

  • “Worse,” since you articulate so well my biggest issue, which is that the kinds of information we process determine the kinds of information we’re capable of processing.

    My attention span has gotten worse since I started blogging regularly, and for all the virtues of reactivity, and associativeness, and multi-input juggling, a brain that can’t stick with a long argument portends largely negative outcomes intellectually, civically, morally…

  • Attention-getting headline… but is there any actual evidence that gadgets are changing our brains? And if so, evidence that the changes are genuinely worse?

    You could just as easily argue (with no evidence) that gadgets are increasing the ability to think laterally, and to investigate more deeply into concepts and issues. As compared to having to ask the teacher, whose knowledge may be limited, how to connect A to B, we can now work through the connections ourselves from a virtually infinite information base, and draw our own conclusions. I don’t necessarily think this is true, but I have no reason to think it is any more or less true than your hypothesis (stated as fact) in your headline above.

  • Interesting all around here… yes, technology has its benefits, and being only 25 I can see the logic behind the arguments. I use the internet for a lot of things but find that it can suck up a lot of my time when I’m sitting at home: watching interviews, reading articles (L Magazine, ‘my new homepage’ – just discovered this site two days ago and fucking love it!!!!), and finding new music to listen to. The internet takes away the ‘personal’ aspect of communicating with human beings. I own my own art studio and storefront, and sometimes the people who come in make me wonder what is happening to us humans. Is all this new technology taking the place of human-to-human communication? From what I can tell, there is mostly only one line of communication, from the television to the viewer’s brain. In and out, leaving most numb!!!

    On a side note, I had a terrible sinus infection due to allergies yesterday, and lo and behold, I used the internet and found the solution: snorting salt water up my nostrils. The greatest thing: it WORKED. If I’d gone to the doctor, I would have gotten a $100.00 bill and a $25.00 prescription co-pay. Thank You, Technology.

  • @mark
    Well, as I said, I think this change hasn’t been great for those of us in the really stark, transitional period (moving from linear, pre-digital ways of learning and processing info to multifaceted, simultaneous processing): we’re less happy, more stressed out, etc., but that’s mainly because we (the Transitionals) are kind of like guinea pigs. I think it’s really impossible to say what moral/civic paradigms will emerge with the first fully digital generation. (I think it’s a bit facile to equate the length of an argument with its moral depth.)

  • Jonny,

    It’s not merely a matter of length, I should clarify, it’s more about concentration and appreciation for nuance. As I’ve said before, I marvel at language’s capacity to convey shades of meaning, but feel pretty strongly that “our gadgets” aren’t really conducive to that kind of reading.

    Most new media have made us much smarter, socially; we pick up cues better, I think, and are more capable of reading visual and contextual connotations, and so on. So it’s possible that a new kind of intellectual or moral engagement would grow out of these skill sets. I remain… on the lookout, I guess?

    I also wonder, given the sense in which we’re all the curators of our personal online experiences, and the echo-chambered polarization of our politics, whether the internet is necessarily enlarging our “Us.”

  • Mark,
    These are all fair points that I largely agree with… I’m honestly not all that optimistic, but I want to avoid the panicked chicken-littling of the generation in the middle of a sea change. (The internet made me write “chicken-littling” as a compound verb.)