As someone who believes both in human nature and in timeless standards of logic and evidence, I'm skeptical of the common claim that the Internet is changing the way we think. Electronic media aren't going to revamp the brain's mechanisms of information processing, nor will they supersede modus ponens or Bayes' theorem. Claims that the Internet is changing human thought are propelled by a number of forces: the pressure on pundits to announce that this or that "changes everything"; a superficial conception of what "thinking" is that conflates content with process; and the neophobic mindset that "if young people do something that I don't do, the culture is declining." But I don't think the claims stand up to scrutiny.
Has a generation of texters, surfers, and twitterers evolved the enviable ability to process multiple streams of novel information in parallel? Most cognitive psychologists doubt it, and recent studies by Clifford Nass confirm their skepticism. So-called multitaskers are like Woody Allen after he took a speed-reading course and devoured War and Peace in an evening. His summary: "It was about some Russians."
Also widely rumored are the students who cannot write a paper without instant-message abbreviations, emoticons, and dubious Web citations. But students indulge in such laziness to the extent that their teachers let them get away with it. I have never seen a paper of this kind, and a survey of university student papers by Andrea Lunsford shows they are mostly figments of the pundits' imaginations.
The way that intellectual standards constrain intellectual products is nowhere more evident than in science. Scientists are voracious users of the Internet, and of other computer-based technologies that are supposedly making us stupid, like PowerPoint, electronic publishing, and email. Yet it would be ludicrous to suggest that scientists think differently than they did a decade ago, or that the progress of science has slowed.
The most interesting trend in the development of the Internet is not how it is changing people's ways of thinking but how it is adapting to the way that people think. The leap in Internet usage that accompanied the appearance of the World Wide Web more than a decade ago came from its user interface, the graphical browser, which worked around the serial, line-based processing of the actual computer hardware to simulate a familiar visual world of windows, icons, and buttons. The changes we are seeing more recently include even more natural interfaces (speech, language, manual manipulation), better emulation of human expertise (as in movie, book, or music recommendations, and more intelligent search), and the application of Web technologies to social and emotional purposes (such as social networking, sharing of pictures, music, and video) rather than just the traditional nerdy ones.
To be sure, many aspects of the life of the mind have been affected by the Internet. Our physical folders, mailboxes, bookshelves, spreadsheets, documents, media players, and so on have been replaced by software equivalents, which has altered our time budgets in countless ways. But to call this an alteration of "how we think" is, I think, an exaggeration.