david_gelernter's picture
Computer Scientist, Yale University; Chief Scientist, Mirror Worlds Technologies; Author, America-Lite: How Imperial Academia Dismantled our Culture (and ushered in the Obamacrats)
Users Are Not Reactionary After All

What I've changed my mind about is the belief that the public is wedded forever to the obsolete 1970s GUIs and information-management schemes: PARC's desktop and Bell Labs' Unix file system. I'll give two examples from my own experience. Both involve long-term ideas of mine and might seem like self-promotion, but my point is that as a society we don't have the patience to develop fully those big ideas that need time to soak in.

I first described a GUI called "Lifestreams" in the Washington Post in 1994. By the early 2000s, I thought this system was dead in the water, destined to be resurrected in a grad student's footnote around the 29th century. The problem was (I thought) that Lifestreams was too unfamiliar: insufficiently "evolutionary" and too "revolutionary" (as the good folks at ARPA like to say, or something like that); you need to go step-by-step with the public and the industry or you lose.

But today "lifestreams" are all over the net (take a look yourself), and I'm told that "lifestreaming" has turned into a verb at some recent Internet conferences. According to ZDNet.com, "Basically what's important about the OLPC [one laptop per child] has nothing to do with its nominal purposes and everything to do with its interface. Ultimately traceable to David Gelernter's 'Lifestreams' model, this is not just a remake of Apple's evolution of the original work at Palo Alto, but something new."

Moral: the public may be cautious, but it is not reactionary.

In a 1991 book called Mirror Worlds, I predicted that everyone would be putting his personal stuff in the Cybersphere (AKA "the clouds"); I said the same in a 2000 manifesto on Edge called "The 2nd Coming," and in various other pieces in between. By 2005 or so, I assumed that once again I'd jumped the gun, by too long an interval to learn the results pre-posthumously. But once again this topic (of all topics) turns out to be hot and all over the place nowadays: "cloud computing" is the next big thing. What does this all prove? If you're patient, good ideas find audiences. But you have to be very patient.

And if you expect to cash in on long-term ideas in the United States, you're certifiable.

This last point is a lesson I teach my students, and on this item I haven't (and don't expect to) change my mind. But what the hell? It's New Year's, and there are worse things than being proved right once in a while, even if it's too late to count.