Edge in the News

LA STAMPA [1.19.10]

His name is John Brockman and he is a singular figure: he queries the best scientists, asks them to pose their most pressing questions, and brings them together on his site edge.org to converse about everything; sometimes they also meet in the flesh, in California or in Paris. He often encourages them to write about their visionary ideas, which become best-selling books.

Brockman is much more than a publishing and cultural impresario. He is unique, someone who is not replicable in the universe of science where replicability and falsifiability are notoriously non-negotiable rules.

That is why his posing questions, like a petulant boy genius, to the world's greatest thinkers is disorienting. A guest of the Festival of Science in Rome, he distributed his provocations there too, ranging from the subject of Alan Turing, the unfortunate and mythic English scientist who founded the digital world, to his new Question for 2010, which has inspired a group of American thinkers: "How is the Internet changing the way you think?" He explains: "I stress the 'you' instead of the 'we.' After oscillating between the two, I chose the 'you' because Edge is a conversation. The 'we' would have suggested instead a public voice of experts on a stage."

"The question," he adds, "is based on the idea of my friend and former collaborator James Lee Byars.

What did Byars do?

[Photo of John Brockman]
Role: literary agent and author of popular science books
Book: "This Will Change Everything: Ideas That Will Shape the Future," HarperCollins
Site: www.edge.org

"In 1971, he went to Harvard Square and there founded the "World Question Center". He was convinced that to reach the edge of knowledge, it was not in fact necessary to read 6 million volumes in the university library: he believed it was enough to collect the most sophisticated minds in one place and lock the door until they asked each other the questions they were asking themselves. This idea is in perfect harmony with the logic of today's digital environment, in which everything can be assembled with the power of algorithms: it is the reality of the Petabyte Revolution.

Explain what this is.

"The accumulation of data is such that instead of starting an argument and then testing it with a series of experiments, the data already stored may be investigated in order to discover what is hidden."

Does this mean that the scientific method, which is sacrosanct, is about to change?

"In any case, yes, there is someone who has already made this hypothesis. I think of Craig Venter, who is deciphering the Genome: he is the greatest advocate in the private sector of a continuing growth in computational power. He collects billions of pieces of genetic information from various sources, including the oceans and processes them using computers: it is a scale of data never dealt with before."

And the responses to the "Question"? What is currently the one that has intrigued the most?

"George Dyson's response: "kayak vs. canoe." It refers to two opposite approaches to achieving the same result, the construction of boats. The first entails the assembly of a skeleton from bits and pieces, and the latter involves carving out of whole trees. The Internet has produced a similar gap: we were manufacturers of kayaks, used to seeking any information that could keep us afloat, and now, instead, we must learn to shape the canoe, removing everything that is not necessary and bringing to light the hidden heart of knowledge. Whoever doesn't acquire the new skills will be forced to row his crudely carved tree trunks."

He never tires of prodding scientists: his latest book, This Will Change Everything, is dedicated to ideas that will shape the future. Are you sure you have discovered the best ones?

"In another career I acted as an impresario. I was the guy who ran the theatre, standing in the back, turning the lights on and off. This is my role and in this case, the basic concept I present is that new media creates new perceptions: science creates the technology to use and we recreate ourselves in its image. Until recently no one had ever thought of this process. It was unconscious. No one has voted on the printing press, electricity, radio, tv, cars, airplanes. Nobody voted for penicillin, nuclear energy, space travel. No one voted for computers, Internet, email, Google, cloning. Now we move towards a new definition of life and a condition in which science is not only news, but The News. Politicians can play catch up and and chase the developments. James Watson, the man who was co-discoverer of the double helix of DNA and who is currently only only one of two people to have posted his genetic code on the Internet, is said to oppose any interference. The other person, Craig Venter, is preparing to create artificial life: And even now, he can move a drop of genetic material from one dish to another and...your dog can become a cat. The result is that everything will change and therefore, the question for the new book was: What game-changing scientific revolution do you expect to live to see?

You had 151 responses: reveal your favorite.

"Kevin Kelley impressed me: he spoke of a new type of mind, amplified by the Internet, evolving, and able to start a new phase of evolution outside of the body. And so many others... Ed Regis and "molecular manufacturing", about the production of new molecules as one of the frontiers of nanotechnology. William Calvin and our vulnerability to climate and our intellectual capacity to react. Nicholas Humphrey and rebellious impulses of human nature: as we transform ourselves, we nevertheless remain the same, distracted by violence and politics. Freeman Dyson and telepathy, with the possibility of direct communication from one brain to another. And finally, a novelist, Ian McEwan. He confessed to wanting to live long enough to witness the final triumph of solar technology.

The New York Times [1.18.10]

Today’s idea: Filtering, not remembering, is the most important mental skill in the digital age, an essay says. But this discipline will prove no mean feat, since mental focus must take place amid the unlimited distractions of the Internet.

Internet | Edge, the high-minded ideas and tech site, has posed its annual question for 2010 — “How is the Internet changing the way you think?” — and gotten some interesting responses from a slew of smart people. They range from the technology analyst Nicholas Carr, who wonders whether the Web has made it impossible for us to read long pieces of writing, to the social software guru Clay Shirky, who sees the Web poised uncertainly between an immature “Invisible High School” and a more laudable “Invisible College.”

David Dalrymple, a researcher at the Massachusetts Institute of Technology, thinks human memory will no longer be the key repository of knowledge, and that focus will supersede erudition. Quote:


Before the Internet, most professional occupations required a large body of knowledge, accumulated over years or even decades of experience. But now, anyone with good critical thinking skills and the ability to focus on the important information can retrieve it on demand from the Internet, rather than her own memory. On the other hand, those with wandering minds, who might once have been able to focus by isolating themselves with their work, now often cannot work without the Internet, which simultaneously furnishes a panoply of unrelated information — whether about their friends’ doings, celebrity news, limericks, or millions of other sources of distraction. The bottom line is that how well an employee can focus might now be more important than how knowledgeable he is. Knowledge was once an internal property of a person, and focus on the task at hand could be imposed externally, but with the Internet, knowledge can be supplied externally, but focus must be forced internally. [Edge via The Daily Dish]

Washington Times [1.14.10]

FIELDS: I compute, therefore I am

The New York Times [1.13.10]

In 2006, the artist and computer scientist Jaron Lanier published an incisive, groundbreaking and highly controversial essay about “digital Maoism” — about the downside of online collectivism and the enshrinement by Web 2.0 enthusiasts of the “wisdom of the crowd.” In that manifesto Mr. Lanier argued that design (or ratification) by committee often does not result in the best product, that the new collectivist ethos — embodied by everything from Wikipedia to “American Idol” to Google searches — diminishes the importance and uniqueness of the individual voice, and that the “hive mind” can easily lead to mob rule.

[Photo of Jaron Lanier. Credit: Jonathan Sprague]

YOU ARE NOT A GADGET

A Manifesto

By Jaron Lanier

209 pages. Alfred A. Knopf. $24.95.


Now, in his impassioned new book “You Are Not a Gadget,” Mr. Lanier expands this thesis further, looking at the implications that digital Maoism or “cybernetic totalism” have for our society at large. Although some of his suggestions for addressing these problems wander into technical thickets the lay reader will find difficult to follow, the bulk of the book is lucid, powerful and persuasive. It is necessary reading for anyone interested in how the Web and the software we use every day are reshaping culture and the marketplace.

Mr. Lanier, a pioneer in the development of virtual reality and a Silicon Valley veteran, is hardly a Luddite, as some of his critics have suggested. Rather he is a digital-world insider who wants to make the case for “a new digital humanism” before software engineers’ design decisions, which he says fundamentally shape users’ behavior, become “frozen into place by a process known as lock-in.” Just as decisions about the dimensions of railroad tracks determined the size and velocity of trains for decades to come, he argues, so choices made about software design now may yield “defining, unchangeable rules” for generations to come.

Decisions made in the formative years of computer networking, for instance, promoted online anonymity, and over the years, as millions upon millions of people began using the Web, Mr. Lanier says, anonymity has helped enable the dark side of human nature. Nasty, anonymous attacks on individuals and institutions have flourished, and what Mr. Lanier calls a “culture of sadism” has gone mainstream. In some countries anonymity and mob behavior have resulted in actual witch hunts. “In 2007,” Mr. Lanier reports, “a series of ‘Scarlet Letter’ postings in China incited online throngs to hunt down accused adulterers. In 2008, the focus shifted to Tibet sympathizers.”

Mr. Lanier sensibly notes that the “wisdom of crowds” is a tool that should be used selectively, not glorified for its own sake. Of Wikipedia he writes that “it’s great that we now enjoy a cooperative pop culture concordance” but argues that the site’s ethos ratifies the notion that the individual voice — even the voice of an expert — is eminently dispensable, and “the idea that the collective is closer to the truth.” He complains that Wikipedia suppresses the sound of individual voices, and similarly contends that the rigid format of Facebook turns individuals into “multiple-choice identities.”

Like Andrew Keen in “The Cult of the Amateur,” Mr. Lanier is most eloquent on how intellectual property is threatened by the economics of free Internet content, crowd dynamics and the popularity of aggregator sites. “An impenetrable tone deafness rules Silicon Valley when it comes to the idea of authorship,” he writes, recalling the Wired editor Kevin Kelly’s 2006 prediction that the mass scanning of books would one day create a universal library in which no book would be an island — in effect, one humongous text, made searchable and remixable on the Web.

“It might start to happen in the next decade or so,” Mr. Lanier writes. “Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what’s important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don’t know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video.”

While this development might sound like a good thing for consumers — so much free stuff! — it makes it difficult for people to discern the source, point of view and spin factor of any particular fragment they happen across on the Web, while at the same time encouraging content producers, in Mr. Lanier’s words, “to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind.” A few lucky people, he notes, can benefit from the configuration of the new system, spinning their lives into “still-novel marketing” narratives, as in the case, say, of Diablo Cody, “who worked as a stripper, can blog and receive enough attention to get a book contract, and then have the opportunity to have her script made into a movie — in this case, the widely acclaimed ‘Juno.’ ” He fears, however, that “the vast majority of journalists, musicians, artists and filmmakers” are “staring into career oblivion because of our failed digital idealism.”

Paradoxically enough, the same old media that is being destroyed by the Net drives an astonishing amount of online chatter. “Comments about TV shows, major movies, commercial music releases, and video games must be responsible for almost as much bit traffic as porn,” Mr. Lanier observes. “There is certainly nothing wrong with that, but since the Web is killing the old media, we face a situation in which culture is effectively eating its own seed stock.”

In other passages in this provocative and sure-to-be-controversial book he goes even further, suggesting that “pop culture has entered into a nostalgic malaise,” that “online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media.”

Online culture, he goes on, “is a culture of reaction without action” and rationalizations that “we were entering a transitional lull before a creative storm” are just that — rationalizations. “The sad truth,” he concludes, “is that we were not passing through a momentary lull before a storm. We had instead entered a persistent somnolence, and I have come to believe that we will only escape it when we kill the hive.”
