Edge in the News: 2010

The New York Times [1.18.10]

Today’s idea: Filtering, not remembering, is the most important mental skill in the digital age, an essay says. But this discipline will prove no mean feat, since mental focus must take place amid the unlimited distractions of the Internet.

Internet | Edge, the high-minded ideas and tech site, has posed its annual question for 2010 — “How is the Internet changing the way you think?” — and gotten some interesting responses from a slew of smart people. They range from the technology analyst Nicholas Carr, who wonders if the Web has made it impossible for us to read long pieces of writing, to Clay Shirky, the social software guru, who sees the Web poised uncertainly between an immature “Invisible High School” and a more laudable “Invisible College.”

David Dalrymple, a researcher at the Massachusetts Institute of Technology, thinks human memory will no longer be the key repository of knowledge, and that focus will supersede erudition. Quote:


Before the Internet, most professional occupations required a large body of knowledge, accumulated over years or even decades of experience. But now, anyone with good critical thinking skills and the ability to focus on the important information can retrieve it on demand from the Internet, rather than her own memory. On the other hand, those with wandering minds, who might once have been able to focus by isolating themselves with their work, now often cannot work without the Internet, which simultaneously furnishes a panoply of unrelated information — whether about their friends’ doings, celebrity news, limericks, or millions of other sources of distraction. The bottom line is that how well an employee can focus might now be more important than how knowledgeable he is. Knowledge was once an internal property of a person, and focus on the task at hand could be imposed externally, but with the Internet, knowledge can be supplied externally, but focus must be forced internally. [Edge via The Daily Dish]


FIELDS: I compute, therefore I am
Washington Times [1.14.10]

ATLANTIC WIRE [1.13.10]

Edge is an organization of deep, visionary thinkers on science and culture. Each year the group poses a question, this year collecting 168 essay responses to the question, "How is the Internet changing the way you think?" 

In answer, academics, scientists and philosophers responded with musings on the Internet enabling telecommunication, functioning as a sort of prosthesis, or robbing us of our old, "linear" mode of thinking. Actor Alan Alda described the Web as "speed plus mobs." Responses alternate between the quirky and the profound ("In this future, knowledge will be fully outside the individual, focus will be fully inside, and everybody's selves will truly be spread everywhere.")

Since it takes a while to read the entire collection--and the Atlantic Wire should know, as we tried--here are some of the more piquant answers. Visit the Edge website for the full experience. For a smart, funny answer in video form, see here.

  • We Haven't Changed, declares Harvard physician and sociologist Nicholas Christakis. Our brains "likely evolved ... in response to the demands of social (rather than environmental) complexity," and would likely only continue to evolve as our social framework changes. Our social framework has not changed: from our family units to our military units, he points out, our social structures remain fairly similar to what they were over 1000 years ago. "The Internet itself is not changing the fundamental reality of my thinking any more than it is changing our fundamental proclivity to violence or our innate capacity for love."
  • Bordering on Mental Illness Barry C. Smith of the University of London writes of the new importance of "well-packaged information." He says he is personally "exhilarated by the dizzying effort to make connections and integrate information. Learning is faster. Though the tendency to forge connecting themes can feel dangerously close to the search for patterns that overtakes the mentally ill."
  • New 'Survival of the Focused' Stanford psychologist Brian Knutson thinks the Internet may bias us towards our "present" selves rather than "future" selves, leading to procrastination: "I worry that the Internet may impose a 'survival of the focused,' in which individuals gifted with some natural capacity to stay on target or who are hopped up on enough stimulants forge ahead, while the rest of us flail helplessly in some web-based attentional vortex."
  • Language is a Technology, Too, points out another Stanford psychologist, Lera Boroditsky. Some technologies "we no longer even notice as technologies: they just seem like natural extensions of our minds. Numbers are one such example: a human-invented tool that once learned has incredible productive power in the mind. Writing is another such example. It no longer seems magical in the literate world that one could communicate a complex set of thoughts silently across vast reaches of time and space using only a cocktail napkin and some strategically applied stains." Boroditsky ends with a jab at renowned philosopher Dan Dennett, who makes his own point about how "absolute power corrupts absolutely," and the Internet is absolute.
  • We Are Immortal, is Juan Enriquez's startling conclusion. "Future sociologists and archaeologists," unlike current ones studying ancient Rome, "will have access to excruciatingly detailed pictures on an individual basis." There are drawbacks: "those of a certain age learned long ago, from the triumphs and tragedies of Greek Gods, that there are clear rules separating the mortal and immortal. Trespasses tolerated and forgiven in the fallible human have drastic consequences for Gods. In the immortal world all is not forgiven and mostly forgotten after you shuffle off to Heaven."
  • Cells are to Humans as Humans are to Internet Humanity W. Tecumseh Fitch, cognitive biologist at the University of Vienna, looks at the way single cells gradually grouped into multi-celled organisms that required organization, with certain cells exerting control over others through hormones and neurons. Humans are now "the metaphoric neurons or the global brain," he says, with HTML for neurotransmitters as we rush to "the brink of a wholly new system of societal organization." He sees "two main problems," though, with his metaphor:

First, the current global brain is only tenuously linked to the organs of international power ... Second, our nervous systems evolved over 400 million years of natural selection, during which billions of competing false-starts and miswired individuals were ruthlessly weeded out. But there is only one global brain today, and no trial and error process to extract a functional configuration from the trillions of possible configurations. This formidable design task is left up to us.

The New York Times [1.13.10]

In 2006, the artist and computer scientist Jaron Lanier published an incisive, groundbreaking and highly controversial essay about “digital Maoism” — about the downside of online collectivism, and the enshrinement by Web 2.0 enthusiasts of the “wisdom of the crowd.” In that manifesto Mr. Lanier argued that design (or ratification) by committee often does not result in the best product, that the new collectivist ethos — embodied by everything from Wikipedia to “American Idol” to Google searches — diminishes the importance and uniqueness of the individual voice, and that the “hive mind” can easily lead to mob rule.

YOU ARE NOT A GADGET

A Manifesto

By Jaron Lanier

209 pages. Alfred A. Knopf. $24.95.

Now, in his impassioned new book “You Are Not a Gadget,” Mr. Lanier expands this thesis further, looking at the implications that digital Maoism or “cybernetic totalism” have for our society at large. Although some of his suggestions for addressing these problems wander into technical thickets the lay reader will find difficult to follow, the bulk of the book is lucid, powerful and persuasive. It is necessary reading for anyone interested in how the Web and the software we use every day are reshaping culture and the marketplace.

Mr. Lanier, a pioneer in the development of virtual reality and a Silicon Valley veteran, is hardly a Luddite, as some of his critics have suggested. Rather he is a digital-world insider who wants to make the case for “a new digital humanism” before software engineers’ design decisions, which he says fundamentally shape users’ behavior, become “frozen into place by a process known as lock-in.” Just as decisions about the dimensions of railroad tracks determined the size and velocity of trains for decades to come, he argues, so choices made about software design now may yield “defining, unchangeable rules” for generations to come.

Decisions made in the formative years of computer networking, for instance, promoted online anonymity, and over the years, as millions upon millions of people began using the Web, Mr. Lanier says, anonymity has helped enable the dark side of human nature. Nasty, anonymous attacks on individuals and institutions have flourished, and what Mr. Lanier calls a “culture of sadism” has gone mainstream. In some countries anonymity and mob behavior have resulted in actual witch hunts. “In 2007,” Mr. Lanier reports, “a series of ‘Scarlet Letter’ postings in China incited online throngs to hunt down accused adulterers. In 2008, the focus shifted to Tibet sympathizers.”

Mr. Lanier sensibly notes that the “wisdom of crowds” is a tool that should be used selectively, not glorified for its own sake. Of Wikipedia he writes that “it’s great that we now enjoy a cooperative pop culture concordance” but argues that the site’s ethos ratifies the notion that the individual voice — even the voice of an expert — is eminently dispensable, and “the idea that the collective is closer to the truth.” He complains that Wikipedia suppresses the sound of individual voices, and similarly contends that the rigid format of Facebook turns individuals into “multiple-choice identities.”

Like Andrew Keen in “The Cult of the Amateur,” Mr. Lanier is most eloquent on how intellectual property is threatened by the economics of free Internet content, crowd dynamics and the popularity of aggregator sites. “An impenetrable tone deafness rules Silicon Valley when it comes to the idea of authorship,” he writes, recalling the Wired editor Kevin Kelly’s 2006 prediction that the mass scanning of books would one day create a universal library in which no book would be an island — in effect, one humongous text, made searchable and remixable on the Web.

“It might start to happen in the next decade or so,” Mr. Lanier writes. “Google and other companies are scanning library books into the cloud in a massive Manhattan Project of cultural digitization. What happens next is what’s important. If the books in the cloud are accessed via user interfaces that encourage mashups of fragments that obscure the context and authorship of each fragment, there will be only one book. This is what happens today with a lot of content; often you don’t know where a quoted fragment from a news story came from, who wrote a comment, or who shot a video.”

While this development might sound like a good thing for consumers — so much free stuff! — it makes it difficult for people to discern the source, point of view and spin factor of any particular fragment they happen across on the Web, while at the same time encouraging content producers, in Mr. Lanier’s words, “to treat the fruits of their intellects and imaginations as fragments to be given without pay to the hive mind.” A few lucky people, he notes, can benefit from the configuration of the new system, spinning their lives into “still-novel marketing” narratives, as in the case, say, of Diablo Cody, “who worked as a stripper, can blog and receive enough attention to get a book contract, and then have the opportunity to have her script made into a movie — in this case, the widely acclaimed ‘Juno.’ ” He fears, however, that “the vast majority of journalists, musicians, artists and filmmakers” are “staring into career oblivion because of our failed digital idealism.”

Paradoxically enough, the same old media that is being destroyed by the Net drives an astonishing amount of online chatter. “Comments about TV shows, major movies, commercial music releases, and video games must be responsible for almost as much bit traffic as porn,” Mr. Lanier observes. “There is certainly nothing wrong with that, but since the Web is killing the old media, we face a situation in which culture is effectively eating its own seed stock.”

In other passages in this provocative and sure-to-be-controversial book he goes even further, suggesting that “pop culture has entered into a nostalgic malaise,” that “online culture is dominated by trivial mashups of the culture that existed before the onset of mashups, and by fandom responding to the dwindling outposts of centralized mass media.”

Online culture, he goes on, “is a culture of reaction without action” and rationalizations that “we were entering a transitional lull before a creative storm” are just that — rationalizations. “The sad truth,” he concludes, “is that we were not passing through a momentary lull before a storm. We had instead entered a persistent somnolence, and I have come to believe that we will only escape it when we kill the hive.”


boingboing [1.11.10]


Each year, John Brockman of Edge.org asks a question of a number of science, tech, and media personalities, and compiles the answers. This year's question: "How is the internet changing the way you think?" Lots of good, meaty responses that make for great reading, from interesting people whose work and ideas have been blogged here on Boing Boing before: Kevin Kelly, Jaron Lanier, Linda Stone, George Dyson, Danny Hillis, Esther Dyson, Tim O'Reilly, Doug Rushkoff, Jesse Dylan, Richard Dawkins, Alan Alda, Brian Eno, and many more.

I'm far out-classed by the aforementioned thinkers. But here's a snip from my more modest contribution, "I DON'T TRUST ALGORITHM LIKE I TRUST INTUITION":

I travel regularly to places with bad connectivity. Small villages, marginalized communities, indigenous land in remote spots around the globe. Even when it costs me dearly, on a spendy satphone or in gold-plated roaming charges, my search-itch, my tweet twitch, my email toggle, those acquired instincts now persist.

The impulse to grab my iPhone or pivot to the laptop, is now automatic when I'm in a corner my own wetware can't get me out of. The instinct to reach online is so familiar now, I can't remember the daily routine of creative churn without it. The constant connectivity I enjoy back home means never reaching a dead end. There are no unknowable answers, no stupid questions. The most intimate or not-quite-formed thought is always seconds away from acknowledgement by the great "out there."

The shared mind that is the Internet is a comfort to me. I feel it most strongly when I'm in those far-away places, tweeting about tortillas or volcanoes or voudun kings, but only because in those places, so little else is familiar. But the comfort of connectivity is an important part of my life when I'm back on more familiar ground, and take it for granted.

Does the Internet Make Us Smart or Stupid? By Alan Posener
Die Welt [1.11.10]

 


THE HUFFINGTON POST [1.10.10]

"Love Intermedia Kinetic Environments." John Brockman speaking -- partly kidding, but conveying the notion that Intermedia Kinetic Environments are In in the places where the action is -- an Experience, an Event, an Environment, a humming electric world. -- The New York Times

On a Sunday in September 1966, I was sitting on a park bench reading about myself on the front page of the New York Times Arts & Leisure section. I was wondering whether the article would get me fired from my job at the New York Film Festival at Lincoln Center, where I was producing "expanded cinema" and "intermedia" events. I was twenty-five years old.

New and exciting ideas and forms of expression were in the air. They came out of happenings, the dance world, underground movies, avant-garde theater. They came from artists engaged in experiment. Intermedia consisted more often than not of unscripted, sometimes spontaneous theatrical events in which the audience was also a participant. I was lucky enough to have some small part in this upheaval, having been hired a year earlier by the underground filmmaker and critic Jonas Mekas to manage the Filmmakers' Cinémathèque and organize and run the Expanded Cinema Festival.

During that wildly interesting period, many of the leading artists were reading science and bringing scientific ideas to their work. John Cage gave me a copy of Norbert Wiener's Cybernetics; Bob Rauschenberg turned me on to James Jeans' The Mysterious Universe. Claes Oldenburg suggested I read George Gamow's 1,2,3...Infinity. USCO, a group of artists, engineers, and poets who created intermedia environments; La Monte Young's Theatre of Eternal Music; Andy Warhol's Factory; Nam June Paik's video performances; Terry Riley's minimalist music -- these were master classes in the radical epistemology of a set of ideas involving feedback and information.

Another stroke of good luck was my inclusion in a small group of young artists invited by Fluxus artist Dick Higgins to attend a series of dinners with John Cage -- an ongoing seminar about media, communications, art, music, and philosophy that focused on the ideas of Norbert Wiener, Claude Shannon, and Marshall McLuhan. Cage was aware of research conducted in the late 1930s and 1940s by Wiener, Shannon, Vannevar Bush, Warren McCulloch, and John von Neumann, who were all present at the creation of cybernetic theory. And he had picked up on McLuhan's idea that by inventing electric technology we had externalized our central nervous systems -- that is, our minds -- and that we now had to presume that "There's only one mind, the one we all share." We had to go beyond personal mind-sets: "Mind" had become socialized. "We can't change our minds without changing the world," Cage said. Mind as a man-made extension had become our environment, which he characterized as a "collective consciousness" that we could tap into by creating "a global utilities network."

Back then, of course, the Internet didn't exist, but the idea was alive. In 1962, J.C.R. Licklider, who had published "Man-Computer Symbiosis" in 1960 and described the idea of an "Intergalactic Computer Network" in 1961, was hired as the first director of the new Information Processing Techniques Office (IPTO) at the Pentagon's Advanced Research Projects Agency, an agency created as a response to Sputnik. Licklider designed the foundation for a global computer network. He and his successors at IPTO, including Robert Taylor and Larry Roberts, provided the ideas that led to the development of the ARPAnet, the forerunner of the Internet, which itself emerged as an ARPA-funded research project in the mid-1980s.

Inspired also by architect-designer Buckminster Fuller, futurist John McHale, and cultural anthropologists Edward T. ("Ned") Hall and Edmund Carpenter, I began to read avidly in the field of information theory, cybernetics, and systems theory. McLuhan himself introduced me to The Mathematical Theory of Communication by Shannon and Weaver, which began: "The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior."

Inherent in these ideas is a radical new epistemology. It tears apart the fabric of our habitual thinking. Subject and object fuse. The individual self decreates. I wrote a synthesis of these ideas in my first book, By the Late John Brockman (1969), taking information theory -- the mathematical theory of communications -- as a model for regarding all human experience. I began to develop a theme that has informed my endeavors ever since: New technologies beget new perceptions. Reality is a man-made process. Our images of our world and of ourselves are, in part, models resulting from our perceptions of the technologies we generate.

We create tools and then we mold ourselves in their image. Seventeenth-century clockworks inspired mechanistic metaphors ("The heart is a pump"), just as the self-regulating engineering devices of the mid-twentieth century inspired the cybernetic image ("The brain is a computer"). The anthropologist Gregory Bateson has characterized the post-Newtonian worldview as one of pattern, of order, of resonances in which the individual mind is a subsystem of a larger order. Mind is intrinsic to the messages carried by the pathways within the larger system and intrinsic also in the pathways themselves.

Ned Hall once pointed out to me that the most critical inventions are not those that resemble inventions but those that appear innate and natural. Once you become aware of this kind of invention, it is as though you had always known about it. ("The medium is the message." Of course, I always knew that.)

Hall's candidate for the most important invention was not the capture of fire, the printing press, the discovery of electricity, or the discovery of the structure of DNA. The most important invention was ... talking. To illustrate the point, he told a story about a group of prehistoric cavemen having a conversation.

"Guess what?" the first man said. "We're talking." Silence. The others looked at him with suspicion.

"What's 'talking'?" a second man asked.

"It's what we're all doing, right now. We're talking!"

"You're crazy," the third man said. "I never heard of such a thing!"

"I'm not crazy," the first man said. "You're crazy. We're talking."

Talking, undoubtedly, was considered innate and natural until the first man rendered it visible by exclaiming, "We're talking."


* * *

A new invention has emerged, a code for the collective conscious, which requires a new way of thinking. The collective externalized mind is the mind we all share. The Internet is the infinite oscillation of our collective conscious interacting with itself. It's not about computers. It's not about what it means to be human -- in fact it challenges, renders trite, our cherished assumptions on that score. It's about thinking. "We're talking."

This year's Question is "How is the Internet changing the way YOU think?" Not "How is the Internet changing the way WE think?" We spent a lot of time going back and forth on "YOU" vs. "WE" and concluded we should go with "YOU," the reason being that Edge is a conversation. "WE" responses tend to come across like expert papers, public pronouncements, or talks delivered from a stage.

We wanted people to think about the "Internet," which includes, but is a much bigger subject than, the Web (an application on the Internet) or search and browsing (applications on the Web). Back in 1996, computer scientist and visionary Danny Hillis pointed out that when it comes to the Internet, "Many people sense this, but don't want to think about it because the change is too profound. Today, on the Internet the main event is the Web. A lot of people think that the Web is the Internet, and they're missing something. The Internet is a brand-new fertile ground where things can grow, and the Web is the first thing that grew there. But the stuff growing there is in a very primitive form. The Web is the old media incorporated into the new medium. It both adds something to the Internet and takes something away."

This year, I enlisted the aid of Hans Ulrich Obrist, Curator of the Serpentine Gallery in London, as well as the artist April Gornik, one of the early members of "The Reality Club" (the precursor to the online Edge), to help broaden the Edge conversation -- or rather to bring it back to where it was in the late 80s/early 90s, when April gave a talk at a "Reality Club" meeting and discussed the influence of chaos theory on her work, and when Benoit Mandelbrot showed up to discuss fractal theory and every artist in NYC wanted to be there. What then happened was very interesting. The Reality Club went online as Edge in 1996, and the scientists were all on email, the artists not. Thus did Edge, surprisingly, become a science site, when my own background (beginning in 1965, when Jonas Mekas hired me to manage the Film-Makers' Cinematheque) was in the visual and performance arts.

We asked the Edgies to go deeper than the news, the "he said, she said", the tired discussion about the future of media, etc. The editorial marching orders were: "Tell me something I don't know. Explore new ideas about how human beings communicate with each other. As communications is the basis of civilization, this Edge Question is not about computers, not about technology, not about things digital: this is a question about our culture and ourselves. The ideas we present here can offer a new set of metaphors to describe ourselves, our minds, the way we think, the world, and all of the things we know in it. ... Be imaginative, exciting, compelling, inspiring. Tell a great story. Make an argument that makes a difference. Amaze and delight. Surprise us!"

To date, 168 essayists (an array of world-class scientists, artists, and creative thinkers) have created a 130,000-word document.

You can read the full document on The Edge.
