Stewart Brand

Introduction by John Brockman 

When I first met Stewart Brand in 1965, he was sporting a button on which was printed: "America Needs Indians." We were at the headquarters of USCO ("US" company), an anonymous group of artists whose installations and events combined multiple audio and visual inputs, including film, slides, video, lighting, music, and random sounds. We were both wearing remnants of our US Army uniforms. We hit it off immediately and have been in touch consistently for the past forty-one years.

USCO's mantra, "We Are All One," had already been altered to "We Are All One...except Brockman" in order to accommodate my involvement. In 1963, the group had erected a Psychedelic Tabernacle in a church half an hour outside of Manhattan, in Garnerville, New York. It became an obligatory stop for every seeker and guru passing through the area. Stewart lived there (in the steeple) for a while.

Stewart was fascinated with the USCO community of artists—including painter Steve Durkee and poet Gerd Stern—and with Rockland County neighbors such as John Cage, all of whom were reading, studying, and debating Marshall McLuhan's ideas on communications. In fact, at one point USCO went on tour with McLuhan and provided an "intermedia" counterpoint to his talks.

Brand, who preferred the term multimedia to intermedia, performed his "America Needs Indians" piece from 1964 to 1966 and "War: God" from 1967 to 1970. He organized the Trips Festival in January 1966, just as I was running "The Expanded Cinema Festival" in New York at the Filmmaker's Cinematheque. In March 1966 he created the Whole Earth button (it read: "Why Haven't We Seen a Photograph of the Whole Earth Yet?"). This conceptual piece was the center of his 1968 campaign for a picture of the whole Earth, which led, in no small part, to the creation of the ecology movement. He is the king of initially obscure, ultimately compelling conceptual art. Call it reality.

Brand is best known to my generation as the founder, editor, and publisher of the Whole Earth Catalog. I recall visiting him in Menlo Park, California, in 1968 while he was working on the original catalog. His wife at the time, Lois, a Native American mathematician, spent an entire day working on the catalog with a layout person while Brand and I sat together reading and underlining a copy of Norbert Wiener's Cybernetics, a book Cage had handed to me at a dinner in New York. I still have that copy.

Several months later, the oversized catalog arrived packed in a long tube. The original Whole Earth Catalog captured the moment and defined the intellectual climate of the times. A subsequent edition, The Last Whole Earth Catalog, published in 1971, was a number-one best-seller and won Brand the National Book Award.

During the '70s, he often talked about his vision for what he called the personal computer, a term he is often credited with inventing, although he is quick to point out that Alan Kay deserves credit for its coinage. "Alan credits me for being the first to use it in print in '74 in my book Two Cybernetic Frontiers," he says. "I don't recall others using it as a term, and I didn't think I was doing a coinage, just describing the Xerox Alto in an epilogue in the book. By '75 I did use it as the name of a regular section in the CoEvolution Quarterly, well before personal computers existed."

In 1983, Brand sent Dick Farson and Darryl Iconogle of the Western Behavioral Science Institute to see me in New York about a piece of conferencing software called the Onion, which was being used on a bulletin board system called EIES (Electronic Information Exchange System) and run by Murray Turoff. When I demurred, Stewart told me I could be a player or I could choose to sit out the biggest development of the decade. I chose to sit it out.

Stewart was right and wrong. It turned out to be the biggest development of the '90s, not the '80s. Inspired by EIES, in 1984 Stewart cofounded The Well (Whole Earth 'Lectronic Link), a computer teleconferencing system for the San Francisco Bay Area, considered a bellwether of the genre.

That year, Stewart's Point Foundation received a publishing advance of $1.3 million for The Whole Earth Software Catalog, to this day a record deal for a paperback original. As a spin-off, he and Kevin Kelly organized the first Hackers Conference at Fort Cronkhite, the old army barracks north of the Golden Gate Bridge. It was in his talk at the conference that Stewart spoke his prophetic words, "information wants to be free," before a hacker audience that included Steve Wozniak, Ted Nelson, Captain Crunch (John Draper), and Richard Stallman, among others. I was also there. Stewart had convinced Doug Carlston, the founder of Broderbund, and me to put up the money to finance the event. Stewart's talk was later published in a May 1985 article in Whole Earth Review entitled "'Keep designing': How the information economy is being created and shaped by the hacker ethic."

Clearly, some of the interesting thinking about the Internet has its origins in ideas formulated by the artists of the '60s, which, wittingly or unwittingly, were carried forward by the enthusiastic young Lieutenant Brand. Considerations of form and content, context, community, and even the hacker ethic were all presaged in part by activities and discussions during that period. (Indeed, a recent German feature-length movie, "Das Netz" by Lutz Dammbeck, makes this very point and does it quite well, until it melts down by putting forth the bizarre and absurd thesis that the motivating factor behind the criminally insane murderer Ted Kaczynski, "The Unabomber," was his desire to stop the network created by Brand and me.)

In the 1990s, the Los Angeles Times Magazine published a cover story: "Always two steps ahead of others... [he] is the least recognized, most influential thinker in America." The story was about Stewart Brand. The story was absolutely correct: Stewart Brand is the most influential thinker in America.

— JB

ED. NOTE: The following is an excerpt (Chapter 2) from Fred Turner's new book, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (University of Chicago Press). Photos supplied by Stewart Brand.

FRED TURNER is an Assistant Professor and the Director of Undergraduate Studies in the Department of Communication at Stanford University. He is the author of Echoes of Combat: The Vietnam War in American Memory and From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism.

Fred Turner's Edge Bio Page



Nuclear scientist Joseph Rotblat campaigned against the atom bomb he had helped unleash. In the Rotblat Memorial Lecture, delivered recently at the Hay Literary Festival, Lord (Martin) Rees wonders whether it's time for today's cyber scientists to heed Rotblat's legacy

DIGITAL MAOISM: The Hazards of the New Online Collectivism


The hive mind is for the most part stupid and boring. Why pay attention to it?

The problem is in the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous.


In "Digital Maoism," an original essay written for Edge, computer scientist and digital visionary Jaron Lanier finds fault with what he terms the new online collectivism. He cites the Wikipedia as an example, noting that "reading a Wikipedia entry is like reading the bible closely. There are faint traces of the voices of various anonymous authors and editors, though it is impossible to be sure".

His problem is not with the unfolding experiment of the Wikipedia itself, but "the way the Wikipedia has come to be regarded and used; how it's been elevated to such importance so quickly. And that is part of the larger pattern of the appeal of a new online collectivism that is nothing less than a resurgence of the idea that the collective is all-wise, that it is desirable to have influence concentrated in a bottleneck that can channel the collective with the most verity and force. This is different from representative democracy, or meritocracy. This idea has had dreadful consequences when thrust upon us from the extreme Right or the extreme Left in various historical periods. The fact that it's now being re-introduced today by prominent technologists and futurists, people who in many cases I know and like, doesn't make it any less dangerous".

And he notes that "the Wikipedia is far from being the only online fetish site for foolish collectivism. There's a frantic race taking place online to become the most 'Meta' site, to be the highest level aggregator, subsuming the identity of all other sites".

Where is this leading? Lanier calls attention to the "so-called 'Artificial Intelligence' and the race to erase personality and be most Meta. In each case, there's a presumption that something like a distinct kin to individual human intelligence is either about to appear any minute, or has already appeared. The problem with that presumption is that people are all too willing to lower standards in order to make the purported newcomer appear smart. Just as people are willing to bend over backwards and make themselves stupid in order to make an AI interface appear smart (as happens when someone can interact with the notorious Microsoft paper clip), so are they willing to become uncritical and dim in order to make Meta-aggregator sites appear to be coherent."

Read on as Jaron Lanier throws a lit Molotov cocktail down towards Palo Alto from up in the Berkeley Hills...


THE REALITY CLUB: Responses to Lanier's essay from Douglas Rushkoff, Quentin Hardy, Yochai Benkler, Clay Shirky, Cory Doctorow, Kevin Kelly, Esther Dyson, Larry Sanger, Fernanda Viegas & Martin Wattenberg, Jimmy Wales, George Dyson, Dan Gillmor, Howard Rheingold



Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

[ED. NOTE: As part of the activities of the Long Now Foundation, Stewart Brand has organized a series of seminars which are held at Fort Mason in San Francisco. "The purpose of the series", Brand writes, "is to build a coherent, compelling body of ideas about long-term thinking, to help nudge civilization toward Long Now's goal of making long-term thinking automatic and common instead of difficult and rare."

Speakers in the series so far include a number of Edgies: Brian Eno, Jared Diamond, George Dyson, Kevin Kelly, Clay Shirky, and Bruce Sterling. All seminars are archived and freely downloadable.

The following Edge feature is based on Kevin Kelly's March 10th talk on "The Next 100 Years of Science: Long-term Trends in the Scientific Method." He's been exploring the theme on his blog, The Technium — JB ]


Science, says Kevin Kelly, is the process of changing how we know things.  It is the foundation of our culture and society.  While civilizations come and go, science grows steadily onward.  It does this by watching itself.

Recursion is the essence of science.  For example, science papers cite other science papers, and that process of research pointing at itself invokes a whole higher level, the emergent shape of citation space.  Recursion always does that.  It is the engine of scientific progress and thus of the progress of society.
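The emergent shape of citation space that Kelly points to can be made concrete with a toy sketch. The following PageRank-style power iteration over a hypothetical four-paper citation graph (the paper names, damping factor, and iteration count are illustrative choices, not from the talk) computes an importance score for each paper; the score is a property of the whole graph of research pointing at research, not of any single citation.

```python
# Toy citation graph: each paper maps to the papers it cites.
# Paper names "A".."D" are hypothetical, purely for illustration.
cites = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": [],
    "D": ["A", "C"],
}

def citation_rank(cites, damping=0.85, iters=50):
    # PageRank-style power iteration: a paper matters if it is cited
    # by papers that themselves matter, a score that only emerges at
    # the level of the whole citation graph.
    papers = list(cites)
    rank = {p: 1 / len(papers) for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / len(papers) for p in papers}
        for p, refs in cites.items():
            if refs:
                share = damping * rank[p] / len(refs)
                for q in refs:
                    new[q] += share
            else:
                # A paper citing nothing spreads its weight evenly.
                for q in papers:
                    new[q] += damping * rank[p] / len(papers)
        rank = new
    return rank

ranks = citation_rank(cites)
```

In this sketch "C", cited by all three other papers, ends up with the highest score even though it cites nothing itself: the recursive loop assigns it importance the individual citations do not contain.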

A particularly fruitful way to look at the history of science is to study how science itself has changed over time, with an eye to what that trajectory might suggest about the future.  Kelly chronicled a sequence of new recursive devices in science...

2000 BC — First text indexes
200 BC — Cataloged library (at Alexandria)
1000 AD — Collaborative encyclopedia
1590 — Controlled experiment (Francis Bacon)
1600 — Laboratory
1609 — Telescopes and microscopes
1650 — Society of experts
1665 — Repeatability (Robert Boyle)
1665 — Scholarly journals
1675 — Peer review
1687 — Hypothesis/prediction (Isaac Newton)
1920 — Falsifiability (Karl Popper)
1926 — Randomized design (Ronald Fisher)
1937 — Controlled placebo
1946 — Computer simulation
1950 — Double blind experiment
1962 — Study of scientific method (Thomas Kuhn)

Projecting forward, Kelly had five things to say about the next 100 years in science...

1)  There will be more change in the next 50 years of science than in the last 400 years.

2)  This will be a century of biology.  It is the domain with the most scientists, the most new results, the most economic value, the most ethical importance, and the most to learn.

3)  Computers will keep leading to new ways of science.  Information is growing by 66% per year while physical production grows by only 7% per year.  The data volume is growing to such levels of  "zillionics" that we can expect science to compile vast combinatorial libraries, to run combinatorial sweeps through possibility space (as Stephen Wolfram has done with cellular automata), and to run multiple competing hypotheses in a matrix.  Deep realtime simulations and hypothesis search will drive data collection in the real world.
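The kind of combinatorial sweep Kelly describes can be sketched in a few lines. The toy example below (an illustration, not from the talk) exhaustively runs all 256 elementary cellular automaton rules, the same rule space Wolfram enumerated, from a single-cell seed; the grid width, step count, and the crude "fraction of live cells" score are arbitrary choices made for the sketch.

```python
import numpy as np

def step(cells, rule):
    # One update of an elementary cellular automaton: each cell's next
    # state is bit (4*left + 2*center + right) of the 8-bit rule number.
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    idx = 4 * left + 2 * cells + right
    return (rule >> idx) & 1

def sweep(width=64, steps=32):
    # Exhaustive sweep of the whole rule space from a single live cell,
    # recording a crude score per rule: fraction of live cells at the end.
    scores = {}
    for rule in range(256):
        cells = np.zeros(width, dtype=int)
        cells[width // 2] = 1
        for _ in range(steps):
            cells = step(cells, rule)
        scores[rule] = cells.mean()
    return scores

scores = sweep()
```

Even this crude score separates the rule space: rule 0 kills everything, rule 255 fills the grid, rule 204 just copies each cell forward, and rules like 30 produce the intermediate, irregular patterns Wolfram studied.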

4)  New ways of knowing will emerge.  "Wikiscience" is leading to perpetually refined papers with a thousand authors.  Distributed instrumentation and experiment, thanks to minuscule transaction cost, will yield smart-mob, hive-mind science operating "fast, cheap, & out of control."  Negative results will have positive value (there is already a "Journal of Negative Results in Biomedicine"). Triple-blind experiments will emerge through massive non-invasive statistical data collection: no one, neither the subjects nor the experimenters, will realize an experiment was going on until later. (In the Q&A, one questioner predicted the coming of the zero-author paper, generated wholly by computers.)

5)  Science will create new levels of meaning.  The Internet already is made of one quintillion transistors, a trillion links, a million emails per second, 20 exabytes of memory.  It is approaching the level of the human brain and is doubling every year, while the brain is not.  It is all becoming effectively one machine.  And we are the machine.

"Science is the way we surprise God," said Kelly.  "That's what we're here for."  Our moral obligation is to generate possibilities, to discover the infinite ways, however complex and high-dimensional, to play the infinite game.  It will take all possible species of intelligence in order for the universe to understand itself. Science, in this way, is holy. It is a divine trip.

—Stewart Brand



I wish that you would help build a powerful new early warning system to protect our world from some of its worst nightmares.

Photo by Jonathan Brilliant

Introduction by John Brockman

In the 1970s, Larry Brilliant was one of the leaders of the successful World Health Organization smallpox eradication program. More than 500 million people died of smallpox in the 20th century. Thirty years ago, two million lives a year were still being claimed. Yet in 1980, the disease was completely eradicated from the face of the planet.

Brilliant was a highlight at this year's TED (technology, entertainment, design) Conference where he was a recipient of the 2006 TED Prize, in which the recipient makes a "wish". Among the TED attendees are executives who run world-class companies and have pledged support to help fulfill these wishes. This is in addition to each winner receiving $100,000 to be spent however they choose in support of their wishes.

The same week, during TED, Google hired Brilliant to head Google.org, the umbrella organization for Google's philanthropic activities, which is funded by 1% of the corporation's stock, or about $1 billion. In his first act as its Executive Director, Brilliant said Google will join other TED attendees to support the formation of an organization to detect early signs of emerging global health crises, such as bird flu. The name of the project: "The International Network System for Total Early Disease Detection."

Below is a link to the TED streaming video of Larry Brilliant's TED Prize acceptance speech in which he outlines his vision for a "powerful new early warning system to protect our world from some of its worst nightmares".




Why does this strike such a nerve? Because so many of us (not only authors) love books. In their combination of mortal, physical embodiment with immortal, disembodied knowledge, books are the mirror of ourselves. Books are not mere physical objects. They have a life of their own. Wholesale scanning, we fear, will strip our books of their souls. Works that were sewn together by hand, one chapter at a time, should not be unbound page by page and distributed click by click. Talk about "snippets" makes authors flinch.


The late artist James Lee Byars inspired the creation of the Reality Club, which evolved into Edge. Byars was also responsible for the motto of the club. He believed that to arrive at an axiology of societal knowledge it was pure folly to go to a Widener Library and read 6 million books. He kept only four books at a time in a box in his minimally furnished room, replacing books as he read them.

Not for him a universal library. Byars would have been bored by the intense debates swirling around New York regarding Google's recent interest in the world of books. "The Chinese make books to go on their sleeves," he said. "We don't make sleeves like that." His "books", sometimes written in fine calligraphy on the silk sleeves of his elegant plural clothing creations, were written prior to Ray Kurzweil's invention of the CCD flatbed scanner in 1975. Byars would not have had time for people who talk about "snippets".

He had another approach: "The World Question Center". In 1971 Byars planned to gather together the world's 100 most brilliant minds, lock them in a room together, and have them ask each other the questions they were asking themselves. By "interrogating reality" in this manner, he expected to achieve a synthesis of all thought.

Last month, Edge published George Dyson's piece, Turing's Cathedral, about his visit to Google on the occasion of the 60th anniversary of John von Neumann's proposal for a digital computer. The piece has echoed across the Web and continues to be a lightning rod for discussion. Given the response to his essay, I asked Dyson if he had anything to say about books, authors, and the digital age that goes beyond shopping on the Internet.




For individual scientific work, extending the computational idea, performed, published, or newly applied within the past ten years.

David Deutsch

Recipient of the 2005
$100,000 Edge of Computation Science Prize 

DAVID DEUTSCH is the founder of the field of quantum computation. Paul Benioff, Richard Feynman, and others had written about the possibility of quantum computation earlier, but Deutsch's 1985 paper on Quantum Turing Machines was the first full treatment of the subject, and the Deutsch-Jozsa algorithm is the first quantum algorithm.
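For readers curious what that first quantum algorithm actually does, here is a toy classical simulation (a sketch for illustration, not Deutsch's own presentation) of its single-qubit case, Deutsch's algorithm. It decides whether a one-bit function f is constant or balanced using only one query to the oracle, where any classical procedure needs two.

```python
import numpy as np

# Hadamard gate: puts a qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def oracle(f):
    # U_f |x, y> = |x, y XOR f(x)>, built as a 4x4 permutation matrix
    # over the basis |x, y> with index 2*x + y.
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    # Prepare |0>|1>, then H on both qubits.
    state = np.kron([1.0, 0.0], [0.0, 1.0])
    state = np.kron(H, H) @ state
    # A single oracle query; f(x) is kicked back as a phase on qubit 1.
    state = oracle(f) @ state
    # Interfere and read out the first qubit.
    state = np.kron(H, np.eye(2)) @ state
    p1 = state[2] ** 2 + state[3] ** 2  # probability first qubit is 1
    return "balanced" if p1 > 0.5 else "constant"
```

Running `deutsch` on any of the four one-bit functions classifies it correctly from a single oracle call; the Deutsch-Jozsa generalization does the same for n-bit functions, where the classical worst case needs exponentially many queries.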

When he first proposed it, quantum computation seemed practically impossible. But the last decade has seen an explosion in the construction of simple quantum computers and quantum communication systems. None of this would have taken place without Deutsch's work.

The nominating essay is reproduced in part below.

Although the general idea of a quantum computer had been proposed earlier by Richard Feynman, in 1985 David Deutsch wrote the key paper that gave the idea its full formulation and initiated the study of how to make one. Since then he has continued to be a pioneer and a leader in a rapidly growing field that is now called quantum information science.

Presently, small quantum computers are operating in laboratories around the world, and the race is on to find a scalable implementation that, if successful, will revolutionize the technologies of computation and communications. It is fair to say that no one deserves recognition for the growing success of this field more than Deutsch, for his ongoing work as well as for his founding paper. Among his key contributions in the last ten years are a paper with Ekert and Jozsa on quantum logic gates, and a proof of universality in quantum computation, with Barenco and Ekert (both in 1995).

One reason to nominate Deutsch for this prize is that he has always aimed to expand our understanding of the notion of computation in the context of the deepest questions in the foundations of mathematics and physics. Thus, his pioneering work in 1985 was motivated by interest in the Church-Turing thesis. Much of his recent work is motivated by his interest in the foundations of quantum mechanics, as we see from his 1997 book.


The main papers written by Deutsch that contained "achievement in scientific work that embodies extensions of the computational idea" were in 1985 ("Quantum theory, the Church-Turing principle, and the universal quantum computer") and 1989 ("Quantum computational networks").

His 1995 paper, "Conditional quantum dynamics and logic gates" (with A. Barenco, A. Ekert and R. Jozsa) was an important step in clarifying what sort of physical processes would be needed to implement quantum computation in the laboratory, and what sort of things the experimentalists should be trying to get to work.

"Universality in quantum computation," also written in 1995 (with A. Barenco and A. Ekert) proved the universality of almost all 2-qubit quantum gates, thus verifying his conjecture made in 1989 and showing that quantum computation and quantum gate operations are "built in" to quantum physics far more deeply than classical physics. In 1996, in "Quantum privacy amplification and the security of quantum cryptography over noisy channels" (with A. Ekert, R. Jozsa, C. Macchiavello, S. Popescu and A. Sanpera), he brought quantum cryptography a little bit closer to being practical as opposed to just a laboratory curiosity.

His recent work, as represented by the following three papers, can be seen as new "applications" of the computational idea, rather than extensions of it.

In 2000, "Information Flow in Entangled Quantum Systems" (with P. Hayden) refuted the long-held belief that quantum systems contain 'non-local' effects, by appealing to the universality of quantum computational networks and analysing information flow within them.

Also in 2000, in "Machines, Logic and Quantum Physics" (with A. Ekert and R. Lupacchini), a philosophic paper, not a scientific one, he appealed to the existence of a distinctive quantum theory of computation to argue that our knowledge of mathematics is derived from, and is subordinate to, our knowledge of physics (even though mathematical truth is independent of physics).

In 2002, he answered several long-standing questions about the multiverse interpretation of quantum theory in "The Structure of the Multiverse" — in particular, what sort of structure a "universe" is, within the multiverse. The paper does this by using the methods of the quantum theory of computation to analyse information flow in the multiverse.

His two main lines of research at the moment, qubit field theory and quantum constructor theory, may well yield important extensions of the computational idea eventually, but at the moment neither has yielded results to speak of, only promising avenues of research.

Born in Haifa, Israel, David Deutsch was educated at Cambridge and Oxford universities. After several years at the University of Texas at Austin, he returned to Oxford, where he now lives and works. Since 1999, he has been a non-stipendiary Visiting Professor of Physics at the University of Oxford, where he is a member of the Centre for Quantum Computation at the Clarendon Laboratory.

In 1998 he was awarded the Institute of Physics' Paul Dirac Prize and Medal. This is the Premier Award for theoretical physics within the gift of the Council of the Institute of Physics. It is made for "outstanding contributions to theoretical (including mathematical and computational) physics." In 2002 he received the Fourth International Award on Quantum Communication for "theoretical work on Quantum Computer Science."

He is the author of The Fabric of Reality [1997].


"Quantum Theory, the Church-Turing Principle, and the Universal Quantum Computer," Proc. R. Soc. Lond. A400, 97-117 (1985)

"Quantum Computational Networks," Proc. R. Soc. Lond. A425, 73-90 (1989)

"Conditional Quantum Dynamics and Logic Gates" (with A. Barenco, A. Ekert, and R. Jozsa), Phys. Rev. Lett. 74, 4083-6 (1995)

"Universality in Quantum Computation" (with A. Barenco and A. Ekert), Proc. R. Soc. Lond. A449, 669-77 (1995)

"Quantum Privacy Amplification and the Security of Quantum Cryptography over Noisy Channels" (with A. Ekert, R. Jozsa, C. Macchiavello, S. Popescu, and A. Sanpera), Phys. Rev. Lett. 77, 2818-21 (1996)

"Information Flow in Entangled Quantum Systems" (with P. Hayden), Proc. R. Soc. Lond. A456, 1759-74 (2000)

"Machines, Logic and Quantum Physics" (with A. Ekert and R. Lupacchini), Bulletin of Symbolic Logic 3, 3 (2000)

"The Structure of the Multiverse," Proc. R. Soc. Lond. A458 (2028), 2911-23 (2002)

David Deutsch's Edge bio page

David Deutsch's Edge bio page 



My visit to Google? Despite the whimsical furniture and other toys, I felt I was entering a 14th-century cathedral — not in the 14th century but in the 12th century, while it was being built. Everyone was busy carving one stone here and another stone there, with some invisible architect getting everything to fit. The mood was playful, yet there was a palpable reverence in the air. "We are not scanning all those books to be read by people," explained one of my hosts after my talk. "We are scanning them to be read by an AI."

When I returned to Highway 101, I found myself recollecting the words of Alan Turing, in his seminal paper "Computing Machinery and Intelligence," a founding document in the quest for true AI. "In attempting to construct such machines we should not be irreverently usurping His power of creating souls, any more than we are in the procreation of children," Turing had advised. "Rather we are, in either case, instruments of His will providing mansions for the souls that He creates."


Sorry, but the big news at the Frankfurt Book Fair this year was not about the international sales of your new book. It was about the activities of a new (second year) exhibitor: Google.

What is Google doing at the Frankfurt Book Fair? And why has a consortium of publishers filed a lawsuit against them? On the other hand, why do the "digerati" love Google Print and Google Print Library? How does Google's definition of "fair use," as it pertains to the digital domain, square with the notion that, as a writer, I own my own words? Clearly, we need to redefine "fair use" in the digital age as a "different use" with its own new set of benchmarks.

Whether we're talking about John Cage's idea of "the mind we all share" or H.G. Wells's "World Brain", Google has its act together and is at the precipice of astonishing changes in human communication... and ultimately, in our sense of who or what we are. And like nearly all science-driven technological developments, governments can only play catch-up: no one is going to get to vote on Google's changes, and the current laws, written in a pre-digital age, don't address the new situation.

Some sincerely believe we are entering a golden age of wonder and Google is leading the way. And I am pleased to add from personal experience that the leading players, Eric Schmidt, Sergey Brin and Larry Page, are fine individuals: very serious, highly intelligent, principled. They don't come any better. Still, others believe there are reasons for legitimate fear of a (very near) future world in which the world's knowledge is privatized by one corporation. This could be a problem, a very big problem.

George Dyson visited Google last week at the invitation of some Google engineers. The occasion was the 60th anniversary of John von Neumann's proposal for a digital computer. After the visit, Dyson recalled H.G. Wells' prophecy, written in 1938:

"The whole human memory can be, and probably in a short time will be, made accessible to every individual," wrote H. G. Wells in his 1938 prophecy World Brain. "This new all-human cerebrum need not be concentrated in any one single place. It can be reproduced exactly and fully, in Peru, China, Iceland, Central Africa, or wherever else seems to afford an insurance against danger and interruption. It can have at once, the concentration of a craniate animal and the diffused vitality of an amoeba." Wells foresaw not only the distributed intelligence of the World Wide Web, but the inevitability that this intelligence would coalesce, and that power, as well as knowledge, would fall under its domain. "In a universal organization and clarification of knowledge and ideas... in the evocation, that is, of what I have here called a World Brain... in that and in that alone, it is maintained, is there any clear hope of a really Competent Receiver for world affairs... We do not want dictators, we do not want oligarchic parties or class rule, we want a widespread world intelligence conscious of itself."




By John Brockman

Part of Danny Hillis's charm is his childlike curiosity and demeanor. The first time we talked was on the telephone one Sunday morning in 1988 when he was at his home in Cambridge. We got into a serious discussion about the relationship of physics to computation. "This is interesting," he said. "I'd like to come to New York and continue the conversation face-to-face." Three hours later, my doorbell rang, and there stood a young man, looking like a clean-cut hippie. He had long hair, wore a plain white T-shirt and jeans, and carried nothing. He lived up to his reputation as the "boy wonder". We talked for hours.

Danny's energies at that time were concentrated on getting processors to work together so that computation takes place with communicating processors, as happens with the Internet. The Net's potential to become an organism of intelligent agents interacting with each other, with an intelligence of its own that goes beyond the intelligence of the individual agents, fired Danny up. "In a sense," he said, "the Net can become smarter than any of the individual people on the Net or sites on the Net. Parallel processing is the way that kind of emergent phenomenon can happen. The Net right now is only a glimmer of that."

Danny described the Internet of that time simply as a huge document that is stored in a lot of different places and that can be modified by many people at once, but essentially a document in the old sense of the word. In principle, the Internet could be done on paper, but the logistics are much better handled with the computer. "I am interested in the step beyond that," he says, "where what is going on is not just a passive document, but an active computation, where people are using the Net to think of new things that they couldn't think of as individuals, where the Net thinks of new things that the individuals on the Net couldn't think of."

"In the long run, the Internet will arrive at a much richer infrastructure, in which ideas can potentially evolve outside of human minds. You can imagine something happening on the Internet along evolutionary lines, as in the simulations I run on my parallel computers. It already happens in trivial ways, with viruses, but that's just the beginning. I can imagine nontrivial forms of organization evolving on the Internet. Ideas could evolve on the Internet that are much too complicated to hold in any human mind."

The passages quoted above are from my book Digerati, published in 1995, a time when the ideas set forth by Danny were technologically implausible.

In 2000, still on the same track, Danny wrote the prescient paper "Aristotle," in which he proposed "The Knowledge Web," again at a time when the technological possibilities did not equal the vision.

But now in 2004, Danny is a grown-up wonder, and we are in the age of Google, of boy wonders Sergey Brin and Larry Page, and the rapidly expanding potential of the Internet as a knowledge web.

In this regard, thanks to funding from the Markle Foundation, Danny has been able to assemble a group of people to begin to discuss the implementation of a medical application based on his ideas. Other possibilities for applications are open-ended. What has changed in the past nine years is that the implementation of his ideas is now technologically feasible.

Rather than limit exploration of this set of ideas to a small group of thinkers, Danny has issued an invitation to the participants of Edge—itself an example of a knowledge web, and a very sophisticated one at that—to enter into a discussion on Aristotle.


W. DANIEL (Danny) HILLIS, currently Chairman and Chief Technology Officer of Applied Minds, Inc., is best known for his innovative work in the design and implementation of massively parallel supercomputers. Applied Minds is a research and development company creating a range of new products and services in software, entertainment, electronics, biotechnology, and mechanical design. He is the author of The Pattern On The Stone: The Simple Ideas That Make Computers Work.

Danny Hillis's Edge Bio Page

THE REALITY CLUB: Responses by Douglas Rushkoff, Marc D. Hauser, Stewart Brand, Jim O'Donnell, Jaron Lanier, Bruce Sterling, Roger Schank, George Dyson, Howard Gardner, Seymour Papert, Freeman Dyson, Esther Dyson, Kai Krause, Pamela McCorduck



I've had a suspicion for a while that despite the astonishing success of the first generation of computer scientists like Shannon, Turing, von Neumann, and Wiener, somehow they didn't get a few important starting points quite right, and some things in the foundations of computer science are fundamentally askew. 


In September 2000, Jaron Lanier, a pioneer in virtual reality, musician, and the lead scientist for the National Tele-Immersion Initiative, weighed in on Edge against "cybernetic totalism." "For the last twenty years," he wrote in his "Half a Manifesto" (Edge #74), "I have found myself on the inside of a revolution, but on the outside of its resplendent dogma. Now that the revolution has not only hit the mainstream, but bludgeoned it into submission by taking over the economy, it's probably time for me to cry out my dissent more loudly than I have before." In his manifesto, he took on those "who seem to not have been educated in the tradition of scientific skepticism. I understand why they are intoxicated. There is a compelling simple logic behind their thinking and elegance in thought is infectious."

"There is a real chance," he continued, "that evolutionary psychology, artificial intelligence, Moore's Law fetishizing, and the rest of the package, will catch on in a big way, as big as Freud or Marx did in their times. Or bigger, since these ideas might end up essentially built into the software that runs our society and our lives. If that happens, the ideology of cybernetic totalist intellectuals will be amplified from novelty into a force that could cause suffering for millions of people." "Half a Manifesto" caused a stir, was one of Edge's most popular features, and has been widely reprinted.

Lately, Lanier has been looking at trends in software, and he doesn't like what he sees, namely "a macabre parody of Moore's Law." In this feature, which began as a discussion at a downtown New York restaurant last year, he continues his challenge to the ideas of philosopher Daniel C. Dennett, and raises the ante by taking issue with the seminal work in information theory and computer science of Claude Shannon, Alan Turing, John von Neumann, and Norbert Wiener.


JARON LANIER, a computer scientist and musician, is a pioneer of virtual reality, and founder and former CEO of VPL. He is currently the lead scientist for the National Tele-Immersion Initiative, and visiting scientist, SGI.

Jaron Lanier's Edge Bio Page

