Edge 278—March 25, 2009
(6,000 words)



40th Anniversary Edition

Our Two Cultures
By Peter Dizikes



Steve Jones


Nicholas Carr, Martin Wattenberg & Fernanda Viégas


Leaping Into The Grand Unknown
By Freeman Dyson

Mirroring The World
Sruthi Krishnan

How To Prevent The Next Pandemic
By Nathan Wolfe

Who Protects The Internet?
By James Geary

Let It Die
By Douglas Rushkoff

Cosmic Visions

The Fertilized Egg Is Not A Human Life
By PZ Myers

The Big Bang Theory
Doctor George Smoot

Two Avant-Gardists Join Artistic Forces
By Larry Blumenfeld

Paul Steinhardt + Peter Galison
Seed Salon

Building The 21st-Century Mind
Howard Gardner Interviewed by Jonah Lehrer

Daniel Dennett Discusses The Problem Of Robotic Warfare

40th Anniversary Edition

By The Late John Brockman

By The Late John Brockman, the first volume of my trilogy, was published in 1969. The book was informed by my experiences in New York's avant-garde art world. This context is essential to understanding the endeavor.

During that period, I produced the Expanded Cinema Festival (New Cinema Festival I) at the Film-Makers' Cinematheque (1965) and the special projects of the New York Film Festival at Lincoln Center (1966); I was named "Man of the Year" (1966) by the Institute of Contemporary Art (ICA) in Philadelphia; and I was behind numerous projects in contemporary culture, including Murray the K's World, the first multimedia discotheque (a Life cover story); the movie Head; and "Intermedia '68," a series of a dozen performance pieces staged at venues such as MoMA, the Brooklyn Academy of Music, and the Albright-Knox Art Gallery in Buffalo.

These activities led to an invitation in 1965 from leading Harvard and MIT scientists in biophysics, computation, and cybernetics to bring a group of New York artists, film-makers, and musicians to Cambridge. The result, possibly the first art-science symposium, was a watershed event and led to a lifelong exploration of the ideas under discussion. The 2003 feature-length German movie Das Netz argued that the interface between the vision of the cybernetic pioneers and the aesthetics of the New York art world is the key element in understanding what has become Internet culture.

During that period the artists were reading, and talking about, science, and finding ways to render scientific ideas visible in their work. One night at dinner, John Cage handed me a copy of Cybernetics by Norbert Wiener and said, "This is for you." Robert Rauschenberg encouraged me to read about physics, recommending The Mysterious Universe by Sir James Jeans and One, Two, Three... Infinity by George Gamow. Nam June Paik's video art was an example of the cybernetic idea in action. From Warhol's movies "Sleep" and "Empire" I learned about the perception of time. The work of musicians such as La Monte Young and Marian Zazeela of the Theatre of Eternal Music, and Terry Riley, left deep impressions about acoustical space. And collaborations with the conceptual artist James Lee Byars gave me an appreciation of the interrogative and enhanced a mutual interest in "Einstein, Gertrude Stein, Wittgenstein, and Frankenstein".

I became the first "McLuhanesque" consultant, and worked in industry (General Electric, Scott Paper), the military (The Joint Chiefs of Staff) and government (The White House).

Publication of By The Late John Brockman was preceded by a performance piece of the work at The Poetry Center at the 92nd Street Y in 1968. I had been invited to participate in a series of six avant-garde evenings that also included evenings featuring John Cage and Jorge Luis Borges. The book, published in 1969, was printed on only one side of the page.

The second volume of the trilogy, 37 (Holt, Rinehart and Winston), was published in 1971. I wrote a third work, Afterwords, included as Part III of a collected works edition published in paperback in 1973 (Anchor) along with the first two works (Parts I and II). This edition was on the long list (ten books) for the National Book Award that year. After Brockman, a volume of essays by contemporaries exploring the challenges posed by my work, was also published in 1973, the year I stopped writing.

This online facsimile edition of the trilogy, published under the original title, marks the 40th anniversary of the 1969 publication of By The Late John Brockman.

John Brockman
New York City, 2009

"There are certain writers whose thought is so important that it doesn't matter whether you agree with them or not. A verbal tension so powerful, an ascetic appetite so huge and consuming forces us both to accept the vision as a revelation and to resist it as a duty. By The Late John Brockman deserves to be read and experienced as few books do in these times of informational overload.

"For John Brockman is the kind of writer you both agree with and don't agree with at all. Either way you must pay a profound attention to what he says in this remarkable book. In short, sharp strokes of words, he breaks through the very forest of meaning by denying meaning, eschewing traditional forms of activities, thoughts and emotions. It is not what he says that is so valuable; it is his whole manner of negating what can be said. His words backtrack on themselves, stalk their own meanings, and thrash about in the underbrush of our sensibilities. There is a total devastation of language, isolating and withering the very hands our dreams are made of." — San Francisco Review of Books (Cover Story) (1969)

"John Brockman’s trilogy is not as incomprehensible as it might initially seem; indeed they are at base quite simple. The first takes information theory — the mathematical theory of communications — as a model for regarding all human experience. The second is a print portrait of Heisenberg’s theory of indeterminacy. The third investigates the limits of words as tools for understanding.

"What distinguishes this trilogy is not their informing hypotheses, which are familiar to various degrees, but the author’s unfettered exploration of their implications. I also admire enormously their style and structure, as well as their remarkable capacity to implant themselves in the reader’s mind." — Richard Kostelanetz, Author, Conversing with Cage, in After Brockman (1973)

"The most important book since Wittgenstein's Tractatus." — (The late) Alan Watts, philosopher, author of The Way of Zen (1969)

"A remarkable achievement....all who are concerned about the violence committed in the name of language will appreciate the useful uselessness of Brockman's un-book." — (The late) Heinz von Foerster, former Chairman, American Society for Cybernetics; Editor, The Cybernetics of Cybernetics, in After Brockman (1973)

"A terrifying book...depressing...as cerebral as it is icy." — Vogue (1969)

"A unique living fishnet which captures important ideas... there are flashes of cosmic humor, dispassionate critiques, important operations of the mind, and a super head trip." — (The late) John C. Lilly, M.D., Author, Mind of the Dolphin, in After Brockman (1973)

"Part of John Brockman's radical and yet strangely ancient strategy is to embrace those various avenues beyond thought and language that lead directly toward illuminations of the present, toward, in effect, liberation. To occupy those spaces is to be very high indeed." — Rudolph Wurlitzer, Author, Nog (1969)

"Like a Dead Sea Scroll or long-vaulted Beatles outtake reel, By the Late John Brockman is destined to recontextualize the works of a century's greatest thinkers. First published in 1969, this radical, seminal work emerges only now, at the dawn of the 21st Century, as a remarkably prescient topology of the landscape directly ahead. This sequence of plainspoken textual fractals is at once soothing and mind-blowing, disorienting yet familiar. Herein lie the navigational keys to the ever changing map of human consciousness." — Douglas Rushkoff, author of Media Virus (1998)




March 22, 2009



By Peter Dizikes

Few literary phrases have had as enduring an afterlife as "the two cultures," coined by C. P. Snow to describe what he saw as a dangerous schism between science and literary life. Yet few people actually seem to read Snow's book bearing that title. Why bother when its main point appears so evident?

It was 50 years ago this May that Snow, an English physicist, civil servant and novelist, delivered a lecture at Cambridge called "The Two Cultures and the Scientific Revolution," which was later published in book form. Snow's famous lament was that "the intellectual life of the whole of Western society is increasingly being split into two polar groups," consisting of scientists on the one hand and literary scholars on the other. Snow largely blamed literary types for this "gulf of mutual incomprehension." These intellectuals, Snow asserted, were shamefully unembarrassed about not grasping, say, the second law of thermodynamics — even though asking if someone knows it, he writes, "is about the scientific equivalent of: Have you read a work of Shakespeare's?"...

...Snow's descriptions of the two cultures are not exactly subtle. Scientists, he asserts, have "the future in their bones," while "the traditional culture responds by wishing the future did not exist." Scientists, he adds, are morally "the soundest group of intellectuals we have," while literary ethics are more suspect. Literary culture has "temporary periods" of moral failure, he argues, quoting a scientist friend who mentions the fascist proclivities of Ezra Pound, William Butler Yeats and Wyndham Lewis, and asks, "Didn't the influence of all they represent bring Auschwitz that much nearer?" While Snow says those examples are "not to be taken as representative of all writers," the implication of his partial defense is clear.

Snow's essay provoked a roaring, ad hominem response from the Cambridge critic F. R. Leavis — who called Snow "intellectually as undistinguished as it is possible to be" — and a more measured one from Lionel Trilling, who nonetheless thought Snow had produced "a book which is mistaken in a very large way indeed." Snow's cultural tribalism, Trilling argued, impaired the "possibility of rational discourse."

Today, others believe science now addresses the human condition in ways Snow did not anticipate. For the past two decades, the editor and agent John Brockman has promoted the notion of a "third culture" to describe scientists — notably evolutionary biologists, psychologists and neuroscientists — who are "rendering visible the deeper meanings in our lives" and superseding literary artists in their ability to "shape the thoughts of their generation." Snow himself suggested in the 1960s that social scientists could form a "third culture." ...


LINKS: "The Third Culture" (1991); the 277 Edge Editions published since December 21, 1996.


Steve Jones

STEVE JONES is a biologist, Professor of Genetics at the Galton Laboratory of University College London, and a well-known television presenter. His most recent books are Coral and Darwin's Island.

Steve Jones's Edge Bio Page

This is the second in a series of Edge Videos of "table-top experiments" presented as part of the 2007 Edge/Serpentine collaboration during the Serpentine Gallery Experiment Marathon in London, curated by Hans Ulrich Obrist under the leadership of Director Julia Peyton-Jones. Edge presenters were zoologist Seirian Sumner, archeologist Timothy Taylor, evolutionary biologist Armand Leroi, psychologist Simon Baron-Cohen, geneticist Steve Jones, physicist Neil Turok, embryologist Lewis Wolpert, psychologist Steven Pinker, and playwright Marcy Kahan. The live event was featured at the Serpentine as part of the Edge/Serpentine collaboration: "What Is Your Formula? Your Equation? Your Algorithm? Formulae For the 21st Century."

Writing in Sueddeutsche Zeitung ("Short Answers To Big Questions"), Feuilleton editor Andrian Kreye noted that:

The experiment not only represents a collaboration by Brockman and Obrist of their own work; it is also a continuation of a movement that began in the '60s on America's East Coast. John Cage brought together young artists and scientists for symposia and seminars to see what would happen in the interaction of big thinkers from different fields. The resulting dialogue, which at the time seemed abstract and esoteric, can today be regarded as the forerunner of interdisciplinary science and digital culture.


"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures.

"Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter."

Mahzarin Banaji, Samuel Barondes, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Lee Smolin, Elisabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Dan Sperber, Paul Steinhardt, Steven Strogatz, Leonard Susskind, Nassim Nicholas Taleb, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Richard Wrangham, Philip Zimbardo

Continue to Edge Video


Clay Shirky does a superb job of laying out the difficulty, if not the impossibility, of extrapolating the future of the written-news business from its past or its present. The digitization of text, combined with the rise of the Internet as a cheap and capacious distribution system for digital products, has changed, profoundly, the economics of the production and the consumption of news stories. And the economic changes are continuing. Shirky's right: Everything remains in flux.

But while Shirky makes an eloquent argument that the future structure of the written-news business is unknowable, he does make at least one very big assumption about that unknowable future: that news organizations, as we have known them, will not survive. Not only is the traditional form of the newspaper doomed, but its underlying "organizational form"—a group of centrally managed professional journalists—is also doomed. To suggest that the organization may survive, says Shirky, is to perpetuate a lie.

I'm not so sure about that. In fact, I think there's a very good chance that, say, 20 years from now, much of the news we depend on will still be produced by large and fairly traditionally organized journalism companies. There will, to be sure, be many other sources of news or news-like material, produced in ways that, as Shirky suggests, have yet to develop, but history shows that a centrally managed, formally organized company of professional journalists is a pretty good and quite robust way of producing a diverse array of news, whether packaged as a paper or a web site, that people value. Any declaration of the death of that organizational form is premature.

So let me lay out a different scenario for the future of written news, one that begins with the source of the current crisis affecting the business.

The essential problem with the newspaper industry today is that it is suffering from a huge imbalance between supply and demand. What the Internet has done, beyond making the copying and distribution of articles very cheap, is this: it has broken the traditional geographical constraints on news distribution. As a result, the market has been flooded with stories, with product. Any article published by any paper is, in essence, available to any reader. Supply so far exceeds demand that the price of the news has dropped to zero. You no longer have to go to your local paper for a report on a story. Substitutes are everywhere. As I write this, Google News, on its front page, is offering me 5,850 different articles about a proposal to recoup AIG's executive bonuses by imposing a new federal tax on them. That's a hell of a lot of supply.

To put it another way, the geographical constraints on the distribution of printed news required the fragmentation of production capacity, with large groups of reporters and editors being stationed in myriad local outlets. When the geographical constraints went away, thanks to the Net and the near-zero cost of distributing digital goods anywhere in the world, all that fragmented (and redundant) capacity suddenly merged together into (in effect) a single production pool serving (in effect) a single market. Needless to say, the combined production capacity now far, far exceeds the demand of the combined market.

In this environment, you're about as likely to be able to charge for an online news story as you are to charge for air. And the overabundance of supply means, as well, an overabundance of advertising inventory. So not only can't you charge for your product, but you can't make decent ad revenues either. Hence: Bad times.

Now here's what a lot of people seem to forget: Excess production capacity goes away, particularly when that capacity consists not of capital but of people. Supply and demand, eventually and often painfully, come back into some sort of balance. Newspapers have, with good reason, been pulling their hair out over the demand side of the business, where a lot of their product has, for the time being, lost its monetary value. But the solution to their dilemma may actually lie on the production side: particularly, the radical consolidation and radical reduction of capacity. The number of U.S. newspapers is going to collapse (although we may have differently branded papers produced by the same production operation) and the number of reporters, editors, and other production side employees is going to plummet. The radical reduction in the supply of news is exactly what we're seeing today, with the bankruptcies and layoffs throughout the industry. And it will continue.

Economics tells us that as supply shrinks, market power begins to move back to the producer. The consumer, or user, no longer gets to call all the shots. Substitutes dry up, the perception of the fungibility of news stories dissipates, and differences in quality become, once again, both visible and valuable. The value of news begins, once again, to have a dollar sign beside it.

Shirky has argued, in an earlier article, that we're "in a media environment with low barriers to entry for competition." That's true of "media" broadly defined, but it's not true of the production of quality reporting. The capital requirements for an online news operation are certainly lower than for a print one, but the labor costs remain high. Reporters, editors, photographers, and other newspaper production workers are skilled professionals who require good and fair pay and benefits and, often, substantial travel allowances. It's a fantasy to believe that the production of all the kinds of news that people value, particularly hard news, can be shifted over to amateurs or journeymen working for peanuts or some newfangled journo-syndicalist communes. Certainly, amateurs and volunteers can do some of the work that used to be done by professional journalists in professional organizations. Free-floating freelancers can also do some of the work. The journo-syndicalist communes will, I suppose, be able to do some of the work. And that's all well and good. But they can't do all of the work, and they certainly can't do all of the most valuable work. The news business will remain a fundamentally commercial operation.

When you radically reduce supply in the industry, the demand picture changes radically as well. Ad inventory goes down, and ad rates go up. And things that seem unthinkable now - online subscription fees - suddenly become feasible. That will require, of course, not just a reduction of production capacity but the imposition of control on the copying of content—the establishment of a new regime of intellectual property rights. Shirky tells of the upheaval that came after the invention of the movable type press. But one important element of that story that he doesn't talk about is the creation of copyright and other intellectual-property protections that ultimately served to constrain what the new presses could print. The printing press dramatically reduced copying costs, but that didn't lead to the annihilation of writing and publishing as profit-making pursuits. Just the opposite. The reduction in copying costs, combined with protections for intellectual property, brought into being the modern publishing industry.

Once the production capacity for news "product" is rationalized for an electronic market, the ability of the remaining written-news organizations to impose controls on the distribution of their stories will grow considerably, whether those stories are printed or published exclusively in digital editions. The economics of supporting such organizations will become sustainable again. We will end up with far fewer professional news organizations and their customers will likely represent a smaller fraction of the population—quality journalism may end up being consumed and supported by an elite audience - but I'd say that it's a pretty good bet that the essential form of the organization is likely to survive.



We'd like to speak to our billionaire readers—there must be one or two of you left—about the current buyer's market for prestige. Everyone else, please skip to the next essay.

Getting your name on a top institution used to be pricey. It cost Bill Gates a few billion, with a $36 billion assist from Warren Buffett, to create a first-rate philanthropic foundation. Want your name on a top college? Good luck: even after a market crash, Harvard’s endowment is tens of billions.

But now there's a steal to be had.

Clay Shirky notes that grants and endowments will help support the journalism of the future. The price is surprisingly low. ProPublica, a nonprofit with a Pulitzer-winning staff, operates on $10 million a year (about one-sixth of the Stanford athletic department's budget). In other words, you could permanently endow a serious news organization for about $250 million.
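The jump from a $10 million annual budget to a $250 million endowment implies simple perpetuity arithmetic. A minimal sketch, assuming a conventional ~4% annual payout rate (my assumption; the essay does not state the rate):

```python
# Rough endowment sizing: how large a gift sustains a given annual
# budget in perpetuity, at an assumed payout rate.

def endowment_needed(annual_budget: float, payout_rate: float = 0.04) -> float:
    """Principal required so that payout_rate * principal covers the budget."""
    return annual_budget / payout_rate

# A ProPublica-scale newsroom: $10 million a year.
print(f"${endowment_needed(10e6):,.0f}")  # -> $250,000,000
```

At a 4% payout, a $250 million principal throws off exactly the $10 million a year the newsroom needs; a more conservative payout rate would raise the required gift proportionally.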

Your foundation—The [YOUR NAME HERE] Times—could be one of the essential voices of the 21st century. And like the first college founded in America, it would have a decent shot at becoming one of the nation’s most admired institutions.

Of course, maybe a 19-year-old whiz kid will invent cheap robot reporters. Or, in the absence of journalistic oversight, city councils might adopt the honor system. But writing a big check, dear billionaire reader, seems a lot less risky.

And if you don't save journalism, Larry Ellison might sell his yacht and have his name on the most admired institution of the future—and how annoying would that be?

April 9, 2009

By Freeman Dyson

The Lightness of Being: Mass, Ether, and the Unification of Forces by Frank Wilczek

Frank Wilczek is one of the most brilliant practitioners of particle physics. Particle physics is the science that tries to understand the smallest building blocks of earth and sky, just as biology tries to understand living creatures. Particle physics is running about two hundred years behind biology. In the eighteenth century, Carl Linnaeus started systematic biology by giving Latin names to species of plants and animals, Homo sapiens for humans and Pan troglodytes for chimpanzees. In the nineteenth century, Darwin created a unified theory for biology by explaining the origin of species. In the twentieth century, Ernest Rutherford laid the ground for particle physics by discovering that every atom has a nucleus that is vastly smaller than the atom itself, and that the nucleus is made of particles that are smaller still. In the twenty-first century, particle physicists are hoping for a new Darwin who will explain the origin of particles.

It is too soon to tell whether Wilczek will be the new Darwin. His book is not the new Origin of Species. It is more like Darwin's Voyage of the Beagle, a popular account of a voyage of exploration, describing the landscape and the newly discovered creatures that still have to be explained. Wilczek is a theoretician and not an experimenter. His strength lies in leaps of the imagination rather than in heavy hardware or heavy calculations. He shared the 2004 Nobel Prize in physics for inventing the concept that he called "Asymptotic Freedom." He writes as he thinks, with a lightness of touch that can come only to one who is absolute master of his subject. He borrowed his title from Milan Kundera, the Czech writer whose novel The Unbearable Lightness of Being takes a gloomier view of lightness. For Wilczek, the lightness of being is not only bearable but exhilarating. ...

March 22, 2009

Video becomes favoured medium with broadband growth

Sruthi Krishnan

..."Video has become a favoured means of consuming content primarily because of the growth of broadband … else it is too painful to stream and view," says N. Udhay Shankar, who founded one of India's earliest web companies and helped to kickstart the Linux movement in India.

While TED (which stands for Technology, Entertainment, Design) is the most well-known of its kind, you can listen to Salman Rushdie talk on the Enchantress of Florence at [email protected], or Brian Cox talking about the God Particle at Edge.Org. ...

March 17, 2009

An international network for monitoring the flow of viruses from animals to humans might help scientists head off global epidemics

By Nathan Wolfe

Sweat streamed down my back, thorny shrubs cut my arms, and we were losing them again. The wild chimpanzees my colleagues and I had been following for nearly five hours had stopped their grunting, hooting and screeching. Usually these calls helped us follow the animals through Uganda's Kibale Forest. For three large males to quiet abruptly surely meant trouble. Suddenly, as we approached a small clearing, we spotted them standing below a massive fig tree and looking up at a troop of red colobus monkeys eating and playing in the treetop.

The monkeys carried on with their morning meal, oblivious to the three apes below. After appearing for a moment to confer with one another, the chimps split up. While the leader crept toward the fig tree, his compatriots made their way up two neighboring trees in silence. Then, in an instant, the leader rushed up his tree screaming. Leaves showered down as the monkeys frantically tried to evade their attacker. But the chimp had calculated his bluster well: although he failed to capture a monkey himself, one of his partners grabbed a juvenile and made his way down to the forest floor with the young monkey in tow, ready to share his catch.

As the chimps feasted on the monkey's raw flesh and entrails, I thought about how this scene contained all the elements of a perfect storm for allowing microorganisms to jump from one species to the next, akin to space travelers leaping at warp speed from one galaxy to another. Any disease-causing agent present in that monkey now had the ideal conditions under which to enter a new type of host: the chimps were handling and consuming fresh organs; their hands were covered with blood, saliva and feces, all of which can carry pathogens; blood and other fluids splattered into their eyes and noses. Any sores or cuts on the hunters' bodies could provide a bug with direct entry into the bloodstream. Indeed, work conducted by my group and others has shown that hunting, by animals such as chimpanzees as well as by humans, does provide a bridge allowing viruses to jump from prey to predator. The pandemic form of HIV began in this way, by moving from monkeys into chimpanzees and, later, from chimpanzees into humans. ...

March 13, 2009

Pull up the wrong undersea cable, and the Internet goes dark in Berlin or Dubai. See our animated infographics of how the web works!

By James Geary

...The Beast is like a lunar lander on steroids. Working at depths of more than a mile, it can trundle along the seabed on caterpillar treads or, when its thrusters kick in, skim above canyons like a hovercraft, at a top speed of three knots. Rennie and his team of six control the Beast via a joystick, using its sonar, video cameras and metal detector to locate damaged cables. Plucking a cable from the ocean floor is akin to picking up a piece of thread in a blizzard while wearing a catcher's mitt. Currents can be fierce, which makes it difficult to hold the Beast steady above the cable. Visibility can be close to nil, which means that even finding the cable in the first place can be a long and frustrating process of trial and error. But according to Rennie, "gripping and cutting is the trickiest." This delicate piece of submarine surgery has to be performed quickly and cleanly, using only a murky video image as a guide.

When Rennie found the U.K.-Ireland cable--fishermen had cut it after it became entangled in a dragnet--the Beast's manipulator arm grabbed it, sliced it clean, and brought each end to the surface. On board the ship, the cable was repaired and x-rayed (Rennie needed to make sure the splice was set right, as with a broken bone), then tested and lowered to the seafloor. "There is no time for celebration when we fix a cable," Rennie says. "There is lots of pressure from cable owners to move quickly. They are losing revenue."

Most cable breaks go unnoticed by users. Maybe a YouTube clip will take someone a nanosecond longer to download, but that's about all anyone might notice when a single cable snaps. There are so many different lines connecting so many different places (a map of the network looks like the inside of a baby grand: strand after strand of cable stretching across the ocean floor like so many piano wires) that service providers can usually reroute around any break. But if several cables snap in chorus, as they did several times in the past two years, big problems result. ...

March 15, 2009

Rushkoff on the economy
By Douglas Rushkoff

... If you had spent the last decade, as I have, reviewing the way a centralized economic plan ravaged the real world over the past 500 years, you would appreciate the current financial meltdown for what it is: a comeuppance. This is the sound of the other shoe dropping; it's what happens when the chickens come home to roost; it's justice, equilibrium reasserting itself, and ultimately a good thing.

I started writing a book three years ago through which I hoped to help people see the artificial and ultimately dehumanizing landscape of corporatism on which we conduct so much of our lives. It's not just that I saw the downturn coming—it's that I feared it wouldn't come quickly or clearly enough to help us wake up from the self-destructive fantasy of an eternally expanding economic frontier. The planet, and its people, were being taxed beyond their capacity to produce. Try arguing that to a banker whose livelihood is based on perpetuating that illusion, or to people whose retirement incomes depend on just one more generation falling for the scam. It's like arguing to Brooklyn's latest crop of brownstone buyers that they've invested in real estate at the very moment the whole market is about to tank. (I did; it wasn't pretty.)

Now that the scheme we have mistaken for the real economy is collapsing under its own weight, however, it's a whole lot easier to make these arguments. And, if anything, it's even more important for us to come to grips with the fact that the system in peril is not a natural one, or even one that we should be attempting to revive and restore. The thing that is dying—the corporatized model of commerce—is not, nor has it ever been, supportive of the real economy. It wasn't meant to be. And before we start lamenting its demise or, worse, throwing good money after bad to resuscitate it, we had better understand what it was for, how it nearly sucked us all dry, and why we should put it out of our misery. ...

March 3, 2009


With the International Year of Astronomy now in full swing, leading figures from the world of astronomy reveal what they think are the biggest challenges for the subject...

Martin Rees is at the University of Cambridge in the UK and holds the title of Astronomer Royal

A quarter of a century ago, plans for the Hubble Space Telescope and the Keck telescopes were well advanced, and both these instruments are still doing great science. If we look 25 years ahead, the projects that are now at the concept or planning stage will be the major instruments then. The timescale is very long — depressingly so. However, we can be optimistic that rapid advances in computer power will allow realistic modelling of how galaxies, stars and planets formed. Simulations in "virtual universes" will play an ever larger role in our subject.

I would highlight three main challenges for astronomy. The first concerns black holes. These are now recognized as the engines for active galactic nuclei. But we still do not know if they obey the "Kerr metric", which describes the geometry of space–time around a massive rotating body, although I would be astonished if they did not, given the vindication of general relativity. I am hopeful for better probes (and simulations) of flow patterns and magnetic effects in the innermost regions of active galaxies, plus the direct detection of gravitational waves from coalescing black holes. ...

March 17, 2009

By PZ Myers

...Now this person wants a specific quote from a biology text that has the words "human life does not begin at conception" in it. That would be tough, because it's a sentence that rather boggles the brain of any developmental biologist — we also tend not to write sentences like, "human beings are not flies". We kind of expect that anyone intelligent enough to read the textbook doesn't need their hand held in superfluous explications of the bleedin' obvious. But you will find us saying simple things like that in email and conversations and even popular lectures to lay people…such as this talk by Lewis Wolpert.

Wolpert is, of course, one of the best known developmental biologists on the planet. He is also the author of a very good introductory text in developmental biology (Principles of Development), one that I use in my classes at UMM, and in this lecture (which you really should watch and listen to in its entirety; it's very good), he does come right out and say the bleedin' obvious.

What I'm concerned with is how you develop. I know that you all think about it perpetually that you come from one single cell of a fertilized egg. I don't want to get involved in religion but that is not a human being. I've spoken to these eggs many times and they make it quite clear ... they are not a human being.

There, that should help. When you go reaching for an authority in development, a professor at a small liberal arts college isn't the sine qua non of the field (well, unless maybe you're talking about Scott Gilbert…), but you really can't pull rank higher than Lewis Wolpert.

March, 2009

Doctor George Smoot

Go behind the scenes with Nobel laureate George Smoot on the set of The Big Bang Theory.

March 19, 2009


By Larry Blumenfeld

Two guys meet on a corner in Manhattan's East Village. First guy says to the second: "Why don't you write me an opera?" Second guy shrugs and says "OK."

"It was that simple," said director Richard Foreman of the spark that led to "Astronome: A Night at the Opera," his collaborative work with composer John Zorn that runs through April 5 at Mr. Foreman's Ontological-Hysteric Theater. Seated beside him in Mr. Foreman's book-lined loft, a week before the premiere, Mr. Zorn smiled. "I love that story," he said, "because it's at the heart of downtown, as far as I'm concerned. Things happen by chance all the time, between friends." ...

...Mr. Foreman created a vivid environment for "Astronome": Walls are littered with Hebrew letters, the floor with Tarot cards. Veiled women stare intently. A man with a painted-green face, a black-feathered headdress, and a keenly expressive tongue occasionally tightens a microphone cord around his throat. Bright red strawberries appear and are quickly devoured. Mr. Foreman's offstage voice chimes in here and there: "There lives within me an avenging angel named not"; "It is easy to choose the negative path to avoid things that are painful."

Prior to the production, a woman's voice warns of music "as loud as a rock concert," then offers instructions on the proper use of previously distributed earplugs. "But of course," she adds, "you may prefer to experience the full aural effect the composer intended." The crunching chords, wailing screams, and densely layered sound of Mr. Zorn's score are not all that loud. Yet the effect is decidedly full. The music's intensity belies a slowly unfolding structure that, at points, veers into gently ruminative playing and even pleasing melody.

Mr. Foreman has challenged audiences with audacious spectacles that are equal parts existential dread, cunning wit and avant-garde ingenuity ever since founding his theater in 1968. Mr. Zorn's score ups his ante, yielding a production that stuns with its intricacy, rigor and long-lasting psychic effect. ...

March 22, 2009


Paul Steinhardt's "cyclic model," a radical alternative to the big bang and inflationary cosmology, proposes that the universe's evolution is periodic and that key events shaping its structure occurred before the bang. Peter Galison studies historic fundamental shifts in physics and what types of evidence count as truth. Having first met during their graduate-school days at Harvard, they were quick to accept Seed's invitation to consider: Where is the line between physics and metaphysics? Is infinity unscientific? What is it, ultimately, that we want from science?

March 17, 2009



A professor of cognition and education reveals the five minds you need for success, how to make better decisions, and why ethics are critical.

Howard Gardner is a professor of cognition and education at the Harvard Graduate School of Education. He's also the author of over 20 books and several hundred scholarly articles. Gardner is probably best known in educational circles for his theory of multiple intelligences, which is a critique of the notion that there exists but a single human intelligence that can be assessed by standard psychometric instruments. His most recent book, Five Minds for the Future, offers some advice for policy-makers on how to do a better job of preparing students for the 21st century. Mind Matters editor Jonah Lehrer chats with Gardner about his new book, the possibility of teaching ethics and how his concept of multiple intelligences has changed over time. ...

March, 2009


Edited by John Brockman
With An Introduction By BRIAN ENO

"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now."
San Francisco Chronicle

"A great event in the Anglo-Saxon culture."
El Mundo

Contributors include: STEVEN PINKER on the future of human evolution • RICHARD DAWKINS on the mysteries of courtship • SAM HARRIS on why Mother Nature is not our friend • NASSIM NICHOLAS TALEB on the irrelevance of probability • ALUN ANDERSON on the reality of global warming • ALAN ALDA considers, reconsiders, and re-reconsiders God • LISA RANDALL on the secrets of the Sun • RAY KURZWEIL on the possibility of extraterrestrial life • BRIAN ENO on what it means to be a "revolutionary" • HELEN FISHER on love, fidelity, and the viability of marriage…and many others.

Praise for the online publication of
What Have You Changed Your Mind About?

"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent

"As fascinating and weighty as one would imagine." The Independent

"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian

"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times

"Provocative ideas put forward today by leading figures." The Telegraph

"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer

"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail

"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star

"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online

Today's Leading Thinkers on Why Things Are Good and Getting Better
Edited by John Brockman
Introduction by DANIEL C. DENNETT


"The optimistic visions seem not just wonderful but plausible." Wall Street Journal

"Persuasively upbeat." O, The Oprah Magazine

"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed

"Uplifting...an enthralling book." The Mail on Sunday

Today's Leading Thinkers on the Unthinkable
Edited by John Brockman
Introduction by STEVEN PINKER


"Danger – brilliant minds at work...A brilliant book: exhilarating, hilarious, and chilling." The Evening Standard (London)

"A selection of the most explosive ideas of our age." Sunday Herald

"Provocative" The Independent

"Challenging notions put forward by some of the world's sharpest minds" Sunday Times

"A titillating compilation" The Guardian

"Reads like an intriguing dinner party conversation among great minds in science" Discover

Today's Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Introduction by IAN MCEWAN


"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times

"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism (its opposite), it serves a function in science that is playful as well as thought-provoking." The Times

"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer

"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book to be dog-eared and debated." Seed

"Scientific pipedreams at their very best." The Guardian

"Makes for some astounding reading." Boston Globe

"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4

"Intellectual and creative magnificence" The Skeptical Inquirer
















Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.