"Nobody knows, you can't find out, and you don't have to ask permission."

Edge 277—March 17, 2009
(9,000 words)


By Clay Shirky

Dan Dennett
Cute, sexy, sweet and funny — an evolutionary riddle


Seirian Sumner

Edge Video


By Andrew Higgins

By Natalie Angier

10 Ideas Changing The World Right Now

By Brian Walsh

A Conversation With Nassim Taleb

How The Bank Bonuses Let Us All Down
By Nassim Nicholas Taleb

The Looting Of America's Coffers
By David Leonhardt

Belief And The Brain's 'God Spot'
By Steve Connor

They Tried To Outsmart Wall Street
By Dennis Overbye


Exploring The Universe Of Possibilities

G.O.P. Senators Say Some Big Banks Can Be Allowed To Fail
By J. David Goodman and Brian Knowlton

Harvard Scientists' Discovery Opens Door To Synthetic Life
By John Lauerman

For Twitter C.E.O., Well-Orchestrated Accidents
By Evan Williams

Charles Darwin And Craig Venter: How Different Could Two Men Be?
By Emma Hartley

Think Big
By Brian Eno

Is Time An Illusion?
By Amanda Gefter


By Clay Shirky

CLAY SHIRKY is an adjunct professor in NYU's graduate Interactive Telecommunications Program (ITP), where he teaches courses on the interrelated effects of social and technological network topology—how our networks shape culture and vice-versa. He is the author of Here Comes Everybody.

Clay Shirky's Edge Bio page



Back in 1993, the Knight-Ridder newspaper chain began investigating piracy of Dave Barry's popular column, which was published by the Miami Herald and syndicated widely. In the course of tracking down the sources of unlicensed distribution, they found many things, including the copying of his column to alt.fan.dave_barry on usenet; a 2000-person strong mailing list also reading pirated versions; and a teenager in the Midwest who was doing some of the copying himself, because he loved Barry's work so much he wanted everybody to be able to read it.

One of the people I was hanging around with online back then was Gordy Thompson, who managed internet services at the New York Times. I remember Thompson saying something to the effect of "When a 14 year old kid can blow up your business in his spare time, not because he hates you but because he loves you, then you got a problem." I think about that conversation a lot these days.

The problem newspapers face isn't that they didn't see the internet coming. They not only saw it miles off, they figured out early on that they needed a plan to deal with it, and during the early 90s they came up with not just one plan but several. One was to partner with companies like America Online, a fast-growing subscription service that was less chaotic than the open internet. Another plan was to educate the public about the behaviors required of them by copyright law. New payment models such as micropayments were proposed. Alternatively, they could pursue the profit margins enjoyed by radio and TV, if they became purely ad-supported. Still another plan was to convince tech firms to make their hardware and software less capable of sharing, or to partner with the businesses running data networks to achieve the same goal. Then there was the nuclear option: sue copyright infringers directly, making an example of them.

As these ideas were articulated, there was intense debate about the merits of various scenarios. Would DRM or walled gardens work better? Shouldn't we try a carrot-and-stick approach, with education and prosecution? And so on. In all this conversation, there was one scenario that was widely regarded as unthinkable, a scenario that didn't get much discussion in the nation's newsrooms, for the obvious reason.

The unthinkable scenario unfolded something like this: The ability to share content wouldn't shrink, it would grow. Walled gardens would prove unpopular. Digital advertising would reduce inefficiencies, and therefore profits. Dislike of micropayments would prevent widespread use. People would resist being educated to act against their own desires. Old habits of advertisers and readers would not transfer online. Even ferocious litigation would be inadequate to constrain massive, sustained law-breaking. (Prohibition redux.) Hardware and software vendors would not regard copyright holders as allies, nor would they regard customers as enemies. DRM's requirement that the attacker be allowed to decode the content would be an insuperable flaw. And, per Thompson, suing people who love something so much they want to share it would piss them off.

Revolutions create a curious inversion of perception. In ordinary times, people who do no more than describe the world around them are seen as pragmatists, while those who imagine fabulous alternative futures are viewed as radicals. The last couple of decades haven't been ordinary, however. Inside the papers, the pragmatists were the ones simply looking out the window and noticing that the real world was increasingly resembling the unthinkable scenario. These people were treated as if they were barking mad. Meanwhile the people spinning visions of popular walled gardens and enthusiastic micropayment adoption, visions unsupported by reality, were regarded not as charlatans but saviors.

When reality is labeled unthinkable, it creates a kind of sickness in an industry. Leadership becomes faith-based, while employees who have the temerity to suggest that what seems to be happening is in fact happening are herded into Innovation Departments, where they can be ignored en masse. This shunting aside of the realists in favor of the fabulists has different effects on different industries at different times. One of the effects on the newspapers is that many of their most passionate defenders are unable, even now, to plan for a world in which the industry they knew is visibly going away.


The curious thing about the various plans hatched in the '90s is that they were, at base, all the same plan: "Here's how we're going to preserve the old forms of organization in a world of cheap perfect copies!" The details differed, but the core assumption behind all imagined outcomes (save the unthinkable one) was that the organizational form of the newspaper, as a general-purpose vehicle for publishing a variety of news and opinion, was basically sound, and only needed a digital facelift. As a result, the conversation has degenerated into the enthusiastic grasping at straws, pursued by skeptical responses.

"The Wall Street Journal has a paywall, so we can too!" (Financial information is one of the few kinds of information whose recipients don't want to share.) "Micropayments work for iTunes, so they will work for us!" (Micropayments only work where the provider can avoid competitive business models.) "The New York Times should charge for content!" (They've tried, with QPass and later TimesSelect.) "Cook's Illustrated and Consumer Reports are doing fine on subscriptions!" (Those publications forgo ad revenues; users are paying not just for content but for unimpeachability.) "We'll form a cartel!" (…and hand a competitive advantage to every ad-supported media firm in the world.)

Round and round this goes, with the people committed to saving newspapers demanding to know "If the old model is broken, what will work in its place?" To which the answer is: Nothing. Nothing will work. There is no general model for newspapers to replace the one the internet just broke.

With the old economics destroyed, organizational forms perfected for industrial production have to be replaced with structures optimized for digital data. It makes increasingly less sense even to talk about a publishing industry, because the core problem publishing solves — the incredible difficulty, complexity, and expense of making something available to the public — has stopped being a problem.


Elizabeth Eisenstein's magisterial treatment of Gutenberg's invention, The Printing Press as an Agent of Change, opens with a recounting of her research into the early history of the printing press. She was able to find many descriptions of life in the early 1400s, the era before movable type. Literacy was limited, the Catholic Church was the pan-European political force, Mass was in Latin, and the average book was the Bible. She was also able to find endless descriptions of life in the late 1500s, after Gutenberg's invention had started to spread. Literacy was on the rise, as were books written in contemporary languages, Copernicus had published his epochal work on astronomy, and Martin Luther's use of the press to reform the Church was upending both religious and political stability.

What Eisenstein focused on, though, was how many historians ignored the transition from one era to the other. To describe the world before or after the spread of print was child's play; those dates were safely distanced from upheaval. But what was happening in 1500? The hard question Eisenstein's book asks is "How did we get from the world before the printing press to the world after it? What was the revolution itself like?"

Chaotic, as it turns out. The Bible was translated into local languages; was this an educational boon or the work of the devil? Erotic novels appeared, prompting the same set of questions. Copies of Aristotle and Galen circulated widely, but direct encounter with the relevant texts revealed that the two sources clashed, tarnishing faith in the Ancients. As novelty spread, old institutions seemed exhausted while new ones seemed untrustworthy; as a result, people almost literally didn't know what to think. If you can't trust Aristotle, who can you trust?

During the wrenching transition to print, experiments were only revealed in retrospect to be turning points. Aldus Manutius, the Venetian printer and publisher, invented the smaller octavo volume along with italic type. What seemed like a minor change — take a book and shrink it — was in retrospect a key innovation in the democratization of the printed word. As books became cheaper, more portable, and therefore more desirable, they expanded the market for all publishers, heightening the value of literacy still further.

That is what real revolutions are like. The old stuff gets broken faster than the new stuff is put in its place. The importance of any given experiment isn't apparent at the moment it appears; big changes stall, small changes spread. Even the revolutionaries can't predict what will happen. Agreements on all sides that core institutions must be protected are rendered meaningless by the very people doing the agreeing. (Luther and the Church both insisted, for years, that whatever else happened, no one was talking about a schism.) Ancient social bargains, once disrupted, can neither be mended nor quickly replaced, since any such bargain takes decades to solidify.

And so it is today. When someone demands to know how we are going to replace newspapers, they are really demanding to be told that we are not living through a revolution. They are demanding to be told that old systems won't break before new systems are in place. They are demanding to be told that ancient social bargains aren't in peril, that core institutions will be spared, that new methods of spreading information will improve previous practice rather than upending it. They are demanding to be lied to.

There are fewer and fewer people who can convincingly tell such a lie.


If you want to know why newspapers are in such trouble, the most salient fact is this: Printing presses are terrifically expensive to set up and to run. This bit of economics, normal since Gutenberg, limits competition while creating positive returns to scale for the press owner, a happy pair of economic effects that feed on each other. In a notional town with two perfectly balanced newspapers, one paper would eventually generate some small advantage — a breaking story, a key interview — at which point both advertisers and readers would come to prefer it, however slightly. That paper would in turn find it easier to capture the next dollar of advertising, at lower expense, than the competition. This would increase its dominance, which would further deepen those preferences, repeat chorus. The end result is either geographic or demographic segmentation among papers, or one paper holding a monopoly on the local mainstream audience.

For a long time, longer than anyone in the newspaper business has been alive in fact, print journalism has been intertwined with these economics. The expense of printing created an environment where Wal-Mart was willing to subsidize the Baghdad bureau. This wasn't because of any deep link between advertising and reporting, nor was it about any real desire on the part of Wal-Mart to have their marketing budget go to international correspondents. It was just an accident. Advertisers had little choice other than to have their money used that way, since they didn't really have any other vehicle for display ads.

The old difficulties and costs of printing forced everyone doing it into a similar set of organizational models; it was this similarity that made us regard Daily Racing Form and L'Osservatore Romano as being in the same business. That the relationship between advertisers, publishers, and journalists has been ratified by a century of cultural practice doesn't make it any less accidental.

The competition-deflecting effects of printing cost got destroyed by the internet, where everyone pays for the infrastructure, and then everyone gets to use it. And when Wal-Mart, and the local Maytag dealer, and the law firm hiring a secretary, and that kid down the block selling his bike, were all able to use that infrastructure to get out of their old relationship with the publisher, they did. They'd never really signed up to fund the Baghdad bureau anyway.


Print media does much of society's heavy journalistic lifting, from flooding the zone — covering every angle of a huge story — to the daily grind of attending the City Council meeting, just in case. This coverage creates benefits even for people who aren't newspaper readers, because the work of print journalists is used by everyone from politicians to district attorneys to talk radio hosts to bloggers. The newspaper people often note that newspapers benefit society as a whole. This is true, but irrelevant to the problem at hand; "You're gonna miss us when we're gone!" has never been much of a business model. So who covers all that news if some significant fraction of the currently employed newspaper people lose their jobs?

I don't know. Nobody knows. We're collectively living through 1500, when it's easier to see what's broken than what will replace it. The internet turns 40 this fall. Access by the general public is less than half that age. Web use, as a normal part of life for a majority of the developed world, is less than half that age. We just got here. Even the revolutionaries can't predict what will happen.

Imagine, in 1996, asking some net-savvy soul to expound on the potential of craigslist, then a year old and not yet incorporated. The answer you'd almost certainly have gotten would be extrapolation: "Mailing lists can be powerful tools", "Social effects are intertwining with digital networks", blah blah blah. What no one would have told you, could have told you, was what actually happened: craigslist became a critical piece of infrastructure. Not the idea of craigslist, or the business model, or even the software driving it. Craigslist itself spread to cover hundreds of cities and has become a part of public consciousness about what is now possible. Experiments are only revealed in retrospect to be turning points.

In craigslist's gradual shift from ‘interesting if minor' to ‘essential and transformative', there is one possible answer to the question "If the old model is broken, what will work in its place?" The answer is: Nothing will work, but everything might. Now is the time for experiments, lots and lots of experiments, each of which will seem as minor at launch as craigslist did, as Wikipedia did, as octavo volumes did.

Journalism has always been subsidized. Sometimes it's been Wal-Mart and the kid with the bike. Sometimes it's been Richard Mellon Scaife. Increasingly, it's you and me, donating our time. The list of models that are obviously working today, like Consumer Reports and NPR, like ProPublica and WikiLeaks, can't be expanded to cover any general case, but then nothing is going to cover the general case.

Society doesn't need newspapers. What we need is journalism. For a century, the imperatives to strengthen journalism and to strengthen newspapers have been so tightly wound as to be indistinguishable. That's been a fine accident to have, but when that accident stops, as it is stopping before our eyes, we're going to need lots of other ways to strengthen journalism instead.

When we shift our attention from ‘save newspapers' to ‘save society', the imperative changes from ‘preserve the current institutions' to ‘do whatever works.' And what works today isn't the same as what used to work.

We don't know who the Aldus Manutius of the current age is. It could be Craig Newmark, or Caterina Fake. It could be Martin Nisenholtz, or Emily Bell. It could be some 19 year old kid few of us have heard of, working on something we won't recognize as vital until a decade hence. Any experiment, though, designed to provide new models for journalism is going to be an improvement over hiding from the real, especially in a year when, for many papers, the unthinkable future is already in the past.

For the next few decades, journalism will be made up of overlapping special cases. Many of these models will rely on amateurs as researchers and writers. Many of these models will rely on sponsorship or grants or endowments instead of revenues. Many of these models will rely on excitable 14 year olds distributing the results. Many of these models will fail. No one experiment is going to replace what we are now losing with the demise of news on paper, but over time, the collection of new experiments that do work might give us the journalism we need.


You are a leaf-cutting ant from South America. You will compete against the humans across the aisle in a foraging activity. Your task is to collect as much forage as possible. There's a reason ants are so successful. They're disciplined. They follow a series of rules. The first rule is no talking. Ants can't talk so you can't talk. The second rule is no gestures, facial or otherwise. And to make sure you can't use facial expressions we're going to put a paper bag on your head. The third rule is 'Ant walking'. ...

Seirian Sumner

In this Edge Video, Seirian Sumner teaches us a lesson about the social nature of ants. She selects fifteen people in the audience at the Serpentine Gallery in London and tells them to imagine they're ants.



SEIRIAN SUMNER is a research fellow in evolutionary biology at the Institute of Zoology, Zoological Society of London. Her research focuses on the evolution of sociality—how eusociality evolves and how social behavior is maintained. She has worked with a variety of bees, wasps, and ants from around the world, studying their behavior through observation, experimental manipulation, and molecular analyses, including gene expression. She is especially interested in the origins of sociality and the role of the genome in this major evolutionary transition.

Seirian Sumner's Edge Bio Page

This is the second in a series of Edge Videos of "table-top experiments" presented as part of the 2007 Edge/Serpentine collaboration during the Serpentine Gallery Experiment Marathon in London, curated by Hans Ulrich Obrist under the leadership of Director Julia Peyton-Jones. Edge presenters were zoologist Seirian Sumner, archeologist Timothy Taylor, evolutionary biologist Armand Leroi, psychologist Simon Baron-Cohen, geneticist Steve Jones, physicist Neil Turok, embryologist Lewis Wolpert, and psychologist Steven Pinker and playwright Marcy Kahan. The live event was featured at the Serpentine as part of the Edge/Serpentine collaboration: "What Is Your Formula? Your Equation? Your Algorithm? Formulae For the 21st Century."

Writing in Sueddeutsche Zeitung ("Short Answers To Big Questions"), Feuilleton editor Andrian Kreye noted that:

The experiment not only represents a collaboration between Brockman and Obrist in their own work; it is also a continuation of a movement that began in the '60s on America's East Coast. John Cage brought together young artists and scientists for symposia and seminars to see what would happen in the interaction of big thinkers from different fields. The resulting dialogue, which at the time seemed abstract and esoteric, can today be regarded as the forerunner to interdisciplinary science and the digital culture.


Why are babies cute? Why is cake sweet? Philosopher Dan Dennett has answers you wouldn't expect, as he shares evolution's counterintuitive reasoning on cute, sweet and sexy things. For a topping, try his new theory on why jokes are funny.

DANIEL C. DENNETT is University Professor, Professor of Philosophy, and Co-Director of the Center for Cognitive Studies at Tufts University. His most recent book is Breaking the Spell.

Daniel C. Dennett's Edge Bio Page

"For those seeking substance over sheen, the occasional videos released at Edge.org hit the mark. The Edge Foundation community is a circle, mainly scientists but also other academics, entrepreneurs, and cultural figures.

"Edge's long-form interview videos are a deep-dive into the daily lives and passions of its subjects, and their passions are presented without primers or apologies. The decidedly noncommercial nature of Edge's offerings, and the egghead imprimatur of the Edge community, lend its videos a refreshing air, making one wonder if broadcast television will ever offer half the off-kilter sparkle of their salon chatter."

Mahzarin Banaji, Samuel Barondes, Paul Bloom, Rodney Brooks, Hubert Burda, George Church, Iain Couzin, Helena Cronin, Paul Davies, Daniel C. Dennett, David Deutsch, Jared Diamond, Freeman Dyson, Drew Endy, Peter Galison, Murray Gell-Mann, David Gelernter, Neil Gershenfeld, Anthony Giddens, Gerd Gigerenzer, Daniel Gilbert, Rebecca Goldstein, John Gottman, Brian Greene, Anthony Greenwald, Alan Guth, David Haig, Marc D. Hauser, Walter Isaacson, Daniel Kahneman, Stuart Kauffman, Ken Kesey, Stephen Kosslyn, Lawrence Krauss, Ray Kurzweil, Jaron Lanier, Armand Leroi, Seth Lloyd, Gary Marcus, Ernst Mayr, Marvin Minsky, Sendhil Mullainathan, Dennis Overbye, Dean Ornish, Elaine Pagels, Steven Pinker, Jordan Pollack, Lisa Randall, Martin Rees, Matt Ridley, Lee Smolin, Elisabeth Spelke, Scott Sampson, Robert Sapolsky, Dimitar Sasselov, Stephen Schneider, Martin Seligman, Robert Shapiro, Dan Sperber, Paul Steinhardt, Steven Strogatz, Leonard Susskind, Nassim Nicholas Taleb, Richard Thaler, Robert Trivers, Neil Turok, J. Craig Venter, Edward O. Wilson, Richard Wrangham, Philip Zimbardo

Continue to Edge Video

March 17, 2009

By Andrew Higgins

In Turkey, a combative Islamic creationist is seeking to bury Darwinism once and for all. He has plenty of Turkish fans, but hasn't had much success swaying scientists.

ISTANBUL -- As scientists around the world celebrate the 150th anniversary of Charles Darwin's seminal work on evolution, Adnan Oktar, a college dropout turned theorist of Islamic creationism, is working on the fifth volume of a 14-part masterwork that he says will bury Darwinism once and for all.

"Darwin and his theory are dead," says Mr. Oktar, founder and honorary president of the Science Research Foundation, an Istanbul outfit dedicated to debunking the Victorian-era English naturalist. Darwin, says his 52-year-old Turkish scourge, is "Satan's biggest trick on humanity."

Mr. Oktar, who briefly studied interior design, hasn't had much success swaying scientists with the weight of his research. "He is a complete and utter ignoramus," says Richard Dawkins, an evolutionary biologist and Oxford University professor.

The physical weight of Mr. Oktar's work, however, is considerable. Each volume of his anti-Darwin magnum opus, "Atlas of Creation," weighs more than 13 pounds. Also weighing in on his side are very aggressive lawyers. They've repeatedly gone to court in Turkey to silence critics whom Mr. Oktar accuses of spreading "lies and insults." Scores of Web sites have been banned at his behest.

These include the site of Oxford's Prof. Dawkins, which Mr. Oktar -- who writes under the pen name Harun Yahya -- got blocked last fall after it posted an article entitled "Venomous Snakes, Slippery Eels and Harun Yahya." Prof. Dawkins responded to the ban by posting a Turkish translation of the article. Mr. Oktar derides Prof. Dawkins, an outspoken atheist, as "a pagan monk."...

...His "Atlas of Creation" produces thousands of pictures of fossils of birds, snakes and other creatures side by side with what he says are their identical modern kin. Prof. Dawkins derides the exercise as "total inanity" and says Mr. Oktar confuses snakes with eels and makes other elementary blunders.

One of the pictures in the first volume of Mr. Oktar's work features what is labeled as a caddis fly. It is in fact a man-made fishing fly with a metal hook clearly visible. Mr. Oktar says this is a "little detail" and believes that "just 10 pages of my book can defeat Dawkins."

He's offered a reward of 10 million Turkish lira (around $6 million) to anyone who can produce a fossil that proves evolution. He has also invited his Oxford foe to a debate.

Prof. Dawkins says he has no intention of accepting, as that would only "give legitimacy" to "this weird phenomenon." Mr. Oktar, he says, "doesn't know anything about zoology, doesn't know anything about biology. He knows nothing about what he is attempting to refute."

March 17, 2009



By Natalie Angier

...A simple melody with a simple rhythm and repetition can be a tremendous mnemonic device. “It would be a virtually impossible task for young children to memorize a sequence of 26 separate letters if you just gave it to them as a string of information,” Dr. Thaut said. But when the alphabet is set to the tune of the ABC song with its four melodic phrases, preschoolers can learn it with ease.

And what are the most insidious jingles or sitcom themes but cunning variations on twinkle twinkle ABC?

Really great jokes, on the other hand, punch the lights out of do re mi. They work not by conforming to pattern recognition routines but by subverting them. “Jokes work because they deal with the unexpected, starting in one direction and then veering off into another,” said Robert Provine, a professor of psychology at the University of Maryland, Baltimore County, and the author of “Laughter: A Scientific Investigation.” “What makes a joke successful are the same properties that can make it difficult to remember." ...

March 23, 2009



By Brian Walsh

...But what if we could seamlessly calculate the full lifetime effect of our actions on the earth and on our bodies? Not just carbon footprints but social and biological footprints as well? What if we could think ecologically? That's what psychologist Daniel Goleman describes in his forthcoming book, Ecological Intelligence. Using a young science called industrial ecology, businesses and green activists alike are beginning to compile the environmental and biological impact of our every decision — and delivering that information to consumers in a user-friendly way. That's thinking ecologically — understanding the global environmental consequences of our local choices. "We can know the causes of what we're doing, and we can know the impact of what we're doing," says Goleman, who wrote the 1995 best seller Emotional Intelligence. "It's going to have a radical impact on the way we do business."

Over the past couple of decades, industrial ecologists have been using a method called life-cycle assessment (LCA) to break down that web of connection. The concept of the carbon footprint comes from LCA, but a deep analysis looks at far more. The manufacture and sale of a simple glass bottle requires input from dozens of suppliers; for high-tech items, it can include many times more...

Sunday, March 15, 2009



Options trader Nassim Nicholas Taleb made his name and career anticipating the powerful historic events he calls "Black Swans," which include World War I, the rise of the Internet and the stock market crash of 1987. In two books published in 2001 and 2007, he urges readers to concentrate more on what they don't know than on what they do.

More recently, Taleb has blasted bankers and economists who issued reassuring forecasts right up to the brink of the current global financial crisis. He spoke recently with Washington Post reporter Peter Whoriskey. Excerpts: ...

You're a fierce critic of the entire field of economics. Don't economists know anything?

You have close to a million people out there in economic life. How many people saw the extent of what could happen in this financial crisis? Some people said we'd have a problem of too much leverage, but very few saw the potential total impact that could come out of it. They didn't see the cascading effects that can be produced by a complex system.

Years ago, I noticed one thing about economics, and that is that economists didn't get anything right. I wanted to find out the reason. They would say their models are not perfect. But data show that you do much worse using their models than you would without them. It's a bull [expletive] science.

Can you give a specific example?

Every time I saw [Federal Reserve Chairman Ben] Bernanke [on television], I would have a fit of rage. He claimed that we were in a period of "great moderation." He did not understand that Black Swans are preceded by low volatility and the buildup of hidden risks. He mistook absence of volatility for the absence of risk. It was like someone sitting on dynamite and saying "It's okay, we're safe because nothing has happened."

In a complex system, things that are fragile should be allowed to fail very fast. [Former Fed Chairman Alan] Greenspan and Bernanke let something fragile, like the banks, survive very long. The longer it takes to break, the worse the outcome.

That's why I think Obama needs to start with a new economic team -- Treasury Secretary Tim Geithner and Lawrence Summers were among those who didn't see this coming in the first place. He needs new people who understand complex systems. ...

February 24, 2009


By Nassim Nicholas Taleb

One of the arguments one hears in the compensation debate is that the bonus system used by Wall Street – as John Thain, former Merrill Lynch chief executive, put it – is there to “reward talent”. While I find this notion of “talent” debatable, I fully agree that incentives are the heart of capitalism and free markets – but certainly not that incentive scheme.

In fact, the incentive scheme commonly in place does the exact opposite of what an "incentive" system should be about: it encourages a certain class of risk-hiding and deferred blow-up. It is the reason banks have never made money in the history of banking, losing the equivalent of all their past profits periodically – while bankers strike it rich. Furthermore, it is that incentive scheme that got us in the current mess. ...

March 10, 2009



By David Leonhardt

Sixteen years ago, two economists published a research paper with a delightfully simple title: "Looting."

The economists were George Akerlof, who would later win a Nobel Prize, and Paul Romer, the renowned expert on economic growth. In the paper, they argued that several financial crises in the 1980s, like the Texas real estate bust, had been the result of private investors taking advantage of the government. The investors had borrowed huge amounts of money, made big profits when times were good and then left the government holding the bag for their eventual (and predictable) losses.

In a word, the investors looted. Someone trying to make an honest profit, Professors Akerlof and Romer said, would have operated in a completely different manner. The investors displayed a "total disregard for even the most basic principles of lending," failing to verify standard information about their borrowers or, in some cases, even to ask for that information.

The investors "acted as if future losses were somebody else's problem," the economists wrote. "They were right."

March 10, 2009


Scientists say they have located the parts of the brain that control religious faith. And the research proves, they contend, that belief in a higher power is an evolutionary asset that helps human survival. Steve Connor reports

...The search for the God spot has in the past led scientists to many different regions of the brain. An early contender was the brain's temporal lobe, a large section of the brain that sits over each ear, because temporal-lobe epileptics suffering seizures in these regions frequently report having intense religious experiences. One of the principal exponents of this idea was Vilayanur Ramachandran, from the University of California, San Diego, who asked several of his patients with temporal-lobe epilepsy to listen to a mixture of religious, sexual and neutral words while measuring their levels of arousal and emotional reactions. Religious words elicited an unusually high response in these patients. ...

March 10, 2009



NEW THEORIES After spending 20 years in the study of physics, Emanuel Derman applied his thinking to stock options

By Dennis Overbye

Emanuel Derman expected to feel a letdown when he left particle physics for a job on Wall Street in 1985.

After all, for almost 20 years, as a graduate student at Columbia and a postdoctoral fellow at institutions like Oxford and the University of Colorado, he had been a spear carrier in the quest to unify the forces of nature and establish the elusive and Einsteinian "theory of everything," hobnobbing with Nobel laureates and other distinguished thinkers. How could managing money compare?

But the letdown never happened. Instead he fell in love with a corner of finance that dealt with stock options.

"Options theory is kind of deep in some way. It was very elegant; it had the quality of physics," Dr. Derman explained recently with a tinge of wistfulness, sitting in his office at Columbia, where he is now a professor of finance and a risk management consultant with Prisma Capital Partners.

Dr. Derman, who spent 17 years at Goldman Sachs and became a managing director, was a forerunner of the many physicists and other scientists who have flooded Wall Street in recent years, moving from a world in which a discrepancy of a few percentage points in a measurement can mean a Nobel Prize or unending mockery to a world in which a few percent one way can land you in jail and a few percent the other way can win you your own private Caribbean island.

They are known as "quants" because they do quantitative finance. Seduced by a vision of mathematical elegance underlying some of the messiest of human activities, they apply skills they once hoped to use to untangle string theory or the nervous system to making money. ...

[plus Mike Brown, Lee Smolin, J. Doyne Farmer, Nassim Taleb, Benoit Mandelbrot, Eric R. Weinstein, and edge.org]

March 9, 2009


Brian Greene, Seth Lloyd, Alan Guth, Andrei Linde, Max Tegmark, Frank Wilczek, Anton Zeilinger, Wojciech Zurek at the Grand Cayman Island meeting of fq(x).

March 8, 2009


By J. David Goodman and Brian Knowlton

The Treasury Department made its own news on Sunday, filling three of the top positions under Secretary Timothy F. Geithner. Alan B. Krueger was named as assistant secretary for economic policy; David S. Cohen was chosen as assistant secretary in the office of terrorism and financial intelligence, and Kim N. Wallace was named assistant secretary for legislative affairs. The announcements addressed growing concerns that even as the Treasury Department had worked furiously to craft bank bailouts over its first six weeks, it had still left key positions unfilled.

Mr. Krueger, a longtime Princeton economics professor, was chief economist at the Labor Department under President Clinton. Mr. Krueger wrote a column on economics for The New York Times from 2000 to 2006 and is a contributor to Economix, a Times blog.

March 7, 2009


By John Lauerman

Harvard University scientists are a step closer to creating synthetic forms of life, part of a drive to design man-made organisms that may one day be used to help produce new fuels and create biotechnology drugs.

Researchers led by George Church, whose findings helped spur the U.S. human genome project in the 1980s, have copied the part of a living cell that makes proteins, the building blocks of life. The finding overcomes a major roadblock in making synthetic self-replicating organisms, Church said today in a lecture at Harvard in Cambridge, Massachusetts.

The technology can be used to program cells to make virtually any protein, even some that don't exist in nature, the scientists said. That may allow production of helpful new drugs, chemicals and organisms, including living bacteria. It also opens the door to ethical concerns about creation of processes that may be uncontrollable by life's natural defenses.

"It's the key component to making synthetic life," Church said yesterday in a telephone call with reporters. "We haven't made synthetic life and it's not our primary goal, but this is a huge milestone in that direction."

The work may be immediately helpful to companies such as Synthetic Genomics Inc., headed by J. Craig Venter, trying to make new organisms that perform specific tasks, such as converting buried coal into methane gas that's easier to extract from the ground.

Microbes for Coal

Venter's plan is to create man-made microbes that can help break down the coal in the earth, much as bacteria speed the decomposition of plant material.

In a conference for alumni today at Harvard, Church described how his team assembled a reconstituted ribosome, the first artificial version of the structure capable of remaking itself. ...

March 8, 2009



By Evan Williams

...In 1997, I moved to California and worked at what is now O'Reilly Media. By 1998, I had acquired enough technical skills to do freelance Web development. In 1999, I started Pyra Labs with a friend, Meg Hourihan, to develop project management programs. Then we started a side project called Blogger, a Web publishing tool. In 2003, we sold that company to Google. I worked for Google for two years.

Several years ago I started Odeo, a podcasting company, with Noah Glass, another friend. I ran that company for 18 months. We started Twitter as a side project within Odeo during that time.

I didn't like the direction Odeo was going. For one thing, Apple made a lot of what we worked on obsolete when it introduced podcasts into iTunes. I bought Odeo back from the investors and moved the assets to another company of mine, Obvious, a Web product development lab now on hiatus. In 2007, I sold Odeo and spun off Twitter into a separate company.

I appointed Jack Dorsey, who was an engineer at Odeo, as C.E.O. of Twitter. In October 2008 it became apparent that Twitter required a day-to-day approach from a single leader. I took over as C.E.O., and Jack became chairman and assumed a more strategic position. He had worked in the courier and dispatch field, which is where he got the idea for Twitter — a social network for sending short messages to friends over cellphones and the Internet. ...

March 2, 2009


By Emma Hartley

A short break from work last week found me pottering around the Natural History Museum, enjoying the Darwin exhibition, thinking about the way science has changed in the last century and a half.

On the Tube to South Kensington I'd finished reading the autobiography of Craig Venter, the man who was the first to decode the human genome and, on putting that back in my bag, I'd picked up a discarded copy of Metro (a London freesheet) to find a story about how a "master gene" had been discovered that could turn off cancer. How Darwin would have enjoyed the way that biology has developed. And yet...

Venter's autobiography - A Life Decoded: My Genome, My Life, sounding like a spoof by Ben Stiller - is a magnificently readable piece of self-aggrandizement by a man who is clearly a great scientist, but who also appears to be driven by an almost unsupportable egomania. Nearly every other scientist mentioned becomes the object of animosity: if they weren't with him, you feel, they were the enemy. The tale of the genome decoding itself consists of Venter hopping from benefactor to benefactor, each enabling him to achieve the promise of a genome more quickly decoded than the last. His haste to reach his goal was the root of other scientists' irritation, but the result will be of almost unimaginable value to our quality of life as a species. And in medical research the difference a few months makes can be measured in lives lost. ...

...Unsure about how serious this objection to Venter's methods is, I looked at some online material which seemed very partial against him. And then I rang my colleague Professor Steve Jones, who had the following to say: "He's an obnoxious, loud-mouthed American prat. But in the end his view was quite justified. The residual hysteria around him seems to be dying down."

How different, though, to Darwin's approach. Realising that his conclusions about man's origins were political dynamite, Darwin did not publish for twenty years. In the end he was propelled into publication by the knowledge that another scientist - Alfred Russel Wallace - had arrived at the same conclusion and would publish first, unless Darwin were quick. So the motivation was the same - glory - but the personalities of the two men separate them by far more than a century and a half.

January 19, 2009



Brian Eno

During the 1990s, I used to tease my Silicon Valley friends by saying "Pangloss is alive and well and living in California." Their optimism conceded no downside to the march of progress that internet-bubble America seemed to represent. But, it turns out, there were a lot of village idiots in the global village. The bubble machine is broken and things are going backwards, at a time when the biggest crisis to face us since we left Eden—catastrophic climate change—looms large.

Amid all this I'm surprised to find myself drawn to Pangloss. I feel more optimistic than I have for years, which is why I plan to write a small column each month with some reasons for optimism. Why do I feel this? Partly because climate change is a crisis that we're at last acknowledging: more especially because we've realised that we have to solve it together, all of us.

My friend Stewart Brand—one of the godfathers of environmentalism—has just written a book, Whole Earth Manifesto (Penguin, forthcoming) which lays out the problems, and what might be the solutions. The problems are shocking in their scale, the solutions mind-boggling in their ambition. What we're going to start thinking about soon are global projects: like salting the stratosphere with sulphate crystals to increase the albedo (reflectivity) of the planet; millions of "umbrellas" floated at the Lagrange point (the null-point where the sun and the Earth's gravity cancel each other out) to shade us from the sun; the oceans seeded with iron filings to draw carbon out of the air and into the depths; and many more.

Some of these projects require new technology, some of them just involve scale: to pull them off will require cooperation at a global level, and that in turn will entail whole new systems of governance, consensus creation and enforcement. It won't be a pretty or dignified process, it will be rough and ready. It will sideline many of the currently "great and good" and find its heroes and heroines among the can-do technologists and will-do eccentrics. It's a many-generation project, which, if we pull it off, could reinvent civilisation in a new, co-operative, form.

January 19, 2009

Amanda Gefter

For decades, physicists have been searching for a quantum theory of gravity to reconcile Einstein's general relativity, which describes gravity at the largest scales, with quantum mechanics, which describes the behaviour of particles at the tiniest scales. One reason it has been so difficult to merge the two is that they are built on incompatible views of time. "I am more and more convinced that the problem of time is key both to quantum gravity and to issues in cosmology," says Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Ontario, Canada. ...

... Something has to give. The fact that the universe has no outside, by definition, suggests that quantum mechanics will be the one to surrender - and to many, this suggests that time is not fundamental. In the 1990s, for instance, physicist Julian Barbour proposed that time must not exist in a quantum theory of the universe. All the same, physicists are loath to throw out quantum theory, as it has proven capable of extraordinarily accurate predictions. What they need is a way to do quantum mechanics in the absence of time.

Single quantum event

Carlo Rovelli, a physicist at the University of Marseille in France, has found just that. In the past year, he and his colleagues have worked out a method to compress multiple quantum events in time into a single event that can be described without reference to time (Physical Review D, vol 75, p 084033).

It is an intriguing achievement. While Rovelli's approach to dealing with time is one of many, and researchers working on other models of quantum gravity may have different opinions on the matter, nearly every physicist agrees that time is a key obstacle to finding an ultimate theory. Rovelli's approach seems tantalisingly close to surmounting that obstacle. His model builds upon research into generalising quantum mechanics by physicist James Hartle at the University of California, Santa Barbara, as well as Rovelli's earlier work on quantum systems. ...

Edited by John Brockman
With An Introduction By BRIAN ENO

"The world's finest minds have responded with some of the most insightful, humbling, fascinating confessions and anecdotes, an intellectual treasure trove. ... Best three or four hours of intense, enlightening reading you can do for the new year. Read it now."
San Francisco Chronicle

"A great event in the Anglo-Saxon culture."
El Mundo

Contributors include: STEVEN PINKER on the future of human evolution • RICHARD DAWKINS on the mysteries of courtship • SAM HARRIS on why Mother Nature is not our friend • NASSIM NICHOLAS TALEB on the irrelevance of probability • ALUN ANDERSON on the reality of global warming • ALAN ALDA considers, reconsiders, and re-reconsiders God • LISA RANDALL on the secrets of the Sun • RAY KURZWEIL on the possibility of extraterrestrial life • BRIAN ENO on what it means to be a "revolutionary" • HELEN FISHER on love, fidelity, and the viability of marriage…and many others.

Praise for the online publication of
What Have You Changed Your Mind About?

"The splendidly enlightened Edge website (www.edge.org) has rounded off each year of inter-disciplinary debate by asking its heavy-hitting contributors to answer one question. I strongly recommend a visit." The Independent

"A great event in the Anglo-Saxon culture." El Mundo

"As fascinating and weighty as one would imagine." The Independent

"They are the intellectual elite, the brains the rest of us rely on to make sense of the universe and answer the big questions. But in a refreshing show of new year humility, the world's best thinkers have admitted that from time to time even they are forced to change their minds." The Guardian

"Even the world's best brains have to admit to being wrong sometimes: here, leading scientists respond to a new year challenge." The Times

"Provocative ideas put forward today by leading figures."The Telegraph


"As in the past, these world-class thinkers have responded to impossibly open-ended questions with erudition, imagination and clarity." The News & Observer

"A jolt of fresh thinking...The answers address a fabulous array of issues. This is the intellectual equivalent of a New Year's dip in the lake—bracing, possibly shriek-inducing, and bound to wake you up." The Globe and Mail

"Answers ring like scientific odes to uncertainty, humility and doubt; passionate pleas for critical thought in a world threatened by blind convictions." The Toronto Star

"For an exceptionally high quotient of interesting ideas to words, this is hard to beat. ...What a feast of egg-head opinionating!" National Review Online

Today's Leading Thinkers on Why Things Are Good and Getting Better
Edited by John Brockman
Introduction by DANIEL C. DENNETT


"The optimistic visions seem not just wonderful but plausible." Wall Street Journal

"Persuasively upbeat." O, The Oprah Magazine

"Our greatest minds provide nutshell insights on how science will help forge a better world ahead." Seed

"Uplifting...an enthralling book." The Mail on Sunday

Today's Leading Thinkers on the Unthinkable
Edited by John Brockman
Introduction by STEVEN PINKER


"Danger – brilliant minds at work...A brilliant bok: exhilarating, hilarious, and chilling." The Evening Standard (London)

"A selection of the most explosive ideas of our age." Sunday Herald

"Provocative" The Independent

"Challenging notions put forward by some of the world's sharpest minds" Sunday Times

"A titillating compilation" The Guardian

"Reads like an intriguing dinner party conversation among great minds in science" Discover

Today's Leading Thinkers on Science in the Age of Certainty
Edited by John Brockman
Introduction by IAN MCEWAN


"Whether or not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." LA Times

"Belief appears to motivate even the most rigorously scientific minds. It stimulates and challenges, it tricks us into holding things to be true against our better judgment, and, like scepticism -its opposite -it serves a function in science that is playful as well as thought-provoking. not we believe proof or prove belief, understanding belief itself becomes essential in a time when so many people in the world are ardent believers." The Times

"John Brockman is the PT Barnum of popular science. He has always been a great huckster of ideas." The Observer

"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle—a book ro be dog-eared and debated." Seed

"Scientific pipedreams at their very best." The Guardian

"Makes for some astounding reading." Boston Globe

"Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4

"Intellectual and creative magnificence" The Skeptical Inquirer







"deeply passionate"









Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.

John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

contact: editor@edge.org
Copyright © 2009 by Edge Foundation, Inc.
All Rights Reserved.