Edge 206 — March 27, 2007
(10,675 words)


By Steven Pinker

By Denis Dutton

By Jonathan Harris

By Katinka Matson


By P.Z. Myers


From Whole Earth to the Whole Web
Henry Lieberman

What Is Your Dangerous Idea?
By Tony Maniaty

Elegant physicist makes string theory sexy
By Alan Boyle

Histories of Violence
By Steven Pinker

Scientist Finds the Beginnings of Morality in Primate Behavior
By Nicholas Wade

Ocean Study Yields a Tidal Wave of Microbial DNA
John Bohannon

By Elaine Pagels and Karen L. King


Reading room: a surfers' guide

The Gospel of Judas and the Shaping of Christianity
With Terry Gross

Intellectual and creative magnificence
Kenneth W. Krause

Intellectual innovator


Our Books, Ourselves
By Malcolm Jones

In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.

by Steven Pinker


Steven Pinker once again takes up the debunking of the doctrine of the noble savage in the following piece, based on his lecture at the recent TED Conference in Monterey, California.

This doctrine, "the idea that humans are peaceable by nature and corrupted by modern institutions," he writes, "pops up frequently in the writing of public intellectuals like José Ortega y Gasset ('War is not an instinct but an invention'), Stephen Jay Gould ('Homo sapiens is not an evil or destructive species'), and Ashley Montagu ('Biological studies lend support to the ethic of universal brotherhood'). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler."

Pinker's notable talk, along with his essay, is one more example of how ideas emerging from the empirical and biological study of human beings are gaining sway over those of scientists and others in disciplines that study social actions and human cultures independently of their biological foundations.


STEVEN PINKER is the Johnstone Family Professor in the Department of Psychology at Harvard University. His most recent book is The Blank Slate.

Steven Pinker's Edge Bio Page


In sixteenth-century Paris, a popular form of entertainment was cat-burning, in which a cat was hoisted in a sling on a stage and slowly lowered into a fire. According to historian Norman Davies, "[T]he spectators, including kings and queens, shrieked with laughter as the animals, howling with pain, were singed, roasted, and finally carbonized." Today, such sadism would be unthinkable in most of the world. This change in sensibilities is just one example of perhaps the most important and most underappreciated trend in the human saga: Violence has been in decline over long stretches of history, and today we are probably living in the most peaceful moment of our species' time on earth.

In the decade of Darfur and Iraq, and shortly after the century of Stalin, Hitler, and Mao, the claim that violence has been diminishing may seem somewhere between hallucinatory and obscene. Yet recent studies that seek to quantify the historical ebb and flow of violence point to exactly that conclusion.

Some of the evidence has been under our nose all along. Conventional history has long shown that, in many ways, we have been getting kinder and gentler. Cruelty as entertainment, human sacrifice to indulge superstition, slavery as a labor-saving device, conquest as the mission statement of government, genocide as a means of acquiring real estate, torture and mutilation as routine punishment, the death penalty for misdemeanors and differences of opinion, assassination as the mechanism of political succession, rape as the spoils of war, pogroms as outlets for frustration, homicide as the major form of conflict resolution—all were unexceptionable features of life for most of human history. But, today, they are rare to nonexistent in the West, far less common elsewhere than they used to be, concealed when they do occur, and widely condemned when they are brought to light.

At one time, these facts were widely appreciated. They were the source of notions like progress, civilization, and man's rise from savagery and barbarism. Recently, however, those ideas have come to sound corny, even dangerous. They seem to demonize people in other times and places, license colonial conquest and other foreign adventures, and conceal the crimes of our own societies. The doctrine of the noble savage—the idea that humans are peaceable by nature and corrupted by modern institutions—pops up frequently in the writing of public intellectuals like José Ortega y Gasset ("War is not an instinct but an invention"), Stephen Jay Gould ("Homo sapiens is not an evil or destructive species"), and Ashley Montagu ("Biological studies lend support to the ethic of universal brotherhood"). But, now that social scientists have started to count bodies in different historical periods, they have discovered that the romantic theory gets it backward: Far from causing us to become more violent, something in modernity and its cultural institutions has made us nobler.

To be sure, any attempt to document changes in violence must be soaked in uncertainty. In much of the world, the distant past was a tree falling in the forest with no one to hear it, and, even for events in the historical record, statistics are spotty until recent periods. Long-term trends can be discerned only by smoothing out zigzags and spikes of horrific bloodletting. And the choice to focus on relative rather than absolute numbers brings up the moral imponderable of whether it is worse for 50 percent of a population of 100 to be killed or 1 percent in a population of one billion.

Yet, despite these caveats, a picture is taking shape. The decline of violence is a fractal phenomenon, visible at the scale of millennia, centuries, decades, and years. It applies over several orders of magnitude of violence, from genocide to war to rioting to homicide to the treatment of children and animals. And it appears to be a worldwide trend, though not a homogeneous one. The leading edge has been in Western societies, especially England and Holland, and there seems to have been a tipping point at the onset of the Age of Reason in the early seventeenth century.

At the widest-angle view, one can see a whopping difference across the millennia that separate us from our pre-state ancestors. Contra leftist anthropologists who celebrate the noble savage, quantitative body-counts—such as the proportion of prehistoric skeletons with axemarks and embedded arrowheads or the proportion of men in a contemporary foraging tribe who die at the hands of other men—suggest that pre-state societies were far more violent than our own. It is true that raids and battles killed a tiny percentage of the numbers that die in modern warfare. But, in tribal violence, the clashes are more frequent, the percentage of men in the population who fight is greater, and the rates of death per battle are higher. According to anthropologists like Lawrence Keeley, Stephen LeBlanc, Phillip Walker, and Bruce Knauft, these factors combine to yield population-wide rates of death in tribal warfare that dwarf those of modern times. If the wars of the twentieth century had killed the same proportion of the population that die in the wars of a typical tribal society, there would have been two billion deaths, not 100 million.

Political correctness from the other end of the ideological spectrum has also distorted many people's conception of violence in early civilizations—namely, those featured in the Bible. This supposed source of moral values contains many celebrations of genocide, in which the Hebrews, egged on by God, slaughter every last resident of an invaded city. The Bible also prescribes death by stoning as the penalty for a long list of nonviolent infractions, including idolatry, blasphemy, homosexuality, adultery, disrespecting one's parents, and picking up sticks on the Sabbath. The Hebrews, of course, were no more murderous than other tribes; one also finds frequent boasts of torture and genocide in the early histories of the Hindus, Christians, Muslims, and Chinese.

At the century scale, it is hard to find quantitative studies of deaths in warfare spanning medieval and modern times. Several historians have suggested that there has been an increase in the number of recorded wars across the centuries to the present, but, as political scientist James Payne has noted, this may show only that "the Associated Press is a more comprehensive source of information about battles around the world than were sixteenth-century monks." Social histories of the West provide evidence of numerous barbaric practices that became obsolete in the last five centuries, such as slavery, amputation, blinding, branding, flaying, disembowelment, burning at the stake, breaking on the wheel, and so on. Meanwhile, for another kind of violence—homicide—the data are abundant and striking. The criminologist Manuel Eisner has assembled hundreds of homicide estimates from Western European localities that kept records at some point between 1200 and the mid-1990s. In every country he analyzed, murder rates declined steeply—for example, from 24 homicides per 100,000 Englishmen in the fourteenth century to 0.6 per 100,000 by the early 1960s.

On the scale of decades, comprehensive data again paint a shockingly happy picture: Global violence has fallen steadily since the middle of the twentieth century. According to the Human Security Brief 2006, the number of battle deaths in interstate wars has declined from more than 65,000 per year in the 1950s to less than 2,000 per year in this decade. In Western Europe and the Americas, the second half of the century saw a steep decline in the number of wars, military coups, and deadly ethnic riots.

Zooming in by a further power of ten exposes yet another reduction. After the cold war, every part of the world saw a steep drop-off in state-based conflicts, and those that do occur are more likely to end in negotiated settlements rather than being fought to the bitter end. Meanwhile, according to political scientist Barbara Harff, between 1989 and 2005 the number of campaigns of mass killing of civilians decreased by 90 percent.

The decline of killing and cruelty poses several challenges to our ability to make sense of the world. To begin with, how could so many people be so wrong about something so important? Partly, it's because of a cognitive illusion: We estimate the probability of an event from how easy it is to recall examples. Scenes of carnage are more likely to be relayed to our living rooms and burned into our memories than footage of people dying of old age. Partly, it's an intellectual culture that is loath to admit that there could be anything good about the institutions of civilization and Western society. Partly, it's the incentive structure of the activism and opinion markets: No one ever attracted followers and donations by announcing that things keep getting better. And part of the explanation lies in the phenomenon itself. The decline of violent behavior has been paralleled by a decline in attitudes that tolerate or glorify violence, and often the attitudes are in the lead. As deplorable as they are, the abuses at Abu Ghraib and the lethal injections of a few murderers in Texas are mild by the standards of atrocities in human history. But, from a contemporary vantage point, we see them as signs of how low our behavior can sink, not of how high our standards have risen.

The other major challenge posed by the decline of violence is how to explain it. A force that pushes in the same direction across many epochs, continents, and scales of social organization mocks our standard tools of causal explanation. The usual suspects—guns, drugs, the press, American culture—aren't nearly up to the job. Nor could it possibly be explained by evolution in the biologist's sense: Even if the meek could inherit the earth, natural selection could not favor the genes for meekness quickly enough. In any case, human nature has not changed so much as to have lost its taste for violence. Social psychologists find that at least 80 percent of people have fantasized about killing someone they don't like. And modern humans still take pleasure in viewing violence, if we are to judge by the popularity of murder mysteries, Shakespearean dramas, Mel Gibson movies, video games, and hockey.

What has changed, of course, is people's willingness to act on these fantasies. The sociologist Norbert Elias suggested that European modernity accelerated a "civilizing process" marked by increases in self-control, long-term planning, and sensitivity to the thoughts and feelings of others. These are precisely the functions that today's cognitive neuroscientists attribute to the prefrontal cortex. But this only raises the question of why humans have increasingly exercised that part of their brains. No one knows why our behavior has come under the control of the better angels of our nature, but there are four plausible suggestions.

The first is that Hobbes got it right. Life in a state of nature is nasty, brutish, and short, not because of a primal thirst for blood but because of the inescapable logic of anarchy. Any beings with a modicum of self-interest may be tempted to invade their neighbors to steal their resources. The resulting fear of attack will tempt the neighbors to strike first in preemptive self-defense, which will in turn tempt the first group to strike against them preemptively, and so on. This danger can be defused by a policy of deterrence—don't strike first, retaliate if struck—but, to guarantee its credibility, parties must avenge all insults and settle all scores, leading to cycles of bloody vendetta. These tragedies can be averted by a state with a monopoly on violence, because it can inflict disinterested penalties that eliminate the incentives for aggression, thereby defusing anxieties about preemptive attack and obviating the need to maintain a hair-trigger propensity for retaliation. Indeed, Eisner and Elias attribute the decline in European homicide to the transition from knightly warrior societies to the centralized governments of early modernity. And, today, violence continues to fester in zones of anarchy, such as frontier regions, failed states, collapsed empires, and territories contested by mafias, gangs, and other dealers of contraband.

Payne suggests another possibility: that the critical variable in the indulgence of violence is an overarching sense that life is cheap. When pain and early death are everyday features of one's own life, one feels fewer compunctions about inflicting them on others. As technology and economic efficiency lengthen and improve our lives, we place a higher value on life in general.

A third theory, championed by Robert Wright, invokes the logic of non-zero-sum games: scenarios in which two agents can each come out ahead if they cooperate, such as trading goods, dividing up labor, or sharing the peace dividend that comes from laying down their arms. As people acquire know-how that they can share cheaply with others and develop technologies that allow them to spread their goods and ideas over larger territories at lower cost, their incentive to cooperate steadily increases, because other people become more valuable alive than dead.

Then there is the scenario sketched by philosopher Peter Singer. Evolution, he suggests, bequeathed people a small kernel of empathy, which by default they apply only within a narrow circle of friends and relations. Over the millennia, people's moral circles have expanded to encompass larger and larger polities: the clan, the tribe, the nation, both sexes, other races, and even animals. The circle may have been pushed outward by expanding networks of reciprocity, à la Wright, but it might also be inflated by the inexorable logic of the golden rule: The more one knows and thinks about other living things, the harder it is to privilege one's own interests over theirs. The empathy escalator may also be powered by cosmopolitanism, in which journalism, memoir, and realistic fiction make the inner lives of other people, and the contingent nature of one's own station, more palpable—the feeling that "there but for fortune go I".

Whatever its causes, the decline of violence has profound implications. It is not a license for complacency: We enjoy the peace we find today because people in past generations were appalled by the violence in their time and worked to end it, and so we should work to end the appalling violence in our time. Nor is it necessarily grounds for optimism about the immediate future, since the world has never before had national leaders who combine pre-modern sensibilities with modern weapons.

But the phenomenon does force us to rethink our understanding of violence. Man's inhumanity to man has long been a subject for moralization. With the knowledge that something has driven it dramatically down, we can also treat it as a matter of cause and effect. Instead of asking, "Why is there war?" we might ask, "Why is there peace?" From the likelihood that states will commit genocide to the way that people treat cats, we must have been doing something right. And it would be nice to know what, exactly, it is.

[First published in The New Republic, 3.19.07.]

"Danger – brilliant minds at work...A brilliant book: exhilarating, hilarious, and chilling." The Evening Standard (London)

Hardcover - UK
£12.99, 352 pp
Free Press, UK

Paperback - US
$13.95, 336 pp
Harper Perennial

WHAT IS YOUR DANGEROUS IDEA? Today's Leading Thinkers on the Unthinkable With an Introduction by STEVEN PINKER and an Afterword by RICHARD DAWKINS Edited By JOHN BROCKMAN

"A selection of the most explosive ideas of our age." Sunday Herald "Provocative" The Independent "Challenging notions put forward by some of the world’s sharpest minds" Sunday Times "A titillating compilation" The Guardian

"...This collection, mostly written by working scientists, does not represent the antithesis of science. These are not simply the unbuttoned musings of professionals on their day off. The contributions, ranging across many disparate fields, express the spirit of a scientific consciousness at its best — informed guesswork." Ian McEwan, from the Introduction, in The Telegraph

Paperback - US
$13.95, 272 pp
Harper Perennial

Paperback - UK
£7.99 288 pp
Pocket Books

WHAT WE BELIEVE BUT CANNOT PROVE Today's Leading Thinkers on Science in the Age of Certainty With an Introduction by IAN MCEWAN Edited By JOHN BROCKMAN

"An unprecedented roster of brilliant minds, the sum of which is nothing short of an oracle — a book to be dog-eared and debated." Seed "Scientific pipedreams at their very best." The Guardian "Makes for some astounding reading." Boston Globe "Fantastically stimulating...It's like the crack cocaine of the thinking world.... Once you start, you can't stop thinking about that question." BBC Radio 4 "Intellectual and creative magnificence...an impressive array of insights and challenges that will surely delight curious readers, generalists and specialists alike." The Skeptical Inquirer

This makes instrumental criticism a tricky business. I'm personally convinced that there is an authentic, objective maturity that I can hear in the later recordings of Rubinstein. This special quality of his is actually in the music, and is not just subjectively derived from seeing the wrinkles in the old man's face. But the Joyce Hatto episode shows that our expectations, our knowledge of a back story, can subtly, or perhaps even crudely, affect our aesthetic response.

By Denis Dutton

DENIS DUTTON, who teaches aesthetics at the University of Canterbury, is the founder and editor of the highly regarded Web publication Arts & Letters Daily (www.aldaily.com).

Denis Dutton's Edge bio page


It seemed almost too good to be true, and in the end it was. A conscientious pianist who had enjoyed an active if undistinguished career in London falls ill and retreats to a small town. Here she undertakes a project to record virtually the entire standard classical repertoire. Her recordings, CDs made when she was in her late 60s and 70s, are staggering, showing a masterful technique, a preternatural ability to adapt to different styles and a depth of musical insight hardly seen elsewhere.

Born in 1928, the pianist, Joyce Hatto, was the daughter of a music-loving London antiques dealer. As a teenager, she said, she kept practicing during the Blitz, hiding under the piano when the bombs were falling. She claimed later to have known the composers Ralph Vaughan Williams, Benjamin Britten and Carl Orff, to have studied Chopin with the French virtuoso Alfred Cortot and taken advice from the pianist Clara Haskil. She was Arnold Bax's favored interpreter for his Symphonic Variations.

Ms. Hatto made recordings from the 1950s until 1970 — some Mozart and Rachmaninoff — but tending toward light-music potboilers: Hubert Bath's Cornish Rhapsody and Richard Addinsell's Warsaw Concerto. Her career was already in decline when she was given a cancer diagnosis in the early 1970s. She retired to a village near Cambridge with her husband, a recording engineer named William Barrington-Coupe, and a fine old Steinway that Rachmaninoff himself had used for prewar recitals in Britain.

Then came one of the strangest turns in the history of classical music. Starting in 1989, Joyce Hatto began recording CDs for a small record label run by her husband. She began with Liszt, went back to cover Bach and all of the Mozart sonatas and continued with a complete Beethoven sonata set. Then on to Schubert and Schumann, Chopin and more Liszt. She played Messiaen. Her Prokofiev sonatas (all nine) were tossed off with incredible virtuosity. In total she recorded more than 120 CDs — including many of the most difficult piano pieces ever written, played with breathtaking speed and accuracy.

Intriguingly, she gave to the music a developed although oddly malleable personality. She could do Schubert in one style, and then Prokofiev almost as though she were a new person playing a different piano — an astonishing, chameleon-like artistic ability.

We normally think of prodigies as children who exhibit some kind of miraculous ability in music. Joyce Hatto became something unheard of in the annals of classical music: a prodigy of old age — the very latest of late bloomers, "the greatest living pianist that almost no one has heard of," as the critic Richard Dyer put it for himself and many other piano aficionados in The Boston Globe.

Little wonder that when she at last succumbed to her cancer last year at age 77 — recording Beethoven's Sonata No. 26, Les Adieux, from a wheelchair in her last days — The Guardian called her "one of the greatest pianists Britain has ever produced." Nice touch, that, playing Beethoven's farewell sonata from a wheelchair. It went along with her image in the press as an indomitable spirit with a charming personality — always ready with a quote from Shakespeare, Arthur Rubinstein or Muhammad Ali. She also had a clear vision of the mission of musical interpreters, telling The Boston Globe: "Our job is to communicate the spiritual content of life as it is presented in the music. Nothing belongs to us; all you can do is pass it along."

Now it has become brutally clear that "passing along" is exactly what she was up to. Earlier this month, a reader of the British music magazine Gramophone told one of its critics, Jed Distler, that something odd happened when he slid Ms. Hatto's CD of Liszt's Transcendental Études into his computer. His iTunes library, linked to a catalogue of about four million CDs, immediately identified it as a recording by the Hungarian pianist Laszlo Simon. Mr. Distler then listened to both recordings, and found them identical.

Since then, analysis by professional sound engineers and piano enthusiasts across the globe has pushed toward the same conclusion: the entire Joyce Hatto oeuvre recorded after 1989 appears to be stolen from the CDs of other pianists. It is a scandal unparalleled in the annals of classical music.

Ms. Hatto usually stole from younger artists who were not household names, although on the basis of the reviews she received, they richly deserved to be. Her recording of Chopin mazurkas seems to be by Eugen Indjic; the fiendishly difficult transcriptions of Chopin studies by Leopold Godowsky are actually recordings by Carlo Grante and Marc-André Hamelin; her Messiaen recordings were by Paul S. Kim; her version of the Goldberg Variations of Bach at least in part by Pi-Hsien Chen; the complete Ravel piano music by Roger Muraro. As reports come in, the rip-off list grows daily.

Her concerto recordings are even more brazen. The CD labels say they were made with the National Philharmonic Symphony Orchestra, always conducted by one René Köhler. Mr. Barrington-Coupe told a reporter that this was his name for a pick-up orchestra of Polish émigrés who, he said, came out from London to record at a venue he now refuses to reveal. He declined to further discuss the orchestra on the grounds that they were employed "below union rates." No one has yet been able to find a single reference to this René Köhler outside of the Joyce Hatto recordings, nor have any members of the orchestra come forward to confirm Mr. Barrington-Coupe's story.

In a rapturous review of Ms. Hatto's playing of Rachmaninoff's Third Concerto, one critic said of the orchestra musicians: "It doesn't matter who they are, their playing is tight and hot." Actually, it did matter, since they have turned out to be the Philharmonia Orchestra of London, conducted by Esa-Pekka Salonen, performing with the formidable Yefim Bronfman. Her version of the Brahms Second Concerto is Vladimir Ashkenazy's, with the Vienna Philharmonic under Bernard Haitink laboring in the name of René Köhler and his non-union Poles.

Since the news broke, some have likened the exploits of Joyce Hatto to the notorious 20th-century Vermeer forger Han van Meegeren. But the differences are significant. Van Meegeren's success was based as much on presentation — stories of old Italian families impoverished before World War II and needing quick cash — as on artistic plausibility. After he confessed, it was not hard for anyone to see that his dreadful fakes had more in common with each other than with any original Vermeers.

Joyce Hatto, however, was not a pianistic forger. In order to forge a piano performance, she would have had to record Beethoven's Hammerklavier herself and sell it to the world as a lost recording by, say, William Kapell. She was instead a plagiarist: she stole other pianists' work and, with only a few electronic alterations, sold it as her own.

Although the critics who praised Van Meegeren's "Vermeers" as masterpieces were in the end rightly humiliated, the same should not be true of those who praised Ms. Hatto's recordings. They may have been fooled, but their opinions were not foolish, because the artists she ripped off played beautifully.

Yet the Joyce Hatto episode is a stern reminder of the importance of framing and background in criticism. Music isn't just about sound; it is about achievement in a larger human sense. If you think an interpretation is by a 74-year-old pianist at the end of her life, it won't sound quite the same to you as if you think it's by a 24-year-old piano-competition winner who is just starting out. Beyond all the pretty notes, we want creative engagement and communication from music, we want music to be a bridge to another personality. Otherwise, we might as well feed Chopin scores into a computer.

This makes instrumental criticism a tricky business. I'm personally convinced that there is an authentic, objective maturity that I can hear in the later recordings of Rubinstein. This special quality of his is actually in the music, and is not just subjectively derived from seeing the wrinkles in the old man's face. But the Joyce Hatto episode shows that our expectations, our knowledge of a back story, can subtly, or perhaps even crudely, affect our aesthetic response.

The greatest lesson for us all ought to be, however, that there are more fine young pianists out there than most of us realize. If it wasn't Joyce Hatto, then who did perform those dazzlingly powerful Prokofiev sonatas? Having been so moved by hearing "her" Schubert on the radio, I've vowed to honor the real pianist by ordering the proper CD, as soon as I find out who it is. Backhanded credit to Joyce Hatto for having introduced us to some fine new talent.

[First published as an OpEd piece in The New York Times, February 26, 2007.]

As humans, we have a long history of projecting our great stories into the night sky. This leads us to wonder: if we were to make new constellations today, what would they be? If we were to paint new pictures in the sky, what would they depict? These questions form the inspiration for Universe, which explores the notions of modern mythology and contemporary constellations.

by Jonathan Harris



One of the highlights of this year's interesting and eclectic TED Conference in Monterey, California, organized by TED "curator" Chris Anderson, was the premiere of a new work by Jonathan Harris, a New York artist and storyteller working primarily on the Internet. His work involves the exploration and understanding of humans, on a global scale, through the artifacts they leave behind on the Web.

"Universe," he writes, "was inspired by questions like: if we could draw new constellations in our night sky today, what would those be? What are our great stories? What are our great journeys? Who are our heroes and heroines? Who are our Gods and Goddesses? What is our modern mythology? Universe tries to answer these questions through analysis of global media coverage, as construed by Daylife."

"Universe presents an interactive night sky, composed of thousands of twinkling stars, which then connect to form constellations. Each of these constellations has a specific counterpart in the physical world — a story, a person, a quote, an image, a company, a nation, a mythic theme. Any constellation can be clicked, making it the center of the universe, and causing all other stars to enter its orbit. Universe is infinitely large, and each person's path through it will be different. For an explanation of how it works, read 'Stages'. For a longer discussion of the ideas behind the piece, read 'Statement'."

Jonathan Harris invites you "to start exploring, get lost, find something amazing, and make your own mythology". Click here for Jonathan Harris's "Universe".


JONATHAN HARRIS is a New York artist and storyteller working primarily on the Internet. His work involves the exploration and understanding of humans, on a global scale, through the artifacts they leave behind on the Web.

Jonathan Harris' Edge bio page


Whether we live in a city, where the night sky bleeds orange with the glow of cars and buildings, or whether we live in the country, where the night sky is pitch black, punctured by myriad tiny points of light, we have all, on a dark night, tilted our heads back and looked up. Most of us can spot the North Star, the Big Dipper, and the three-star belt of Orion the Hunter. With some more practice, we can see Pisces, Pegasus, and the Gemini twins. Each night, the great stories of ancient Greek mythology are played out in the sky — Perseus rescues Andromeda from the sea monster; Orion faces the roaring bull; Zeus battles Cronos for control of Mount Olympus. Most of us know the sky holds these great myths, immortalized as constellations. Slightly less well known are the newer constellations, largely added in the 18th and 19th centuries. These more modern constellations reflect a different sort of mythology — a commemoration of art and science, expressed through star groups representing technical inventions like the microscope, the triangle, the compass, the level, and the easel.

As humans, we have a long history of projecting our great stories into the night sky. This leads us to wonder: if we were to make new constellations today, what would they be? If we were to paint new pictures in the sky, what would they depict? These questions form the inspiration for Universe, which explores the notions of modern mythology and contemporary constellations. It is easy to think that the world today is devoid of mythology. We obsess over celebrities, music, movies, fashion and trends, changing madly from one moment to the next, causing our heroes and idols to come and go so quickly that no consistent mythology can take root. Especially for those who don't practice religion, it can seem there is nothing bigger in which to believe, that there is no shared experience that unites the human world, no common stories to guide us. Because of this, we are said to feel a great emptiness.

We can imagine that people first made constellations to humanize the sky, to make the infinite darkness seem less foreboding. Now that we live in cities of light, bathed in the glow of televisions, headlights, shops, signs, and streetlamps, our battle with darkness seems to be won. But the things that darkness represents — the unknown, the unconquered, and the endless — live on as ever, and we continue to need mythology to help us reconcile that which science and technology cannot answer. So, what is the mythology of today? What are the great stories? What are the great journeys? Who are the heroes and villains? When we step back and look at life, what are its overarching themes? We could ask a panel of experts, or as before, we could leave it to a few ambitious astronomers. But those approaches no longer seem right. Even as we participate in the human world, each of us experiences life differently. We have our own interests, perspectives, opinions, tastes and beliefs. We have our own heroes, our own favorite stories, our own rituals and traditions. In many ways, what we have today are personal mythologies, practiced by a world of individuals.

Universe is a system that supports the exploration of personal mythology, allowing each of us to find our own constellations, based on our own interests and curiosities. Everyone's path through Universe is different, just as everyone's path through life is different. Using the metaphor of an interactive night sky, Universe presents an immersive environment for navigating the world's contemporary mythology, as found online in global news and information from Daylife. Universe opens with a color-shifting aurora borealis, at the center of which is a moon, and through which thousands of stars slowly move. Each star has a specific counterpart in the physical world — a news story, a quote, an image, a person, a company, a team, a place — and moving the cursor across the star field causes different stars to connect, forming constellations. Any constellation can be selected, making it the center of the universe, and sending everything else into its orbit.

Universe is divided into nine "Stages", titled: Stars, Shapes, Secrets, Stories, Statements, Snapshots, Superstars, Settings, and Time. Stars presents a cryptic star field; Shapes causes constellation outlines to emerge; Secrets extracts the most salient single words and presents them to scale; Stories extracts the sagas and events; Statements extracts the things people said; Snapshots extracts images; Superstars extracts the people, places, companies, teams, and organizations; Settings shows geographical distribution; Time shows how the universe has evolved over hours, days, months, and years. In the top left corner is a search box, which can be used to specify the scope of the current universe. The scope can be as broad as "2007", as recent as "Today", as precise as "Vermont on August 27, 2006", or as open-ended as "War", "Climate Change" or "Happiness". The exact parameters of each universe are entirely up to the viewer, and unexpected paths unfold with exploration.

Universe does not suggest a single shared mythology. Instead, it provides a tool to explore many personal mythologies. Based on the chosen path of the viewer, Universe presents the most salient stories, statements and snapshots, as found in global news coverage from thousands of sources. Through this process of guided discovery, patterns start to emerge. Certain stories show up again and again, and they become our great sagas. Certain people start to shape the news, and they become our heroes and villains. Certain single words rise from the chatter, and they become our epic themes.

In Universe, as in reality, everything is connected. No event happens in isolation. No company exists in a vacuum. No person lives alone. Whereas news is often presented as a series of unrelated static events, Universe strives to show the broader narrative that contains those events. The only way to begin to see the mythic nature of today's world is to surface its connections, patterns, and themes. When this happens, we begin to see common threads — myths, really — twisting through the stream of information.

universe.daylife.com | Statement | Stages | Daylife

Orr would be better served by putting up a clear statement of what god he is defending, rather than shuttling back and forth between the supernatural being Dawkins is addressing and the innocuously ideational metaphysical force that no one is crucifying.

P.Z. MYERS [3.27.07]

H. Allen Orr and Daniel Dennett have been tearing into each other something fierce, and it's all over Orr's dismissive review of Dawkins' The God Delusion. The exchanges are a bit splintery and sharp, but the core of Orr's complaint is that he's unimpressed with Dawkins' 'Ultimate 747' argument, which is basically that postulating an immensely complicated being to explain the creation of an immensely complicated universe doesn't actually explain anything and is self-refuting — if you need an intelligent superbeing to create anything complex, then the superbeing itself is an even greater problem for your explanation. Orr considers Dawkins' argument practically a facile parody, and is incredulous that he hasn't considered that perhaps God is much simpler than the universe.
Orr is looking at it in the wrong way, and part of his problem is a failure to define the god he is talking about. If we are talking about something that is not necessarily complex like the universe, that is basic and fundamental and that we derive in some way from something as essential as the laws of existence, then we are not addressing the existence of the god worshipped by almost any religion in existence. Yes, we could equate "god" with simplicity, but that's Einstein's or Spinoza's god, which are not a problem. In his book, Dawkins clearly lays out his terms and states his position: he sets aside the deistic or pantheistic god as outside his argument, which is focused on the concept of the supernatural god as a conscious, independent being, the kind of god that is the day-to-day object of entreaties and worship around the world. Dawkins explicitly divorces his argument from the idea of god as impersonal primal force, which the 'Ultimate 747' argument does not address, and instead focuses on the kind of god-concept we have to deal with on a regular basis in the real world — not the abstraction of theologians, but the capricious, vindictive, meddling magic man of the churches and the weekly prayer meetings and the televangelists.

Dawkins goes so far as to accuse those who conflate Einstein's abstraction with the kind of personal god worshipped by hundreds of millions of people of "intellectual high treason." I don't quite agree with that, but it certainly is intellectual foolishness. I like Orr's work, and I usually greatly enjoy his reviews, but in this case he is, perhaps unconsciously rather than deliberately, confusing the pantheistic cosmic force he is unnecessarily defending from Dawkins' argument with the righteous anthropomorphic Supreme Being that is actually refuted.

And yes, I know it is the nature of religion that everyone who believes will automatically state that their god isn't the complicated caricature of the Bible or the Torah or the Koran and will retreat to the safety of the Ineffable (but Simple) Pantheistic/Deistic God until the challenge from the atheist subsides. Once the critic is safely out of earshot, though, they will pray to the fickle deity for a new raise or for their favorite football team to win, and they will wonder if the cruel Old Testament God will torture them for eternity for transgressions against antique laws of propriety. Until that atheist glances their way again ... then once more, they will describe God as an abstraction, as Love, as something so nebulous that it is safely removed from any specific attack. It's familiar territory. Get into an argument with someone over Christianity or Islam or any of the dominant monotheistic faiths, and you'll see them flicker back and forth between the abstract and the real god of their religion — their only defense is to present a moving target. Dawkins made a sharp distinction in the opening chapters of his book between a non-specific metaphysic and the operational association of religion, the supernatural, and a discrete intelligent entity who personally cares for individual human beings; would that his critics were equally clear in their definitions.

Orr would be better served by putting up a clear statement of what god he is defending, rather than shuttling back and forth between the supernatural being Dawkins is addressing and the innocuously ideational metaphysical force that no one is crucifying. I suspect that if he did so, he'd either find himself agreeing with Dawkins, or finding his choice of god bedeviled with a very pointed criticism, one he can't dismiss so easily.

P.Z. MYERS, a biologist and associate professor, University of Minnesota, Morris, is a science blogger via his weblog, Pharyngula.

P.Z. Myers Edge Bio Page

Science 9 March 2007

From Whole Earth to the Whole Web

Henry Lieberman

We're pretty damn lucky we got the Internet we did: a worldwide network in which almost anybody can read, publish, and program pretty much anything. It didn't have to turn out that way. It could have been dominated by a few corporations, spoon-feeding junk-food media to the masses, just like television. Or balkanized communications providers could have saddled users with deceptive charging schemes and stifled technical innovation, just like cell phones.

That we happened to get such an open network was a miracle. But it wasn't an accident. The technical community that built today's digital infrastructure did so around a certain set of cultural values, among them openness, sharing, personal expression, and innovation. These were core values of the early digital pioneers (the hackers), embodied in what we proudly call the "hacker ethic." Today, we take the digital revolution for granted and seldom appreciate to what extent these values were sparked by the 1960s counterculture, which preceded the digital revolution: counterculture begat cyberculture.

Because of the happy coincidence that the corporate and bureaucratic establishments of the time understood digital technology so poorly, the hackers were able to pull off the revolution before the bureaucracy knew what hit them. Like the fall of communism, it happened so fast that we haven't yet really taken the time to fully celebrate its victory and examine how it happened.

Fred Turner's fascinating From Counterculture to Cyberculture gives us a detailed look at one slice through this marvelous story. Unlike many other histories that focus on the technical innovators — the Vint Cerfs, the Tim Berners-Lees, the Alan Kays, the Marvin Minskys — this account focuses on a key player whose role was making the counterculture-cyberculture connection: Stewart Brand. Brand's contribution was reporting on this phenomenon; theorizing about it; popularizing it; cheerleading for it; and organizing, networking, and providing resources for it. Brand articulated the unspoken consensus values of these communities. It's hard to say exactly what he did, but everybody knew him, and that sure helped.


Weekend Australian
March 24, 2007 Saturday

What is Your Dangerous Idea? Today's Leading Thinkers on the Unthinkable

BRAIN stretch is an exciting concept, the more so as John Brockman's anthology pushes everything to the extreme. Can our brains exist without bodies? If, as Ray Kurzweil says, "we need only 1 per cent of 1 per cent of the sunlight to meet all our energy needs", why are we pouring billions into Middle East wars over oil and not into research on nano-engineered solar panels and fuel cells? Read these 100 or so mini-essays and realise how lacking in vision most politicians are.

March 23, 2007

Elegant physicist makes string theory sexy
Brian Greene does theoretical physics ... and Hollywood as well
By Alan Boyle, Science editor

If you're trying to impress the geeks, being a professional string theorist would have to put you pretty high up on the coolness scale. And if you're a string theorist with books, movies and TV shows to your credit, so much the better.

By those measures, Columbia University physicist Brian Greene has already achieved superstring stardom: His book about string theory, "The Elegant Universe," broke onto bestseller lists and spawned a "Nova" documentary series by the same name (which you can watch online). He has consulted with — and taken cameo roles in — movies ranging from "Frequency" to "Deja Vu" to "The Last Mimzy" (which opens Friday). He's made the talk-show circuit, from "Nightline" and Letterman to "The Colbert Report." And as if all that wasn't enough, he's also organizing a World Science Festival in New York City.


The New Republic

Histories of Violence

by Steven Pinker
Only at TNR Online

Here are some of the most important books about violence, its evolution, and its uses during the twentieth century.

• Thomas Hobbes, Leviathan (1651). "And the life of man, solitary, poor, nasty, brutish, and short." This pithy description of life in a state of nature is just one example of the lively prose in this seventeenth-century masterpiece. Hobbes's analysis of the roots and varieties of violence is uncannily modern, and anticipated many insights from game theory and evolutionary psychology. He also was the first cognitive scientist, outlining a computational theory of memory, imagination, and reasoning.

• Martin Daly and Margo Wilson, Homicide (1988). This is the book that sold me on evolutionary psychology. Daly and Wilson use homicide statistics as an assay for human conflict, together with vivid accounts from history, journalism, and anthropology. They take each pairing of killer and victim—fratricide, filicide, parricide, infanticide, uxoricide, stepparent-stepchild, acquaintances, feuds & duels, amok killers, and so on—and test predictions from evolutionary theory on their rates and patterns. The book is endlessly insightful and beautifully written. ...


The New York Times
March 20, 2007

Scientist Finds the Beginnings of Morality in Primate Behavior
By Nicholas Wade

Some animals are surprisingly sensitive to the plight of others. Chimpanzees, who cannot swim, have drowned in zoo moats trying to save others. Given the chance to get food by pulling a chain that would also deliver an electric shock to a companion, rhesus monkeys will starve themselves for several days.

Biologists argue that these and other social behaviors are the precursors of human morality. They further believe that if morality grew out of behavioral rules shaped by evolution, it is for biologists, not philosophers or theologians, to say what these rules are.

Moral philosophers do not take very seriously the biologists’ bid to annex their subject, but they find much of interest in what the biologists say and have started an academic conversation with them.

The original call to battle was sounded by the biologist Edward O. Wilson more than 30 years ago, when he suggested in his 1975 book “Sociobiology” that “the time has come for ethics to be removed temporarily from the hands of the philosophers and biologicized.” He may have jumped the gun about the time having come, but in the intervening decades biologists have made considerable progress.

Last year Marc Hauser, an evolutionary biologist at Harvard, proposed in his book “Moral Minds” that the brain has a genetically shaped mechanism for acquiring moral rules, a universal moral grammar similar to the neural machinery for learning language. In another recent book, “Primates and Philosophers,” the primatologist Frans de Waal defends against philosopher critics his view that the roots of morality can be seen in the social behavior of monkeys and apes.


The Irish Times
March 17, 2007

"www.edge.org has established itself as a major force on the intellectual scene in the US"

Reading room: a surfers' guide

The Dublin Review of Books will boast a regular blog where readers can carry on live discussion of particular articles or topics between issues.

But it isn't the only online magazine vying for the attention of literary audiences - there are dozens of sassy outfits out there, each with its own distinctive perks and quirks. ...

www.edge.org has established itself as a major force on the intellectual scene in the US and as required reading for humanities heads who want to keep up to speed with the latest in science and technology. Current debates on the site feature stellar contributors Noam Chomsky, Scott Atran and Daniel C Dennett.


Science 16 March 2007:
Vol. 315. no. 5818, pp. 1486 - 1487
News Focus


Ocean Study Yields a Tidal Wave of Microbial DNA

John Bohannon

Data glut or unprecedented science? A global hunt for marine microbial diversity turns up a vast, underexplored world of genes, proteins, and "species"

After relishing the role of David to the Human Genome Project's Goliath, J. Craig Venter is now positioning himself as a Charles Darwin of the 21st century. Darwin's voyage aboard the H.M.S. Beagle 170 years ago to the Galápagos Islands netted a plethora of observations—the bedrock for his theory of evolution. Four years ago, Venter set sail for the same islands and returned 9 months later with his own cache of data—billions of bases of DNA sequence from the ocean's microbial communities. But whether that trip will prove anything more than a fishing expedition remains to be seen.

On 13 March, Venter, head of the J. Craig Venter Institute in Rockville, Maryland, and a bevy of co-authors rolled out 7.7 million snippets of sequence, dubbed the Global Ocean Sampling, in a trio of online papers in PLoS Biology. As a first stab at mining these data, which have just become publicly available to other scientists, Venter's team has found evidence of so many new microbial species that the researchers want to redraw the tree of microbial life. They have also translated the sequences into hypothetical proteins and made some educated guesses about their possible functions.

Some scientists are wowed by the effort. Others worry that researchers will not be able to make sense of all this information. The diversity of microbes uncovered is "overwhelming, … tantamount to trying to understand the plot of a full-length motion picture after looking at a single frame of the movie," says Mitch Sogin, a molecular evolutionary biologist at the Marine Biological Laboratory in Woods Hole, Massachusetts. And Venter doesn't necessarily disagree. In 2004, as the data were first rolling in, Venter confidently predicted that his salty DNA survey would "provide a different view of evolution." To make that happen, however, he now says, "we need even more data." ...


March 15, 2007


60-SECOND SYNOPSIS READING JUDAS By Elaine Pagels and Karen L. King; 198 pages

All About The Gospel of Judas

A year ago, scholars got access to a bizarre document from about A.D. 150 called The Gospel of Judas. In it, Jesus promises Judas heaven for turning him in and maligns the rest of the Apostles for sacrificing their followers.

Princeton's Pagels and Harvard's King try to decipher the document. Why reward Judas? Because Jesus' death helps prove what the Gospel writer thought was Christ's real message: that his—and our—true essence is not flesh but immortal spirit. And the text bad-mouths other disciples as an indirect way of attacking 2nd century Christian bishops who encouraged believers to be martyrs.

The authors suggest the text was a polemic—and a losing one, since martyrdom became a pillar of the church. But its angry tone supports a favorite theme of Pagels': that not all early believers embraced doctrines now accepted as handed down directly from Jesus.


Fresh Air
March 14, 2007

'The Gospel of Judas and the Shaping of Christianity'

Religion scholars Elaine Pagels and Karen King's new book, Reading Judas: The Gospel of Judas and the Shaping of Christianity, interprets and translates the recently discovered gnostic gospel of Judas.

This is FRESH AIR. I'm Terry Gross.

My guests Elaine Pagels and Karen King are scholars of the Gnostic Gospels, the gospels that were excluded from the New Testament and offer alternative views of the life of Jesus and early Christianity. Pagels and King collaborated on the new book, "Reading Judas," a translation and interpretation of the recently published gospel of Judas. Pagels is a professor of religion at Princeton University and is the author of several books, including "The Gnostic Gospels," which won the National Book Critics Circle Award and the National Book Award. King is a professor at Harvard Divinity School and the author of a book about the gospel of Mary. The gospel of Judas was discovered in the '70s and published last year by the National Geographic Society. In this gospel, which offers a radically different version of Judas' relationship to Jesus, Judas is Jesus' favorite disciple. Although the gospel is ascribed to Judas, it was written about 150 years after the death of Jesus. Pagels says this gospel opens a window on the disputes of second century Christians about the meaning of Judas' betrayal of Jesus and the meaning of Jesus' teachings.

Ms. ELAINE PAGELS: What we're showing in this gospel is how stories about Jesus were told by people who began to tell them and write them down. This is in the New Testament and outside the New Testament, and what they knew is that Judas had handed Jesus over to the people who arrested him. That's what they knew. That's what the earliest account said. And then later, people speculated, `Why did he do it? Why did he commit this crime,' as they saw it? And some said, `Well, he did it for greed. He did it for money.' The gospel of Judas says he did it because Jesus asked him to.

...Ms. KING: You know, Terry, many of your readers I think when they hear this are going to think of it as revisionist, are going to call it revisionist. But I would say what they need to understand first of all is that the story that we have now is from the side of those who won. You know, the winners get to tell history, and what we're doing is not revising history. What we're doing is filling it out. What these new texts are giving us are voices from early Christians that allow us now to hear many sides of the debates and struggles, the experiences that Christians were undergoing in this period. So it is a fuller and richer picture of what was going on in this early Christian movement, and the gospel of Judas gives us one kind of voice. We had not really had voices before that allowed us to hear Christians objecting to the heroization of martyrs.


Skeptical Inquirer
March 1, 2007

Intellectual and creative magnificence

What We Believe But Cannot Prove: Today's Leading Thinkers on Science in the Age of Certainty
Book review by Kenneth W. Krause

Those who wonder what cutting-edge scientists might ponder outside of their classrooms and laboratories need wonder no more. In What We Believe But Cannot Prove, "intellectuals in action" speculate on the frontiers of science, both hard and soft. Skeptics, however, should not be deceived by the title. An ample majority of the more than 100 teasingly short essays included will sate the intellect's appetite for both facts and reasoned theory. John Brockman's new collection features the world's most celebrated and respected scientists and their musings on everything from human pre-history to cosmology and astrophysics, from evolution to extraterrestrial intelligence, and from genetics to theories of consciousness. ....

...What We Believe But Cannot Prove offers an impressive array of insights and challenges that will surely delight curious readers, generalists and specialists alike. Science is intimidating for the vast majority of us. But John Brockman has grown deservedly famous in recent years for his ability to lure these disciplines and their leading practitioners back to Earth, where terrestrials are afforded all-too-rare opportunities to marvel at the intellectual and creative magnificence of science in particular, and at our species' immeasurable potential in all pursuits more generally.

Physics World
March 1, 2007

Intellectual innovator

Former physicist Nathan Myhrvold has been many things — from Bill Gates' right-hand man to the world champion of barbecue. He tells Martin Griffiths how he is now hoping to change the way the world invents

Nathan Myhrvold has only ever worked for three people: first Stephen Hawking; then Steve Ballmer, current chief executive of Microsoft; and finally Bill Gates. "Each had more money and less formal education than the last," jokes Myhrvold, "so I had nowhere left to go." Maybe that is part of the reason that Myhrvold, a former physicist, left his job as chief strategist and head of technology at Microsoft in 2000 and set out on his own to revolutionize the intellectual property market.

Myhrvold's company, Intellectual Ventures (IV), does not make anything. Instead, it trades in ideas, specifically patents. Myhrvold has recruited a team of about 30 "senior inventors" to think for him — scientists, technologists and business leaders who he thinks have not "sold all of their brain" in their day jobs. By bringing these talented people from diverse fields together for "invention sessions" at IV's offices in Bellevue, Washington, Myhrvold hopes to stimulate new ideas that his team of lawyers can patent.

The approach seems to be working: IV files about 400 patents each year based on the ideas of its inventors. The model for making money from these patents varies. For what Myhrvold calls "evolutionary ideas" — incremental progress in areas that already have a large market — IV will usually license its patents to existing companies. But Myhrvold is more excited about "revolutionary ideas", those that are at least five years away from commercialization and so might be more suited to new spin-out companies. For their part, the inventors are paid for their time, and offered a share of any profits resulting from their invention. ...

...Myhrvold also funds research into dinosaur paleontology, and even does some research on the topic on the side. In 2000 he had a paper published in Nature on his co-discovery of a bird-like tail bone from a non-avian dinosaur in Mongolia. Most people's hobbies do not end up being published in leading scientific journals, but then Myhrvold makes a habit of excelling.

A case in point is his passion for cookery, which led to him training as a French chef, working part-time in a top Seattle restaurant and winning the barbecue world championship in Memphis in 1991. Myhrvold is even working on a cookbook that will include a section on "physics for chefs", which will cover using the heat-diffusion equation to explain how quickly a steak cooks.
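The "physics for chefs" idea is easy to illustrate. The sketch below is not from Myhrvold's book — it is a minimal, assumed example of what the heat-diffusion equation implies for a steak: solving the one-dimensional equation with a simple explicit finite-difference scheme (using rough, made-up values for the thermal diffusivity of meat and for the pan and doneness temperatures) shows why doubling a steak's thickness roughly quadruples its cooking time.

```python
# Illustrative sketch only: 1D heat diffusion through a steak,
# solved with an explicit finite-difference scheme.
# All numbers (diffusivity, temperatures) are rough assumptions.

def time_to_cook(thickness_m, alpha=1.4e-7, t_pan=200.0, t_start=5.0,
                 t_done=55.0, n=51):
    """Return simulated seconds until the center reaches t_done (deg C)."""
    dx = thickness_m / (n - 1)
    dt = 0.4 * dx * dx / alpha          # time step chosen for stability
    temp = [t_start] * n
    temp[0] = temp[-1] = t_pan          # both surfaces held at pan temperature
    t = 0.0
    while temp[n // 2] < t_done:
        new = temp[:]                   # boundaries stay fixed at t_pan
        for i in range(1, n - 1):
            new[i] = temp[i] + alpha * dt / (dx * dx) * (
                temp[i + 1] - 2 * temp[i] + temp[i - 1])
        temp = new
        t += dt
    return t

thin = time_to_cook(0.02)    # 2 cm steak
thick = time_to_cook(0.04)   # 4 cm steak
print(thick / thin)          # doubling thickness roughly quadruples the time
```

With these assumed numbers the point is not the absolute cooking time but the scaling: pure diffusion gives t proportional to thickness squared, which falls straight out of the equation.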

With all these projects on the boil, Myhrvold admits that he rarely has time to sit back and relax. But his restless enthusiasm seems unlikely to pause for breath any time soon. As Pendry remarks in passing, "Have you asked him about his plan to salvage three new obelisks from the Nile?"...

March 19, 2007

Our Books, Ourselves
Baby boomers and their books—it's a love story where nobody ever said he was sorry. Except, perhaps, for 'Love Story' itself.
By Malcolm Jones

...The best memoirists are cranks at heart. They don't want anyone else telling them how it all happened. They want to do the telling. They're like some old coot down in the basement building a car from scratch. And what is that if not a boomer trait, maybe the best of all? ... Self-reliance, self-invention—these ideas are as old as the republic. But in their art—from rock to graphic novels—the boomers took these concepts about as far as any generation has. There are few, if any, schools of writing, just lots of individualist writers going their own way.

In that light, if I had to nominate one book to stand for my generation, it wouldn't be a novel, or a memoir, or a graphic novel. It would be The Whole Earth Catalog. First published in 1968, the brainchild of Stewart Brand went through many subsequent editions. In it you could find information on raising goats, building a geodesic dome—just about anything. It was the first place I heard of the architectural writing of Christopher Alexander, solar power, anthropologist Gregory Bateson, the tools of Smith & Hawken and the excellent novelist Gurney Norman, whose "Divine Right's Trip" was first published on every other page of the first edition of the catalog. The catalog's subtitle was "Access to Tools," and the first lesson it taught me was that a book is a kind of tool, a thing you use to learn with.

It is, at last, out of print, but that fact belies this ultimate baby-boomer bible's profound influence on the culture—not the counterculture but the whole culture. In a way, you could say that the catalog put itself out of business, because it so successfully anticipated the way we currently gain access to information—to almost everything, really. Not coincidentally, Brand was an early fan of the computer and the Internet. The way I searched for information in the pages of that counterculture wishbook, one reference leading to another, in an endless chain of influence, is almost exactly the way I use a computer. The Whole Earth Catalog was just the first, and very successful, prototype of a search engine.

"We are as gods, and might as well get good at it," Brand wrote in the first edition of The Whole Earth Catalog. When it comes to books, I think we took his advice. ...