EDGE 1 — December 21, 1996


THE THIRD CULTURE

"SCIENCE, DELUSION AND THE APPETITE FOR WONDER"
A Talk By Richard Dawkins

THE REALITY CLUB
Responses to "Science, Delusion and the Appetite for Wonder": Murray Gell-Mann, Milford Wolpoff, Reuben Hersh, Karl Sabbagh, Duncan Steele, Stanislas DeHaene, Joseph Ledoux, Margie Profet, Paul Davies, Robert Shapiro, Carl Djerassi

"Spare Me Your Memes": Jaron Lanier debates Charles Simonyi and Mike Godwin on the concept and value of Memes


(17,400 words)


John Brockman, Editor and Publisher | Kip Parent, Webmaster


THE THIRD CULTURE



"SCIENCE, DELUSION AND THE APPETITE FOR WONDER"
A Talk By Richard Dawkins
Introduction by John Brockman


The universe is changing in time, and it has evolved from something simpler to something more complex. That is the lesson to be learned from recent advances in evolutionary theory; the emergence of order has colored biology since Darwin and twentieth-century cosmology alike.

In Darwin's day, the exact manner of the inheritance of characteristics was not known; Darwin himself believed that certain characteristics were acquired by an organism as a result of environmental change and could be passed to the organism's offspring, an idea popularized by the French naturalist Jean-Baptiste Lamarck. In 1900, the work done by Mendel some fifty years earlier was brought to light, and the gene, though its exact nature was unknown at the time, became a player in "the modern synthesis" of Mendel and Darwin. This synthesis, which reconciled genetics per se with Darwin's vision of natural selection, was carried out in the early 1930s by R.A. Fisher, J.B.S. Haldane, and Sewall Wright, and augmented a few years later by the work of the paleontologist George Gaylord Simpson, the biologist Ernst Mayr, and the geneticist Theodosius Dobzhansky, who expanded on this neo-Darwinian paradigm. Nevertheless, there is still discord in the ranks of evolutionary biologists. The principal debates are concerned with the mechanism of speciation; whether natural selection operates at the level of the gene, the organism, or the species, or all three; and also with the relative importance of other factors, such as natural catastrophes.

Richard Dawkins is firmly in the ultra-Darwinist camp. "It rapidly became clear to me," he says, "that the most imaginative way of looking at evolution, and the most inspiring way of teaching it, was to say that it's all about the genes. It's the genes that, for their own good, are manipulating the bodies they ride about in. The individual organism is a survival machine for its genes."

Dawkins is an evolutionary biologist and the Charles Simonyi Professor For The Understanding Of Science at Oxford University; Fellow of New College; author of The Selfish Gene (1976, 2d ed. 1989), The Extended Phenotype (1982), The Blind Watchmaker (1986), River out of Eden (1995), and Climbing Mount Improbable (1996). He is a gifted writer, who is known for his popularization of Darwinian ideas as well as for original thinking on evolutionary theory. He has invented telling metaphors that illuminate the Darwinian debate: His book The Selfish Gene argues that genes—molecules of DNA—are the fundamental units of natural selection, the "replicators." Organisms, including ourselves, are "vehicles," the packaging for "replicators." The success or failure of replicators is based on their ability to build successful vehicles. There is a complementarity in the relationship: vehicles propagate their replicators, not themselves; replicators make vehicles. In The Extended Phenotype, he goes beyond the body to the family, the social group, the architecture, the environment that animals create, and sees these as part of the phenotype—the embodiment of the genes. He also takes a Darwinian view of culture, exemplified in his invention of the "meme," the unit of cultural inheritance; memes are essentially ideas, and they, too, are operated on by natural selection.

Richard Dawkins enjoys the high regard of his peers both for his writing and his thinking. Sir John Maddox, editor emeritus of Nature, notes that "Climbing Mount Improbable has the grandeur of Darwin's Origin of Species, but that's not surprising—it covers the same ground. Nobody can look at this book and then put it down unread—and nobody who reads it can fail to understand what Darwin is all about." According to Danny Hillis, "notions like selfish genes, memes, and extended phenotype are powerful and exciting. They make me think differently. Unfortunately, I spend a lot of time arguing against people who have overinterpreted these ideas. They're too easily misunderstood as explaining more than they do. So you see, this Dawkins is a dangerous guy. Like Marx. Or Darwin."

In his role as the Charles Simonyi Professor For The Understanding Of Science at Oxford University, Dawkins regularly talks to the public regarding his views on the wonders of science. Several weeks ago, on November 12th, 1996, he delivered the Richard Dimbleby Lecture on BBC1 Television in England, entitled "Science, Delusion and the Appetite for Wonder." The complete text appears below.

JB


"SCIENCE, DELUSION AND THE APPETITE FOR WONDER"
A Talk By Richard Dawkins

You could give Aristotle a tutorial. And you could thrill him to the core of his being. Aristotle was an encyclopedic polymath, an all-time intellect. Yet not only can you know more than he did about the world. You also can have a deeper understanding of how everything works. Such is the privilege of living after Newton, Darwin, Einstein, Planck, Watson, Crick and their colleagues.

I'm not saying you're more intelligent than Aristotle, or wiser. For all I know, Aristotle's the cleverest person who ever lived. That's not the point. The point is only that science is cumulative, and we live later.

Aristotle had a lot to say about astronomy, biology and physics. But his views sound weirdly naive today. Not as soon as we move away from science, however. Aristotle could walk straight into a modern seminar on ethics, theology, political or moral philosophy, and contribute. But let him walk into a modern science class and he'd be a lost soul. Not because of the jargon, but because science advances, cumulatively.

Here's a small sample of the things you could tell Aristotle, or any other Greek philosopher. And surprise and enthral them, not just with the facts themselves but with how they hang together so elegantly.

The earth is not the centre of the universe. It orbits the sun — which is just another star. There is no music of the spheres, but the chemical elements, from which all matter is made, arrange themselves cyclically, in something like octaves. There are not four elements but about 100. Earth, air, fire and water are not among them.

Living species are not isolated types with unchanging essences. Instead, over a time scale too long for humans to imagine, they split and diverge into new species, which then go on diverging further and further. For the first half of geological time our ancestors were bacteria. Most creatures still are bacteria, and each one of our trillions of cells is a colony of bacteria. Aristotle was a distant cousin to a squid, a closer cousin to a monkey, a closer cousin still to an ape (strictly speaking, Aristotle was an ape, an African ape, a closer cousin to a chimpanzee than a chimp is to an orangutan).

The brain is not for cooling the blood. It's what you use to do your logic and your metaphysics. It's a three dimensional maze of a million million nerve cells, each one drawn out like a wire to carry pulsed messages. If you laid all your brain cells end to end, they'd stretch round the world 25 times. There are about 4 million million connections in the tiny brain of a chaffinch, proportionately more in ours.
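A rough back-of-envelope check of that "round the world 25 times" figure; the cell count and circumference below are my own round numbers, not Dawkins's:

# Rough check: if 1e12 brain cells laid end to end circle the Earth 25 times,
# how long is the average cell's "wire"? Assumed round numbers, not Dawkins's.

n_cells = 1e12                    # "a million million nerve cells"
earth_circumference_km = 40_000   # approximate
laps = 25

total_length_km = laps * earth_circumference_km
avg_per_cell_mm = total_length_km * 1e6 / n_cells   # 1 km = 1e6 mm

print(f"total length: {total_length_km:,} km")        # 1,000,000 km
print(f"average per cell: {avg_per_cell_mm:.1f} mm")  # about 1 mm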

Now, if you're anything like me, you'll have mixed feelings about that recitation. On the one hand, pride in what Aristotle's species now knows and didn't then. On the other hand an uneasy feeling of, "Isn't it all a bit complacent? What about our descendants, what will they be able to tell us?"

Yes, for sure, the process of accumulation doesn't stop with us. 2000 years hence, ordinary people who have read a couple of books will be in a position to give a tutorial to today's Aristotles: to Francis Crick, say, or Stephen Hawking. So does this mean that our view of the universe will turn out to be just as wrong?

Let's keep a sense of proportion about this! Yes, there's much that we still don't know. But surely our belief that the earth is round and not flat, and that it orbits the sun, will never be superseded. That alone is enough to confound those, endowed with a little philosophical learning, who deny the very possibility of objective truth: those so-called relativists who see no reason to prefer scientific views over aboriginal myths about the world.

Our belief that we share ancestors with chimpanzees, and more distant ancestors with monkeys, will never be superseded although details of timing may change. Many of our ideas, on the other hand, are still best seen as theories or models whose predictions, so far, have survived the test. Physicists disagree over whether they are condemned forever to dig for deeper mysteries, or whether physics itself will come to an end in a final 'theory of everything', a nirvana of knowledge. Meanwhile, there is so much that we don't yet understand, we should loudly proclaim those things that we do, so as to focus attention on problems that we should be working on.

Far from being over-confident, many scientists believe that science advances only by disproof of its hypotheses. Konrad Lorenz said he hoped to disprove at least one of his own hypotheses every day before breakfast. That was absurd, especially coming from the grand old man of the science of ethology, but it is true that scientists, more than others, impress their peers by admitting their mistakes.

A formative influence on my undergraduate self was the response of a respected elder statesman of the Oxford Zoology Department when an American visitor had just publicly disproved his favourite theory. The old man strode to the front of the lecture hall, shook the American warmly by the hand and declared in ringing, emotional tones: "My dear fellow, I wish to thank you. I have been wrong these fifteen years." And we clapped our hands red. Can you imagine a Government Minister being cheered in the House of Commons for a similar admission? "Resign, Resign" is a much more likely response!

Yet there is hostility towards science. And not just from the green ink underlining brigade, but from published novelists and newspaper columnists. Newspaper columns are notoriously ephemeral, but their drip drip, week after week, or day after day, repetition gives them influence and power, and we have to notice them. A peculiar feature of the British press is the regularity with which some of its leading columnists return to attack science — and not always from a vantage point of knowledge. A few weeks ago, Bernard Levin's effusion in The Times was entitled "God, me and Dr Dawkins" and it had the subtitle: "Scientists don't know and nor do I — but at least I know I don't know".

It is no mean task to plumb the full depths of what Mr Bernard Levin does not know, but here's an illustration of the gusto with which he boasts of it.

"Despite their access to copious research funds, today's scientists have yet to prove that a quark is worth a bag of beans. The quarks are coming! The quarks are coming! Run for your lives . . .! Yes, I know I shouldn't jeer at science, noble science, which, after all, gave us mobile telephones, collapsible umbrellas and multi-striped toothpaste, but science really does ask for it . . . Now I must be serious. Can you eat quarks? Can you spread them on your bed when the cold weather comes?"

It doesn't deserve a reply, but the distinguished Cambridge scientist, Sir Alan Cottrell, wrote a brief Letter to the Editor:— "Sir: Mr Bernard Levin asks 'Can you eat quarks?' I estimate that he eats 500,000,000,000,000,000,000 quarks a day."

It has become almost a cliché to remark that nobody boasts of ignorance of literature, but it is socially acceptable to boast ignorance of science and proudly claim incompetence in mathematics. In Britain, that is. I believe the same is not true of our more successful economic competitors, Germany, the United States and Japan.

People certainly blame science for nuclear weapons and similar horrors. It's been said before but needs to be said again: if you want to do evil, science provides the most powerful weapons to do evil; but equally, if you want to do good, science puts into your hands the most powerful tools to do so. The trick is to want the right things, then science will provide you with the most effective methods of achieving them.

An equally common accusation is that science goes beyond its remit. It's accused of a grasping take-over bid for territory that properly belongs to other disciplines such as theology. On the other hand — you can't win! — listen to the novelist Fay Weldon's hymn of hate against 'the scientists' in The Daily Telegraph.

"Don't expect us to like you. You promised us too much and failed to deliver. You never even tried to answer the questions we all asked when we were six. Where did Aunt Maud go when she died? Where was she before she was born? . . . And who cares about half a second after the Big Bang; what about half a second before? And what about crop circles?"

More than some of my colleagues, I am perfectly happy to give a simple and direct answer to both those Aunt Maud questions. But I'd certainly be called arrogant and presumptuous, going beyond the limits of science.

Then there's the view that science is dull and plodding, with rows of biros in its top pocket. Here's another newspaper columnist, A A Gill, writing on science this year in The Sunday Times.

"Science is constrained by experiment results and the tedious, plodding stepping stones of empiricism . . . What appears on television just is more exciting than what goes on in the back of it . . . That's art, luvvie: theatre, magic, fairy dust, imagination, lights, music, applause, my public. There are stars and there are stars, darling. Some are dull, repetitive squiggles on paper, and some are fabulous, witty, thought-provoking, incredibly popular . . ."

The 'dull, repetitive squiggles' is a reference to the discovery of pulsars in 1967, by Jocelyn Bell and Antony Hewish. Jocelyn Bell Burnell had recounted on television the spine-tingling moment when, a young woman on the threshold of a career, she first knew she was in the presence of something hitherto unheard-of in the universe. Not something new under the sun, a whole new KIND of sun, which rotates so fast that, instead of taking 24 hours like our planet, it takes a quarter of a second. Darling, how too plodding, how madly empirical my dear!

Could science just be too difficult for some people, and therefore seem threatening? Oddly enough, I wouldn't dare to make such a suggestion, but I am happy to quote a distinguished literary scholar, John Carey, the present Merton Professor of English at Oxford:

"The annual hordes competing for places on arts courses in British universities, and the trickle of science applicants, testify to the abandonment of science among the young. Though most academics are wary of saying it straight out, the general consensus seems to be that arts courses are popular because they are easier, and that most arts students would simply not be up to the intellectual demands of a science course."

My own view is that the sciences can be intellectually demanding, but so can classics, so can history, so can philosophy. On the other hand, nobody should have trouble understanding things like the circulation of the blood and the heart's role in pumping it round. Carey quoted Donne's lines to a class of 30 undergraduates in their final year reading English at Oxford:

"Knows't thou how blood, which to the heart doth flow,
Doth from one ventricle to the other go?"

Carey asked them how, as a matter of fact, the blood does flow. None of the thirty could answer, and one tentatively guessed that it might be 'by osmosis'. The truth — that the blood is pumped from ventricle to ventricle through at least 50 miles of intricately dissected capillary vessels throughout the body — should fascinate any true literary scholar. And unlike, say, quantum theory or relativity, it isn't hard to understand. So I tender a more charitable view than Professor Carey. I wonder whether some of these young people might have been positively turned off science.

Last month I had a letter from a television viewer who poignantly began: "I am a clarinet teacher whose only memory of science at school was a long period of studying the Bunsen burner." Now, you can enjoy the Mozart concerto without being able to play the clarinet. You can be a discerning and informed concert critic without being able to play a note. Of course music would come to a halt if nobody learned to play it. But if everybody left school thinking you had to play an instrument before you could appreciate music, think how impoverished many lives would be.

Couldn't we treat science in the same way? Yes, we must have Bunsen burners and dissecting needles for those drawn to advanced scientific practice. But perhaps the rest of us could have separate classes in science appreciation, the wonder of science, scientific ways of thinking, and the history of scientific ideas, rather than laboratory experience.

It's here that I'd seek rapprochement with another apparent foe of science, Simon Jenkins, former editor of The Times and a much more formidable adversary than the other journalists I've quoted, because he has some knowledge of what he is talking about. He resents compulsory science education and he holds the idiosyncratic view that it isn't useful. But he is thoroughly sound on the uplifting qualities of science. In a recorded conversation with me, he said:

"I can think of very few science books I've read that I've called useful. What they've been is wonderful. They've actually made me feel that the world around me is a much fuller . . . much more awesome place than I ever realised it was . . . I think that science has got a wonderful story to tell. But it isn't useful. It's not useful like a course in business studies or law is useful, or even a course in politics and economics."

Far from science not being useful, my worry is that it is so useful as to overshadow and distract from its inspirational and cultural value. Usually even its sternest critics concede the usefulness of science, while completely missing the wonder. Science is often said to undermine our humanity, or destroy the mystery on which poetry is thought to thrive. Keats berated Newton for destroying the poetry of the rainbow.

"Philosophy will clip an Angel's wings,
Conquer all mysteries by rule and line,
Empty the haunted air, and gnomed mine —
Unweave a rainbow . . ."

Keats was, of course, a very young man.

Blake, too, lamented:

"For Bacon and Newton, sheath'd in dismal steel, their terrors hang Like iron scourges over Albion; Reasonings like vast Serpents Infold around my limbs . . ."

I wish I could meet Keats or Blake to persuade them that mysteries don't lose their poetry because they are solved. Quite the contrary. The solution often turns out more beautiful than the puzzle, and anyway the solution uncovers deeper mystery. The rainbow's dissection into light of different wavelengths leads on to Maxwell's equations, and eventually to special relativity.

Einstein himself was openly ruled by an aesthetic scientific muse: "The most beautiful thing we can experience is the mysterious. It is the source of all true art and science", he said. It's hard to find a modern particle physicist who doesn't own to some such aesthetic motivation. Typical is John Wheeler, one of the distinguished elder statesmen of American physics today:

" . . . we will grasp the central idea of it all as so simple, so beautiful, so compelling that we will all say each to the other, 'Oh, how could it have been otherwise! How could we all have been so blind for so long!'"

Wordsworth might have understood this better than his fellow romantics. He looked forward to a time when scientific discoveries would become "proper objects of the poet's art". And, at the painter Benjamin Haydon's dinner of 1817, he endeared himself to scientists, and endured the taunts of Keats and Charles Lamb, by refusing to join in their toast: "Confusion to mathematics and Newton".

Now, here's an apparent confusion: T H Huxley saw science as "nothing but trained and organized common sense", while Professor Lewis Wolpert insists that it's deeply paradoxical and surprising, an affront to commonsense rather than an extension of it. Every time you drink a glass of water, you are probably imbibing at least one atom that passed through the bladder of Aristotle. A tantalisingly surprising result, but it follows by Huxley-style organized common sense from Wolpert's observation that "there are many more molecules in a glass of water than there are glasses of water in the sea".
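Here, for the curious, is the back-of-envelope arithmetic behind that claim; the glass size, ocean volume and the assumption of perfect mixing are my own, not Wolpert's or Dawkins's:

# Sketch of the Aristotle's-bladder argument. Assumed figures (not from the
# lecture): a 250 ml glass, oceans of about 1.3 billion cubic kilometres,
# and Aristotle's glassful of water mixed evenly through the sea since then.

AVOGADRO = 6.022e23
glass_litres = 0.25
molar_mass_water_g = 18.0

molecules_per_glass = (glass_litres * 1000 / molar_mass_water_g) * AVOGADRO
ocean_litres = 1.3e9 * 1e12            # 1.3e9 km^3, and 1 km^3 = 1e12 litres
glasses_in_sea = ocean_litres / glass_litres

# Expected number of "Aristotle's" molecules in any later glass:
overlap = molecules_per_glass / glasses_in_sea

print(f"molecules per glass: {molecules_per_glass:.1e}")   # ~8e24
print(f"glasses in the sea:  {glasses_in_sea:.1e}")        # ~5e21
print(f"expected overlap:    about {overlap:.0f} molecules")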

Science runs the gamut from the tantalisingly surprising to the deeply strange, and ideas don't come any stranger than Quantum Mechanics. More than one physicist has said something like: "If you think you understand quantum theory, you don't understand quantum theory."

There is mystery in the universe, beguiling mystery, but it isn't capricious, whimsical, frivolous in its changeability. The universe is an orderly place and, at a deep level, regions of it behave like other regions, times behave like other times. If you put a brick on a table it stays there unless something lawfully moves it, even if you meanwhile forget it's there. Poltergeists and sprites don't intervene and hurl it about for reasons of mischief or caprice. There is mystery but not magic, strangeness beyond the wildest imagining, but no spells or witchery, no arbitrary miracles.

Even science fiction, though it may tinker with the laws of nature, can't abolish lawfulness itself and remain good science fiction. Young women don't take off their clothes and spontaneously morph themselves into wolves. A recent television drama is fairytale rather than science fiction, for this reason. It falls foul of a theoretical prohibition much deeper than the philosopher's "All swans are white — until a black one turns up" inductive reasoning. We know people can't metamorphose into wolves, not because the phenomenon has never been observed — plenty of things happen for the first time — but because werewolves would violate the equivalent of the second law of thermodynamics. Of this, Sir Arthur Eddington said:

"If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation."

To pursue the relationship between werewolves and entropy would take me too far afield. But, since this lecture commemorates a man whose integrity and honesty as a broadcaster is still an abiding legend 30 years after his death, I'll stay for a moment with the current epidemic of paranormal propaganda on television.

In one popular type of programming, conjurers come on and do routine tricks. But instead of admitting that they are conjurers, these television performers claim genuinely supernatural powers. In this they are abetted by prestigious, even knighted, presenters, people whom we have got into the habit of trusting, broadcasters who have become role models. It is an abuse of what might be called the Richard Dimbleby Effect.

In other programmes, disturbed people recount their fantasies of ghosts and poltergeists. But instead of sending them off to a kindly psychiatrist, television producers eagerly hire actors to re-create their delusions — with predictable effects on the credulity of large audiences.

Recently, a faith healer was given half an hour of free prime time television, to advertise his bizarre claim to be a 2000-year-dead physician called Paul of Judea. Some might call this entertainment, comedy even, though others would find it objectionable entertainment, like a fairground freak show.

Now I obviously have to return to the arrogance problem. How can I be so sure that this ordinary Englishman with an unlikely foreign accent was not the long dead Paul of Judea? How do I know that astrology doesn't work? How can I be so confident that the television 'supernaturalists' are ordinary conjurers, just because ordinary conjurers can replicate their tricks? (Spoonbending, by the way, is so routine a trick that the American conjurers Penn and Teller have posted instructions for doing it on the Internet! See http://www.randi.org/jr/ptspoon.html).

It really comes down to parsimony, economy of explanation. It is possible that your car engine is driven by psychokinetic energy, but if it looks like a petrol engine, smells like a petrol engine and performs exactly as well as a petrol engine, the sensible working hypothesis is that it is a petrol engine. Telepathy and possession by the spirits of the dead are not ruled out as a matter of principle. There is certainly nothing impossible about abduction by aliens in UFOs. One day it may happen. But on grounds of probability it should be kept as an explanation of last resort. It is unparsimonious, demanding more than routinely weak evidence before we should believe it. If you hear hooves clip-clopping down a London street, it could be a zebra or even a unicorn, but, before we assume that it's anything other than a horse, we should demand a certain minimal standard of evidence.

It's been suggested that if the supernaturalists really had the powers they claim, they'd win the lottery every week. I prefer to point out that they could also win a Nobel Prize for discovering fundamental physical forces hitherto unknown to science. Either way, why are they wasting their talents doing party turns on television?

By all means let's be open-minded, but not so open-minded that our brains drop out. I'm not asking for all such programmes to be suppressed, merely that the audience should be encouraged to be critical. In the case of the psychokineticists and thought-readers, it would be good entertainment to invite studio audiences to suggest critical tests, which only genuine psychics, but not ordinary conjurers, could pass. It would make a good, entertaining form of quiz show.

How do we account for the current paranormal vogue in the popular media? Perhaps it has something to do with the millennium — in which case it's depressing to realise that the millennium is still three years away. Less portentously, it may be an attempt to cash in on the success of The X-Files. This is fiction and therefore defensible as pure entertainment.

A fair defence, you might think. But soap operas, cop series and the like are justly criticised if, week after week, they ram home the same prejudice or bias. Each week The X-Files poses a mystery and offers two rival kinds of explanation, the rational theory and the paranormal theory. And, week after week, the rational explanation loses. But it is only fiction, a bit of fun, why get so hot under the collar?

Imagine a crime series in which, every week, there is a white suspect and a black suspect. And every week, lo and behold, the black one turns out to have done it. Unpardonable, of course. And my point is that you could not defend it by saying: "But it's only fiction, only entertainment".

Let's not go back to a dark age of superstition and unreason, a world in which every time you lose your keys you suspect poltergeists, demons or alien abduction.

Enough, let me turn to happier matters. The popularity of the paranormal, oddly enough, might even be grounds for encouragement. I think that the appetite for mystery, the enthusiasm for that which we do not understand, is healthy and to be fostered. It is the same appetite which drives the best of true science, and it is an appetite which true science is best qualified to satisfy. Perhaps it is this appetite that underlies the ratings success of the paranormalists.

I believe that astrologers, for instance, are playing on — misusing, abusing — our sense of wonder. I mean when they hijack the constellations, and employ sub-poetic language like the moon moving into the fifth house of Aquarius. Real astronomy is the rightful proprietor of the stars and their wonder. Astrology gets in the way, even subverts and debauches the wonder.

To show how real astronomical wonder can be presented to children, I'll borrow from a book called Earthsearch by John Cassidy, which I brought back from America to show my daughter Juliet. Find a large open space and take a soccer ball to represent the sun. Put the ball down and walk ten paces in a straight line. Stick a pin in the ground. The head of the pin stands for the planet Mercury. Take another 9 paces beyond Mercury and put down a peppercorn to represent Venus. Seven paces on, drop another peppercorn for Earth. One inch away from earth, another pinhead represents the Moon, the furthest place, remember, that we've so far reached. 14 more paces to little Mars, then 95 paces to giant Jupiter, a ping-pong ball. 112 paces further, Saturn is a marble. No time to deal with the outer planets except to say that the distances are much larger. But, how far would you have to walk to reach the nearest star, Proxima Centauri? Pick up another soccer ball to represent it, and set off for a walk of 4200 miles. As for the nearest other galaxy, Andromeda, don't even think about it!
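If you want to redo the sums for yourself, here is a minimal sketch of the scaling behind that walk; the ball diameter, pace length and orbital distances are my own assumed round figures, not taken from Cassidy's book:

# Earthsearch-style scale model: shrink the Sun to a soccer ball and see
# where the planets land. Assumed figures (not from the book): a 22 cm ball,
# a 0.9 m pace, mean orbital distances in astronomical units (AU).

SUN_DIAMETER_M = 1.39e9
BALL_DIAMETER_M = 0.22
PACE_M = 0.9
AU_M = 1.496e11
LIGHT_YEAR_M = 9.46e15

scale = BALL_DIAMETER_M / SUN_DIAMETER_M    # roughly 1 : 6,300,000,000

orbits_au = {
    "Mercury": 0.39, "Venus": 0.72, "Earth": 1.00,
    "Mars": 1.52, "Jupiter": 5.20, "Saturn": 9.54,
}

for name, au in orbits_au.items():
    metres = au * AU_M * scale
    print(f"{name:8s} {metres:6.1f} m  (~{metres / PACE_M:.0f} paces from the ball)")

proxima_m = 4.24 * LIGHT_YEAR_M * scale     # nearest star, ~4.24 light years
print(f"Proxima Centauri: about {proxima_m / 1000:,.0f} km on this scale")

On these assumed figures the model Earth lands about 24 metres from the ball, close to the 26 paces of the book's ten-plus-nine-plus-seven, and the nearest star comes out several thousand miles away, in the same region as the 4200 miles quoted above.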

Who'd go back to astrology when they've sampled the real thing — astronomy, Yeats's "starry ways", his "lonely, majestical multitude"? The same lovely poem encourages us to "Remember the wisdom out of the old days" and I want to end with a little piece of wonder from my own territory of evolution.

You contain a trillion copies of a large, textual document written in a highly accurate, digital code, each copy as voluminous as a substantial book. I'm talking, of course, of the DNA in your cells. Textbooks describe DNA as a blueprint for a body. It's better seen as a recipe for making a body, because it is irreversible. But today I want to present it as something different again, and even more intriguing. The DNA in you is a coded description of ancient worlds in which your ancestors lived. DNA is the wisdom out of the old days, and I mean very old days indeed.

The oldest human documents go back a few thousand years, originally written in pictures. Alphabets seem to have been invented about 35 centuries ago in the Middle East, and they've changed and spawned numerous varieties of alphabet since then. The DNA alphabet arose at least 35 million centuries ago. Since that time, it hasn't changed one jot. Not just the alphabet, the dictionary of 64 basic words and their meanings is the same in modern bacteria and in us. Yet the common ancestor from whom we both inherited this precise and accurate dictionary lived at least 35 million centuries ago.
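To make that "dictionary of 64 basic words" concrete, here is a toy sketch; the handful of codon meanings listed follow the standard genetic code, but the table is deliberately incomplete and the example sequence is invented:

# A tiny slice of the universal 64-word dictionary: DNA triplets (codons)
# and the amino acids they stand for. Only 9 of the 64 entries are shown,
# and the example sequence below is invented for illustration.

CODON_TABLE = {
    "ATG": "Met (start)", "TGG": "Trp", "TTT": "Phe", "AAA": "Lys",
    "GGC": "Gly", "GAA": "Glu", "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna):
    """Read the DNA three letters at a time and look up each word."""
    codons = [dna[i:i + 3] for i in range(0, len(dna) - 2, 3)]
    return [CODON_TABLE.get(c, "?") for c in codons]

print(translate("ATGTTTGGCAAATGA"))
# ['Met (start)', 'Phe', 'Gly', 'Lys', 'STOP']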

What changes is the long programs that natural selection has written using those 64 basic words. The messages that have come down to us are the ones that have survived millions, in some cases hundreds of millions, of generations. For every successful message that has reached the present, countless failures have fallen away like the chippings on a sculptor's floor. That's what Darwinian natural selection means. We are the descendants of a tiny élite of successful ancestors. Our DNA has proved itself successful, because it is here. Geological time has carved and sculpted our DNA to survive down to the present.

There are perhaps 30 million distinct species in the world today. So, there are 30 million distinct ways of making a living, ways of working to pass DNA on to the future. Some do it in the sea, some on land. Some up trees, some underground. Some are plants, using solar panels — we call them leaves — to trap energy. Some eat the plants. Some eat the herbivores. Some are big carnivores that eat the small ones. Some live as parasites inside other bodies. Some live in hot springs. One species of small worms is said to live entirely inside German beer mats. All these different ways of making a living are just different tactics for passing on DNA. The differences are in the details.

The DNA of a camel was once in the sea, but it hasn't been there for a good 300 million years. It has spent most of recent geological history in deserts, programming bodies to withstand dust and conserve water. Like sandbluffs carved into fantastic shapes by the desert winds, camel DNA has been sculpted by survival in ancient deserts to yield modern camels.

At every stage of its geological apprenticeship, the DNA of a species has been honed and whittled, carved and rejigged by selection in a succession of environments. If only we could read the language, the DNA of tuna and starfish would have 'sea' written into the text. The DNA of moles and earthworms would spell 'underground'. Of course all the DNA would spell many other things as well. Shark and cheetah DNA would spell 'hunt', as well as separate messages about sea and land.

We can't read these messages yet. Maybe we never shall, for their language is indirect, as befits a recipe rather than a reversible blueprint. But it's still true that our DNA is a coded description of the worlds in which our ancestors survived. We are walking archives of the African Pliocene, even of Devonian seas, walking repositories of wisdom out of the old days. You could spend a lifetime reading such messages and die unsated by the wonder of it.

We are going to die, and that makes us the lucky ones. Most people are never going to die because they are never going to be born. The potential people who could have been standing in my place but who will never see the light of day outnumber the sand grains of the Sahara — more, the atoms in the universe. Certainly those unborn ghosts include greater poets than Donne, greater scientists than Newton, greater composers than Beethoven. We know this because the set of possible people allowed by our DNA so massively outnumbers the set of actual people. In the teeth of these stupefying odds it is you and I that are privileged to be here, privileged with eyes to see where we are and brains to wonder why.

There is an appetite for wonder, and isn't true science well qualified to feed it?

It's often said that people 'need' something more in their lives than just the material world. There is a gap that must be filled. People need to feel a sense of purpose. Well, not a BAD purpose would be to find out what is already here, in the material world, before concluding that you need something more. How much more do you want? Just study what is, and you'll find that it already is far more uplifting than anything you could imagine needing.

You don't have to be a scientist — you don't have to play the Bunsen burner — in order to understand enough science to overtake your imagined need and fill that fancied gap. Science needs to be released from the lab into the culture.

Link: The Unofficial Dawkins Website



THE REALITY CLUB



Responses to "Science, Delusion and the Appetite for Wonder"

Murray Gell-Mann, Milford Wolpoff, Reuben Hersh, Karl Sabbagh, Duncan Steel, Stanislas Dehaene, Joseph LeDoux, Margie Profet, Paul Davies, Robert Shapiro, Carl Djerassi



Current number of posts: 11

Post: 1 Submitted: 12-22-96
From: Murray Gell-Mann

I enjoyed reading on my email the piece by Richard Dawkins on the present state of siege in which science finds itself, under attack by all sorts of silly people with different silly agendas. It's interesting, though, that we are not alone. It is not only science that is under attack; in fact any sort of expertise is resented, such as the expertise of the historian. Of course I like to include history among the sciences, as it is regularly included in, say, Russia. But if we go into the arts it's the same thing. Many people resent the expertise of artists, as well. Any implication that there is somebody who knows or understands more than the average person in some area or is capable of doing things better in some area than the average person is terribly resented. Although this is not true apparently of sports or of entertainment. But apart from these two areas, if people have expertise in something other than entertaining the public, it seems to provoke a lot of resentment today in many quarters. And so the attack on science should perhaps not be viewed separately but included in the defense of expertise, and intellect generally.

Murray Gell-Mann


Post: 2 Submitted: 12-22-96
From: Milford H. Wolpoff

I really think if we could give Aristotle a tutorial he would ring up the loony bin to get us committed-too much difference in world view and basic assumptions for him to ever understand what we were talking about or why. Actually, Rachel Caspari and I talk quite a bit about this in Race and Human Evolution, and I hope you get a chance to read it.

Happy Holidays.


Post: 3 Submitted: 12-22-96
From: Reuben Hersh

Dawkins' talk is rich with his eloquence and learning. From my perch, I would say also that he is on the side of the angels.

With his main shtick, "science is wonderful," I of course can but assent.

Therefore, I turn to my two bones to pick.

Bone number 1 is his amazing and absurd discovery that we are but devices created by our genes for their own survival. In an era when reductionism is generally being discredited and rejected, this is a piece of reductionism carried to fantastic new heights. Dawkins evidently admires Charles Darwin. Would he say that Darwin developed his theory of evolution for the sake of propagating his genes?

No doubt the same for Newton, Tolstoy, or Beethoven. We should think of them and their work in terms of genetics, not in terms of human consciousness.

Fifty years ago some eminent physicists liked to say that we are nothing but molecules, or nothing but atoms, or nothing but protons, electrons, and neutrons (this was before quarks). That has gone out of style. Genetic or biological reductionism is just as foolish.

My second bone has to do with the talk as a whole. If I agree with his main point, that is no surprise; I am somewhat of a mathematician.

There is something called "preaching to the choir." It's a satisfying thing to do. You tell it like it is and your audience agrees with you. But when you're done, nothing much has changed. If Dawkins was preaching to the unconverted, I'm not sure how far he got with them. If he was preaching to the converted-great! What fun!

Reuben Hersh


Post: 4 Submitted: 12-22-96
From: Karl Sabbagh

I saw Dawkins lecture on TV and thought it was first class-and probably entirely ineffective at changing the views of those he targets. Even Bernard Levin, if he bothered to watch the lecture or read the text, would delight in ignoring or parodying Richard's style as yet another example of the arrogance of scientists and those who support the scientific method. And we are arrogant, if arrogance means not tolerating loose and ignorant thinking. And ignorance can operate in the most intelligent brain, if it has never bothered to understand what the scientific method is and that it should be applied to a far wider range of situations than the profession of scientific research. We need the Dawkins approach, but we also need to find new ways of shaming the people who really need to be brought down to earth-to the realities of science.

best karl


Post: 5 Submitted: 12-23-96
From: Duncan Steele

Re: the Dawkins lecture. I thought it inspirational and effective-but then to me, it would be, wouldn't it? In fact I saw RD deliver the lecture on TV a few weeks back, whilst I was in London. It came over very well. I'd note that it was not transmitted in prime time, but what can one expect...? Of course, this is part of the problem that RD addressed.

That paragraph was meant to deliver some well-deserved praise before I make two criticisms. The first is to point out an error; the second is (perhaps) a matter of opinion.

RD wrote: "Perhaps it has something to do with the millennium-in which case it's depressing to realise that the millennium is still three years away. "

Oh, dear, RD has fallen victim to the popular delusion that the next century/millennium begins on 1st January 2000. In fact it is still four years (and a bit) until the start of the next millennium. No year zero, and all that stuff.

He also wrote: "Certainly those unborn ghosts include greater poets than Donne, greater scientists than Newton, greater composers than Beethoven. "

There are three bases to my objection to that sentence. The first is the simple value-judgment side of things: who is to say who was greater in any sphere?

The second stems from a query about what might disqualify anyone from consideration: should Newton be disqualified on the basis of his alchemic (and other) beliefs? Let me give an extreme example, to show the point: should Hitler be considered a great humanitarian because he was responsible for the introduction of the VW Beetle, still a workhorse in the 1st and 3rd Worlds?

The third comes from a recognition that none of these men came out of a vacuum; as the saying goes, "Cometh the hour, cometh the man" (or woman). The conditions were right, the time was right, for what they did. [Before someone says, "Yes, of course, Newton himself wrote that `If I have seen further it is by standing on the shoulders of giants'" I'll point out that in fact Newton wrote those words to a hunch-backed dwarf, Robert Hooke]. In the context of RD's comment, then, I'd point out that almost certainly there have been many DNA combinations which have occurred in Homo sapiens which-potentially-could have produced "greater" poets/scientists/whatever than Donne or Newton, but the conditions were never right for them to blossom. There are likely several around now, living in China, India, or the Bronx.

I left out Beethoven there since I have to admit that my reason for taking the time to write the above-and for halting in my reading of RD's lecture at the "offending" paragraph-was the mention of that indisputably great MAN. It is in my psychological makeup (I worded that so as to avoid writing "in my nature", as one usually would) to object to any statement which might hint even the merest inflection against Beethoven. Here, for example, it could be construed that RD's statement reduces Beethoven's greatness to the mere product of his chance combination of genes. That would be to ignore his psychological makeup, and how his character was shaped by the environment in which he existed, and his reaction to it. How could anyone who has read the Heiligenstadt Testament believe that Beethoven was a product solely of his genetic makeup?

Finally, since we are comparing (to some extent) scientific and artistic (in the broadest sense) creativity, let me make a statement to which some might take umbrage. In media interviews, I am often asked questions along the lines of whether scientists view themselves as similarly creative as artists. My answer is that they might well do, but that belief is (in my opinion) misfounded: because artists create something new (Beethoven's 7th did not exist before it entered his head), whereas "all" that scientists do is to reveal the secrets of the universe. Like solving a crossword puzzle, whether you believe that some deity constructed that puzzle (and put in some damned difficult clues) or not. It is a different pursuit, needing a form of imagination and creativity; but I don't think that it is the same type of creativity as that displayed by an "artist."

Having written that, I note that the pursuit of the Third Culture is not in itself artistic creation, but it does have a foot on the "other side": there is no intrinsically higher value to a piece of creative writing about art or life compared to one about science.

My thanks to Richard Dawkins for his fine lecture.

Kind regards,

Duncan Steel


Post: 6 Submitted: 12-29-96
From: Stanislas DeHaene

I thoroughly enjoyed reading Richard Dawkins's essay on your new website. Not only did it give me an "appetite for wonder", but also a craving thirst for more such exquisitely accessible presentations of scientific matters — I believe I'll be making frequent visits to your URL! It is a great honor to see my name mentioned amidst the prestigious figures of science that you are gathering at this site. I don't know if my own research would "thrill Aristotle to the core of his being", but I do appreciate the opportunity of giving it a try — do you plan to have him send comments from elysium.com???

I was thinking, do you know if Dawkins's essay has been translated into French? If not, maybe I could suggest it for publication in the journal La Recherche, which is the rising star of scientific publications for the general public nowadays in France (the French equivalent of Scientific American). What do you think?

My book will be out on January 17th in France. It already generates quite a bit of excitement in the press... and quite a bit of anxiety for the author! I'll let you know how it all comes out.


Post: 7 Submitted: 1-3-97
From: Joseph Ledoux

Richard Dawkins is a wonderful spokesperson for science. Not having had the opportunity to hear his BBC lecture, I find it hard to know what the impact was or might be. The written text though is an upbeat infomercial for why science is important. But these comments (mine) are being directed to the wrong people. Presumably most people reading this are themselves scientists or are strong sympathizers. For this reason, like most of the others who have commented on the talk, I can only find small points to pick on.

The small point I want to pick on has to do with the war between scientists and social relativists. My wife is an art critic. When we first met in the early 1980s, we found that we read completely different philosophers. I peered into some of the post-structuralist material she was reading, but found it mostly unreadable. In other words, I had the typical knee-jerk scientific reaction to it. In the intervening years, it seems that the tensions between social relativism and science have increased. I have to admit that I've not gone very deep into poststructural theory or other aspects of contemporary social relativism, but feel that there might well be something of value in it. For example, the notion that words are defined by their relation to other words (rather than to physical objects) is strikingly similar to the functionalist position in cognitive science that mental states are defined by their relation to other mental states (rather than their relation to the brain). My personal take on this is that there must be a neural coding of mental states so that the "language of the mind" is ultimately the language of the brain. This makes me a hopeless materialist, but one who accepts that some aspects of mind and behavior might be functionally or even socially determined by processes in and between brains. In other words, I believe that materialism accounts for functionalism and socio-cultural relativism (even if it doesn't yet explain them). Actually, most functionalists are themselves materialists in their own way, but I doubt many social relativists are. It seems to me that we scientists should try to see what the relativists have to say. I'm as guilty as anyone of ignoring them, but feel that it wouldn't hurt to look a little closer.

At the same time, I appreciate why Dawkins takes relativists to task in his lecture and why John Brockman does in The Third Culture. In dogmatically dismissing absolutes, relativists leave scientists little choice but to either dismiss them or attack back. But this, I think, may lead us too easily down the path of throwing out the baby with the bath water. We aren't yet ready for "cultural neuroscience" but we may someday be. Certainly we should be open to what others have to say, even if they are fundamentally against what we do. We don't want to train young people to ignore and even reject the contributions of any field, even if the contributions are at this point remote.

I'm not saying that Dawkins is doing any dismissing of anyone. I'm just saying that we need to be able to distinguish our distaste for the relativists' antipathy for science from their intellectual contributions.


Post: 8 Submitted: 1-5-97
From: Margie Profet

Cheers to Dawkins for his wonderful lecture-I liked all of it. The quotes from those anti-science journalists baffle me. I wonder: are those people proud of themselves for mocking the inventiveness of others, for reducing the tremendous benefits of science and technology to "putting stripes in toothpaste"? The first thing to come to mind when I read these quotes was the bold response of Darwin's "Bulldog" to Wilberforce's mockery of Darwin's brilliant theory of natural selection (wish I had the quote in front of me). Cheers also to the person who tallied up the number of quarks eaten per day by one of those silly slanderers of science.

In the US, it seems that anti-science attitudes usually take the form of "beware the evil scientist, the Dr. Frankenstein"-the unfortunate message in such movies as "Jurassic Park." There's a distrust and fear of science-I think most people realize that knowledge is power and that they're lacking in it when it comes to science. The message in these kinds of movies that "we're overstepping our bounds in trying to tame nature" never seems to deliver the answer to the question "Well, who put those bounds there, and how do you know what they are?"


Post: 9 Submitted: 1-6-97
From: Paul Davies

I greatly enjoyed reading Richard Dawkins' lecture, and I agree with almost all of it. Indeed, I have often expressed similar sentiments myself, especially in relation to the UK anti-science brigade, who were partly responsible for my decision to quit Britain and live in Australia. A couple of points I would like to endorse. Yes, science is a victim of its own success. Because it is so good at driving technology and generating wealth, its worth tends to be assessed in purely utilitarian terms. Yet science is also a cultural activity of the deepest significance. Martin Rees once made the point that Darwin's theory of evolution doesn't have many commercial applications, but few would deny its greatness or its significance. (Actually, it may have applications in modern biotechnology). It is good that we should know where we have come from and what our place is in nature.

I also liked Richard's point that in science there is "mystery but not magic". This is such an important point. Science demystifies the universe, but reveals something far more elegant and awesome than the crude antics of a cosmic magician. Finally, what are we to do about the rise and rise of pseudo-science and paranormal claptrap, without being accused of a scientistic conspiracy? Here is a drastic suggestion: If I set up a stall in downtown Adelaide and sell bottles of tap water for $100 each with the claim that they will cure baldness, keep the swimming pool free of algae and remove carpet stains, I will be jailed for fraud. If I claim to be able to bend metal without touching it, read your future, or call up your spirit guide (for a suitable fee) I am left alone by the authorities. Is this right? A London physicist I know once refused to pay his local taxes until the Council removed astrology from the list of evening classes. Perhaps peddlars of paranormal piddle should be compelled to attach a Government warning of the sort that cigarette manufacturers and investment fund managers use, along the lines of: "Scientific tests have consistently failed to provide any positive evidence that the claims made herein can be substantiated".

Best wishes,

Paul


Post: 10 Submitted: 1-6-97
From: Robert Shapiro

Richard Dawkins' talk provides a brilliant beginning. I enjoyed the penetrating way in which he presented the scientific world view and demolished the fatuous criticisms of science by several of its critics. Despite the brilliant efforts of Dawkins and other spokesmen such as the late Carl Sagan, the problem remains: students in the United States are shunning the study of science, and the media continues to celebrate the paranormal and "New Age" thinking. We need to discuss what new things might be done to ensure the survival and prosperity of science in our culture.

Best Regards,

Robert Shapiro


Post: 11 Submitted: 1-10-97
From: Carl Djerassi

In his recent comments, Dawkins speaks briefly about science fiction and its possible use in transmitting scientific ideas to a general public.

May I call to his and your attention the (to me) much more relevant genre of "science-in-fiction" which is rarely used but ought to be propagated much more widely. I myself have been working on a tetralogy of novels in that genre. Two of these volumes have already appeared as Penguin-USA paperbacks ("Cantor's Dilemma" and "The Bourbaki Gambit"-the latter describing the invention of PCR); the third ("Menachem's Seed") will be out later this year in hardback, and the entire series will be available first in German translation by the time of the 1997 Frankfurt Book Fair with the publication of the final volume ("No") which utilizes the recently discovered multiple biological functions of nitric oxide (e.g. in penile erection) to describe the role of "biotech" companies.

May I refer you to my web page (http://www.Djerassi.Com) for more elaboration on that topic.

Carl Djerassi
Stanford University



"Spare Me Your Memes"

Jaron Lanier debates Charles Simonyi and Mike Godwin on the concept and value of Memes


Current number of posts: 13

Post: 1 Submitted: 12-27-96
From: Mike Godwin

Dawkins's powerfully explanatory notion of memes seemed to me at first to have been almost casually tossed off in a larger discussion of the dynamics of genetic evolution. Only later did I realize he'd given us a paradigm for understanding how ideas work in cultures, in mass media, and in the growth of knowledge.

It's also a paradigm that gives free-speech advocates some serious social questions to think about. Dawkins's concept of the meme — that discrete thought that propagates itself, sometimes virulently, through minds and cultures — forces us to abandon any defense of free speech based on the principle that "words can never hurt you." (Hint: they can hurt you.) Instead, we must defend freedom of expression even though it sometimes allows the spread of *harmful* ideas, because freedom is the only environment that consistently promotes the discovery or creation of the *beneficial* ones.

Together with Karl Popper and Gregory Bateson, whose thinking complements his, Dawkins has done much to shape how I think about the world. He's one scientist who reminds us why we used to call scientists "natural philosophers."


Post: 2 Submitted: 12-27-96
From: Jaron Lanier

To: Mike Godwin

Hey there Mike,

I just debated Richard Dawkins (it'll appear in Psychology Today, of all places). I'm no fan of memes, though I like Richard, and enjoy other aspects of his thinking. Here's a small part of an article I'm working on that concerns memes and many other ways that evolution is applied outside of genetics.

All the best,

Jaron

Spare me your memes

Biological evolution is a theory that explains the remarkable, creative long term effects of massive numbers of untimely (pre-reproductive) deaths, but it is somewhat immune to variations in the sources of genetic variation from which death culls. The current controversies between scientists studying evolution underline this point. Variation might take place without boundaries or favor, as Dawkins seems to suggest, or might be subject to mathematically predetermined paths, as biologists like Kauffman and Goodwin have proposed. In either case, evolution proceeds through the mechanism of violence. That the theory of evolution can survive these unresolved controversies shows that it is really the culling and not the sowing that is the key mechanism.

The relative indifference of evolution to the source of variation makes it a poor metaphor for understanding creativity that takes place under the protection of civilization. That is one reason why the idea of the "meme" is misleading. The meme concept, first proposed by Richard Dawkins, is sometimes used to explain how ideas change, but also sometimes as an ideal for how ideas should change. Dennett, in "Darwin's Dangerous Idea" speaks of wishing to extinguish a meme that had infected the physicist Roger Penrose as if it were a freakish individual that should be subject to a eugenics campaign. If it weren't for the romance of evolution, "Memes" would just be a fancy way of pointing out that non-rigorous ideas are often subject to a popularity contest. One danger, however, in the meme idea is an equation of creativity with mental eugenics.

There are so many other things wrong with memes that it's hard to list them succinctly. Equating ideas and genes revives all the worst old wrong ideas about genetics. Ideas do everything genes can't. They can change and affect each other without any concern for species boundaries. They can pass along traits acquired during their "lifespans"; they don't have to wait for some sub-strata of genetic material to be selected for. The long-resolved struggle against these mistaken ideas about genes has been irritated into existence again by a stupid metaphor. It is as if Darwin had never existed.

The notion of memes is an affront to the idea that some ideas can be better than others. Ideas can be rigorous, so the notion of improvement has meaning. Genes, on the other hand, don't improve; they just adapt to local circumstance. And that adaptation is entirely non-intentional and so slow that we learn about it largely from fossils. Many kinds of ideas, on the other hand, can be definitively improved, and this can be done methodically and cumulatively, leading to exponential rates of change. People used to believe God thought the world into existence in just this way, in six days. Darwin's central insight was that genes are not like ideas.

Within civilization, nonetheless, are found pseudo-evolutionary processes, like business and the academic career track, in which competition is harnessed to produce excellence. These should not be understood to be true examples of evolution, though, because the genes of the losers are still passed on without diminution. Even their "memes" are passed on, for those who insist on subscribing to the concept. That is what defines a civilization. If civilization worked like evolution, it would be perfectly ordinary to burn library books that had not been read for a long time. In the real world, when libraries burn, civilizations crumble. Marxism provides a recent example. Ideas are only like memes at the moment when they are extinguished, as happened in the library at Alexandria, or, as might have happened had he been successful, in Hitler's bonfires.


Post: 3 Submitted: 12-27-96
From: Mike Godwin

To: Jaron Lanier

Jaron,

As you might expect, I disagree with a number of your arguments. Rather than express my disagreements in great detail, I'll just note some of them here, in a way that perhaps will help you as you further refine your side of the argument. Or perhaps not. It's late.

Biological evolution is a theory that explains the remarkable, creative long term effects of massive numbers of untimely (pre-reproductive) deaths, but it is somewhat immune to variations in the sources of genetic variation from which death culls.

If I understand you correctly here, you're saying that the power of evolutionary theory does not depend on any particular theory as to the source of variation. On that point I agree with you.

So would Karl Popper, I think, were he here to respond to your comment. Popper says something very similar about scientific theories — which might also be called (very loosely) "scientific memes" — in his book CONJECTURES AND REFUTATIONS and elsewhere. In his explanation of the growth of scientific knowledge, Popper expressly notes that the *origin* of a theory is irrelevant — what matters instead is its testability (aka "falsifiability"), which is the indicator of its potential to give us greater knowledge about the world. For example, Kekule's hypothesis about the ringed structure of the benzene molecule originated from a *dream* about a snake eating its tail. But this fact tells us nothing about the value of the theory, which can only be established empirically.

Thus, dreams, which are arguably the most unstructured and disordered thinking that we ever do, nevertheless can be a source of "variation" as to hypotheses, and ultimately a guidepost to greater knowledge. Yet even if psychologists were to disagree violently about the relative importance of dreams as a source of "variation" (read "new ideas"), it would not follow from this disagreement that variation itself is relatively unimportant to the growth of knowledge and culture.

Variation might take place without boundaries or favor, as Dawkins seems to suggest, or might be subject to mathematically predetermined paths, as biologists like Kaufman and Goodwin have proposed. In either case, evolution proceeds, through the mechanism of violence. That the theory of evolution can survive these unresolved controversies shows that it is really the culling and not the sowing that is the key mechanism.

I do not believe you have established a syllogism here. I don't see how the robustness of evolutionary theory in the absence of consensus about the sources of genetic variation entails your conclusion that "culling" is more important than "sowing." Both are necessary conditions for Darwin's "origin of species." In fact, Darwin expressly acknowledged that variation was a necessary part of his theory, even though he could articulate no theory as to the source of that variation.

The meme concept, first proposed by Richard Dawkins, is sometimes used to explain how ideas change, but also sometimes as an ideal for how ideas should change.

I think it's unclear to say that "memes" are a notion about "how ideas change." Better to say that they're a notion about how ideas compete with one another, substitute for one another, etc. (And if "compete" is too teleological, substitute the verb "interact.") Remember, Dawkins wants us to consider genes as basic units of evolutionary action.

Dennett, in "Darwin's Dangerous Idea" speaks of wishing to extinguish a meme that had infected the physicist Roger Penrose as if it were a freakish individual that should be subject to a eugenics campaign. If it weren't for the romance of evolution, "Memes" would just be a fancy way of pointing out that non-rigorous ideas are often subject to a popularity contest. One danger, however, in the meme idea is an equation of creativity with mental eugenics.

Without going into detail, let me say merely that, in my own experience, thinking about harmful ideas as "bad memes" has been extremely productive for me.

Equating ideas and genes revives all the worst old wrong ideas about genetics.

I think your use of "equating" unfairly dispenses with some of Dawkins's nuance.

Ideas do everything genes can't. They can change and affect each other without any concern for species boundaries. They can pass along traits acquired during their "lifespans"; they don't have to wait for some sub-strata of genetic material to be selected for. The long-resolved struggle against these mistaken ideas about genes has been irritated into existence again by a stupid metaphor. It is as if Darwin had never existed.

It may be that my understanding of genetics has faded since I studied it formally, but much of what you say here about ideas strikes me as self-evidently true about genes as well.

For one thing, it's not just somatic cells that mutate, but gametic cells as well, and the latter can pass on their mutations (often but not always deleterious changes). For another, don't ideas require "substrata" as much as genes do? Like paper, for example, or air (to transmit sound waves), or a brain?

The notion of memes is an affront to the idea that some ideas can be better than others.

It seems to me to _reinforce_ this very idea. Even we meme-lovers still regard some genes as more harmful than others — harmful either to an organism or to its offspring. Nor does any dispassionate discussion of the dissemination of a meme (a racist meme, say) require that we abandon our opposition to that meme. Compare: Does the fact that an epidemiologist can study an epidemic's growth cycle dispassionately entail her abandoning her belief that dying of an infectious disease is a bad thing?

Nothing in Dawkins' metaphor requires us (either as moral actors or as knowledge builders) to think of all ideas as being of equal value *when we are engaged in the process of assessing value*. But the "meme" concept is about understanding the dynamic of the spread of thoughts - that's where its power as a metaphor lies. And the "meme" notion gives us a way to understand the dynamics of the propagation of ideas that is not clouded by our own assessment of those ideas. In short, thinking about memes allows some of us to see the process more clearly.

Ideas can be rigorous, so the notion of improvement has meaning. Genes, on the other hand, don't improve; they just adapt to local circumstance.

I believe this is both incorrect and a category mistake. Strictly speaking, genes *can* improve (the rare beneficial mutation, for example), and it is not genes but _genotypes_ that adapt.

And that adaptation is entirely non-intentional and so slow that we learn about it largely from fossils.

No problem with your "non-intentional" here, but any bacteriologist, I imagine, can give you what amount to eye-witness accounts of evolutionary adaptation in action. That's one of the nice things about studying the genetics of organisms with short life cycles.

Many kinds of ideas, on the other hand, can be definitively improved, and this can be done methodically and cumulatively, leading to exponential rates of change. People used to believe God thought the world into existence in just this way, in six days. Darwin's central insight was that genes are not like ideas.

I don't recall his saying this. I do recall his recognition that variation is a prerequisite for natural selection. Which to me entails the conclusion that genes are not invariant after all.

Within civilization, nonetheless, are found pseudo-evolutionary processes, like business and the academic career track, in which competition is harnessed to produce excellence.

One can sidestep the road to social Darwinism and still believe that if "pseudo-evolutionary processes" quack just like evolutionary ones, waddle like them, swim and fly like them, why, then we can duck the use of "pseudo" altogether.

These should not be understood to be true examples of evolution, though, because the genes of the losers are still passed on without diminution.

Jaron, I'm not sure I understand your point here, since each of us — self-evidently the product of our forebears' survival to reproductive age — nevertheless carries in his or her genotype lots of "loser" genes. Unless an allele is lethal to the organism prior to the organism's self-reproduction, the Hardy-Weinberg paradigm more or less still applies, and gene frequencies — even for ultimately harmful genes! — in a large population don't change much. (A study of sickle-cell anemia is instructive on this point.)
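[To make the Hardy-Weinberg point concrete, here is a minimal sketch, added as an editorial illustration and not part of the original exchange. It assumes the textbook conditions: one locus with two alleles, random mating, no selection, and an effectively infinite population. The allele names and the starting frequency of 0.3 are purely hypothetical.]

# Hardy-Weinberg illustration: allele frequencies do not change under
# random mating alone, so a "loser" allele persists as long as it is not
# lethal before reproduction. Assumes one locus, two alleles (A and a),
# no selection, and no drift (effectively infinite population).

def next_generation(p):
    """Return the frequency of allele A after one round of random mating."""
    q = 1.0 - p
    freq_AA = p * p        # genotype frequencies follow p^2, 2pq, q^2
    freq_Aa = 2.0 * p * q
    # AA parents contribute only A gametes; Aa parents contribute A half the time.
    return freq_AA + 0.5 * freq_Aa

p = 0.3  # hypothetical starting frequency of allele A
for generation in range(5):
    print(f"generation {generation}: freq(A) = {p:.3f}")
    p = next_generation(p)

[Under those assumptions the printed frequency stays at 0.300 in every generation; only selection, drift in a small population, or similar forces would move it, which is the sense in which gene frequencies "don't change much."]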

Commonly it's at the phenotype level that we decide which individuals are "losers" in a particular evolutionary context; other individuals who carry the same undesirable allele may well qualify as "winners" in Darwinian terms (they last long enough to reproduce) because their overall phenotype neutralized or minimized the "loser" effect of that allele. Me, I take Dawkins's argument in THE SELFISH GENE to be in part about transcending this phenotype-centric "winner/loser" perspective.

I agree of course that one must not *glibly equate* genes and memes. While I still like the notion, I also concede there are countless ways in which this metaphor falls short in representing reality. Yet isn't this a trivial criticism, given that *all* metaphors — being comparisons of things that are alike yet also different — are necessarily "false" to some degree?

This irreducible falsehood of metaphors shouldn't bother us much — metaphors are meant to be used as tools, not as truths. And if the tool doesn't work for you, you can abandon it without concluding that it doesn't work for anyone else, either.

Even their "memes' are passed on, for those who insist on subscribing to the concept. That is what defines a civilization. If civilization worked like evolution, it would be perfectly ordinary to burn library books that had not been read for a long time.

As Nicholson Baker has documented, this is in fact perfectly ordinary.

In the real world, when libraries burn, civilizations crumble.

If only this were true. Then book-burning civilizations would invariably die with greater frequency than book-loving ones. But so far as I can tell, all civilizations, including the most literate ones we know of, end up dying, regardless of how nicely they treat their books.


Post: 4 Submitted: 12-27-96
From: Jaron Lanier

To: Mike Godwin

Hello there again,

We do agree on plenty of things. I love Popper's insights on scientific method as much as you do. Alas, no one has yet done such clear work as Popper's to help us choose our metaphors. In examining my criteria for them, and why memes annoy me so, I can propose a starting place: A metaphor ought to inform more than it confuses. Furthermore, it shouldn't unwittingly undermine other notions that one wishes to keep in one's head.

I originally started to dislike memes when I heard students talking about real genes in Lamarckian terms. It turns out they had worked backwards from memes, assuming that ideas must be a reasonable metaphor for genetics in some way. I had to set them straight on that. That set me to wondering if the metaphor worked any better in the forward direction. Since it's very very hard to falsify ideas about ideas, we have to be extra careful about our metaphors for them.

And the "meme" notion gives us us a way to understand the dynamics of the propagation of ideas that is not clouded by our own assessment of those ideas. In short, thinking about memes allows some of us to see the process more clearly.

This I cannot accept. You're making a claim here that you're seeing a process that actually happens, and that you can see it more clearly with the metaphor in mind. First, I worry about the notion of someone becoming a dispassionate observer of ideas, without assessing them. I'm not sure that's possible, and that's a primary problem with the meme metaphor. Can you identify an idea by superficial features, like you can identify an organism? Is it possible to identify an idea without internalizing it? The example I cited in Dennett is not the only one I've seen in which the meme metaphor serves as a tool to help the bearer become somewhat cynical and distanced from the ideas of others.

But I also wonder what process the metaphor of memes can help you observe. Where is the genetic material for an idea?

For another, don't ideas require "substrata" as much as genes do? Like paper, for example, or air (to transmit sound waves), or a brain?

You suggest paper and air, but those aren't linked to specific ideas in the way that a particular set of genes are linked to a particular organism. Maybe the metaphor could be lined up in different ways; to the genotype, or wherever, or maybe the idea is like the gene and a behavioral action is like an organism. I've tried to find a way to make the metaphor work! No matter how I try, I can't find a reducible sub-strata in the life of ideas to hang on it. If the meme metaphor informs, it should be possible to name this sub-strata. Can you name it?

Ideas can be rigorous, so the notion of improvement has meaning. Genes, on the other hand, don't improve; they just adapt to local circumstance.

I believe this is both incorrect and a category mistake. Strictly speaking, genes *can* improve (the rare beneficial mutation, for example)

In this case I think you are being confused by putting the meme metaphor into reverse gear, like my Lamarckian students. Surely adaptation is only local, while a mathematical theorem is global. A scientific idea, once falsified, is permanently falsified, while a vanished genetic feature might someday reappear if local circumstances change to once again favor it.

And that adaptation is entirely non-intentional and so slow that we learn about it largely from fossils.

No problem with your "non-intentional" here, but any bacteriologist, I imagine, can give you what amount to eye-witness accounts of evolutionary adaptation in action. That's one of the nice things about studying the genetics of organisms with short life cycles.

You're right on this point. What I meant to say is that the genetic rate of change is far slower than the pace of events in the life of an organism. If the meme metaphor informs, once the "genetic" sub-strata has been named, it ought to change very slowly, relative to the pace of discourse. Or if the metaphor should be lined up differently, and the ideas are the genes, there ought to be a faster moving "organism" equivalent that speeds past our ideas.

Evolution is an evil thing. All your genetic features are the result of the pre-reproductive deaths of your would-be ancestors. They were killed in cold blood by your real ancestors, or by micro-organisms, or by cold or hunger. Your features weren't decided by a nice process. If we really want to understand human discourse by making a metaphor with the heart of cruelty, we ought to have a good reason.

I'm not saying the meme metaphor never works at all. When the last copy of a book concerning non-rigorous ideas is destroyed, I think the metaphor might start to work a bit. You could say the book is like genetic material, slower moving than discourse, with discourse being the organism, and that future discourse on related non-rigorous ideas is shaped a bit by the book's absence. While this does happen, the meme metaphor is most popular in the sciences, where it doesn't fit.

For what it's worth, when I presented my arguments to Dawkins, he agreed with them, and said he thought "memes" had been taken too far. You can read what he says about this in his own words in the Psych Today piece, when it comes out.


Post: 5 Submitted: 12-27-96
From: Mike Godwin

To: Jaron Lanier

In examining my criteria for them, and why memes annoy me so, I can propose a starting place: A metaphor ought to inform more than it confuses.

Well, perhaps it says something that I disagree with your "starting place" premise. I'm not sure I can say with precision what it is that metaphors do when they aid in understanding, but I don't think "inform" is the right verb. As I said previously, metaphors are tools, not truths. Kind of like what (as I recall) Wittgenstein said the Tractatus should be considered as — a sort of ladder to the next level that you can throw away once you're up there.

I originally started to dislike memes when I heard students talking about real genes in Lamarckian terms.

If undergraduate misuse of newly acquired notions is all it takes to generate your initial dislike of those notions, I begin to shudder at the implications. (This is a joke.)

Since it's very very hard to falsify ideas about ideas, we have to be extra careful about our metaphors for them.

I'm inclined to say that Dawkins's "meme" notion is simply a metaphor and not a scientific theory. A very powerful metaphor, true, and perhaps even a harmful one. But not something whose unfalsifiability I'd normally worry much about.

And the "meme" notion gives us us a way to understand the dynamics of the propagation of ideas that is not clouded by our own assessment of those ideas. In short, thinking about memes allows some of us to see the process more clearly.

This I cannot accept. You're making a claim here that you're seeing a process that actually happens, and that you can see it more clearly with the metaphor in mind.

The problem is less my proposition, I think, than it is my poor usage. Rather than "see the process more clearly" (a phrase that connotes actual observation), I should have written something like "think about the process more clearly."

You may still disagree with the amended claim, but I don't mean for it to be taken as a claim about observations.

First, I worry about the notion of someone becoming a dispassionate observer of ideas, without assessing them.

I believe this is a false dichotomy, since (in my view) one can be a *passionate* observer of ideas (and of other human creations) without imposing a value system upon them. Some of my anthropologist friends, for example, seem to me to be doing just this.

Can you identify an idea by superficial features, like you can identify an organism?

I'm not sure what you're getting at with "superficial" here, but I do think ideas can be classified by clearly discernable features. For example, I believe this is what Popper does with his science/nonscience demarcation criterion.

Is it possible to identify an idea without internalizing it?

I think so. For example, I believe I can identify a Marxist proposition without adopting it.

You suggest paper and air, but those aren't linked to specific ideas in the way that a particular set of genes are linked to a particular organism.

When you used the word "substrate," I found myself thinking of nucleic acids, which, of course, are no more specific to a particular gene than paper is specific to a particular idea. I'm still not sure I follow your reasoning here.

I believe this is both incorrect and a category mistake. Strictly speaking, genes *can* improve (the rare beneficial mutation, for example)

In this case I think you are being confused by putting the meme metaphor into reverse gear, like my Lamarckian students. Surely adaptation is only local, while a mathematical theorem is global.

Actually, your response suggests a rather different confusion. I don't believe "local" and "global" are terms that represent objective reality.

Popper might have said that a mathematical theorem actually *is* "local" — it is located in what Popper calls World 3 (the shared domain of human ideas) and it is *not* located under my bed.

I don't think your local/global distinction is helpful, but you may be reaching for something like the a priori/a posteriori distinction. In any case, once again I have trouble following you.

A scientific idea, once falsified, is permanently falsified, while a vanished genetic feature might someday reappear if local circumstances change to once again favor it.

Popper would say that falsified scientific theories remain in World 3. (They're just reshelved in the "falsified" section.)

I was taught that vanished genetic features *never* simply reappear. E.g., the mammalian species that returns to the sea does not grow scales, even though its long-ago forebears may have had them. Instead, it develops analogous structures or perhaps even arrives at a wholly different solution to the adaptation problem.

You're right on this point. What I meant to say is that the genetic rate of change is far slower than the pace of events in the life of an organism.

This is absolutely right, IMHO, and, incidentally, one of the implications of the Hardy-Weinberg equation (or so it seems to me).

If the meme metaphor informs....

Again, I'm uncomfortable with the assumption that metaphors "inform."

Evolution is an evil thing. All your genetic features are the result of the pre-reproductive deaths of your would-be ancestors. They were killed in cold blood by your real ancestors, or by micro-organisms, or by cold or hunger.

Some of them were just too lazy to fuck, Jaron.

I'm not saying the meme metaphor never works at all.

The science of metaphors is never a precise one, I'm thinking.

For what it's worth, when I presented my arguments to Dawkins, he agreed with them, and said he thought "memes" had been taken too far.

Although I disagree with some of what you say, I certainly agree with you and Dawkins (and Danny Hillis) that the notion has been taken too far.

Of course, when my book comes out this spring, you may find that its prolix discussions of memes and media damn me as another culprit in the current meme overload. I'm wincing in anticipation.


Post: 6 Submitted: 12-27-96
From: Jaron Lanier

To: Mike Godwin

Hello again, Mike,

You've written so much in this exchange, but not said anything specific about what memes do for you. What do they mean, what ladder do they help you throw away? I accept that a metaphor can be imprecise, but there's got to be something to hang your hat on. Isn't it reasonable for me to ask, What exactly is the genetic material in culture or ideas that the meme metaphor refers to? If the metaphor is so imprecise as to make even that question out-of-bounds, what can there possibly be to talk about?

One moment you place memes in the robust company of Popper and Bateson, and the next they are but a wispy and receding bit of intuitive poetry. If you're passionate enough to write so much to me thus far, please show me how you can use memes for something. You've taken so much time to complain about my choices in terminology. If you wish to set the standards of our discussion such that "global" isn't Popperian enough, you really ought to also make room for a few sentences in which you show your cards. Surely you knew ideas could be dangerous before memes came along. Is that all they've done for you?

I could tell you other reasons I don't like memes. The meme idea seems to suggest that information science can provide a more fundamental perspective on culture than other approaches, and I don't think it can.

But of course you'll tell me I've misused the word "culture" and misstated the meaning of memes. Fine. Come out of the shadows and show me something that survives in the sunlight next to the ideas of Bateson or Popper.

Burroughs said, "Language is a virus." Isn't that a better biological metaphor for culture? You can tell right away it's a metaphor, for one thing. What's so special about memes? The word "meme" sounds technical while actually being entirely imprecise. The design of the term "meme," and the way it's held in reverence by aficionados, does seem to suggest it's a "big idea." The appeal might have something to do with the desire to say as much as possible with existing ideas. Since we already have Darwin, it's tempting to make everything Darwinian. Since the computer is such a dominant metaphor these days, it's tempting to make everything algorithmic. There's also a "campus imperialism" where opposing disciplines all seek to appear more fundamental. Memes help nerds feel more fundamental than humanities types.

The reason I have so much energy to write back to you is that I have felt persistently unsatisfied with an "infocentric" world view that seems to be prevalent in computer circles. This is related to my arguments against AI interfaces. I worry that we're changing our ideas about ourselves, and maybe even changing ourselves, in order that we can be more easily represented by information technology as we conceive it. Yes, I can already hear you saying you don't understand this last sentence (even though I'm making an argument that I think qualifies as "Batesonian"), so rest assured that YOU'LL also be able to cringe at a book of mine, a humanist tract that will someday, after heroic procrastinations, elaborate such statements in great detail.


Post: 7 Submitted: 12-27-96
From: Mike Godwin

To: Jaron Lanier

Only the thought that at least I can be certain of the existence of a "virulent meme" (the bad idea that propagates itself from mind to mind via a communications medium or culture), since Jaron clearly regards the "meme meme" as an idea of that stripe.

Enjoy your holidays.

Mike


Post: 8 Submitted: 12-27-96
From: Jaron Lanier

To: Mike Godwin

Language as a virus, that's the Burroughs metaphor. Since you've sent me this "I gotcha" message twice, I have to say memes aren't memes especially if they are "virulent". Genes per se aren't "virulent" (except maybe a little bit in the very rare "horizontal transfers"). The virus metaphor I can understand, but a virus-like gene? That's an idea?

Best,

Jaron


Post: 9 Submitted: 12-27-96
From: Mike Godwin

To: Jaron Lanier

Language as a virus, that's the Burroughs metaphor.

You've misunderstood. Burroughs is talking about *all of language*. The notion of a single idea acting virally is a wholly separate and distinct metaphor.

Since you've sent me this "I gotcha" message twice, I have to say memes aren't memes especially if they are "virulent". Genes per se aren't "virulent" (except maybe a little bit in the very rare "horizontal transfers"). The virus metaphor I can understand, but a virus-like gene? That's an idea?

It's an idea I picked up in my genetics courses, in fact - specifically, the idea that the difference between a virus and a gene isn't particularly great.

Of course genes per se aren't virulent. If they were, there'd be no point in adding the descriptor "virulent."

Mike


Post: 10 Submitted: 12-27-96
From: Mike Godwin

To: Jaron Lanier

Mike Godwin wrote something ineptly:

Since you've sent me this "I gotcha" message twice, I have to say memes aren't memes especially if they are "virulent". Genes per se aren't "virulent" (except maybe a little bit in the very rare "horizontal transfers"). The virus metaphor I can understand, but a virus-like gene? That's an idea?

It's an idea I picked up in my genetics courses, in fact - specifically, the idea that the difference between a virus and a gene isn't particularly great.

I cannot believe my sentence slipped out in this form. (My deadlines loom and I'm not playing at the top of my game here.) Obviously, the comparison should be between a virus and *a collection of related genes* - every virus I know of carries more self-reproducing nucleic-acid content than that of a single gene. That's what I meant to say.

I will grant that the fact that genes tend to travel in groups, as has been well established in molecular biology, undercuts Dawkins's insistence that the gene is (or ought to be) the central element in understanding evolutionary processes. Given that fact, Dawkins's use of "gene" is itself a sort of metaphor for "various units or groups of genetic material." Actually, it's another kind of trope. Metonymy, maybe?

But group-travelling genes don't undercut the larger comparison between the propagation of social information and the propagation of genetic information.

Mike


Post: 11 Submitted: 1-6-97
From: Jaron Lanier

To: Mike Godwin

But group-travelling genes don't undercut the larger comparison between the propagation of social information and the propagation of genetic information.

I've still never heard the genetic material in memes identified. Or any other component of the metaphor. I think memes are really "campus imperialism" as I said before, but without any blanks filled in. I have "caught" viruses from others, as well as ideas. My genes, however, were merely left over after the brutal culling of evolution. Viruses and genes have a strong material overlap but a weak functional overlap.

Best,

Jaron


Post: 12 Submitted: 1-7-97
From: Charles Simonyi

To: Jaron Lanier

Dear Jaron,

While I agree with your points on the differences between memes and genes, nonetheless I think memes are very useful, because I interpret them more narrowly than you (and most people) do. To me, memes are those ideas which are spread non-intentionally, that is, they are misunderstood or not understood at all while being spread. Just because humans are conscious, it does not mean that they do everything consciously, or that all of their behavior is under conscious control, certainly not riding a bicycle (their bicycle-riding skill), and maybe not even being a good politician. It is exactly in these unintentional situations where your objections do not hold and the parallels of memes with genes and Darwinian evolution become more useful.

So politicians may not even realize that their ambiguous statements work because they are misunderstood (e.g. "I want change"), but the meme would spread because it works. Yes, I am that naive that I could believe that they do not realize it, rather than that they are all evil geniuses who plan it that way. Now if a scientist or advertising person comes and asks "why does this work and how can we improve it?" we have directed evolution, as in breeding or biotech, rather than Darwinian selection, but there still remains some valid parallel with the biological world, namely the interjection of conscious human intention in both instances.

To reiterate, I think your statements are fine under a general interpretation of meme, but a better response is to narrow the definition of meme to the more exact parallel, where the Darwinian lessons work and can give new insight into human individual and mass behavior.

Have a meme new year,

Charles


Post: 13 Submitted: 1-8-97
From: Jaron Lanier

To: Charles Simonyi

Ironically, the connecting of memes to the non-conscious could be read as being quite compatible with some of the recent fancy French "semiotics" writings, which derive more from Freud and Jung than from Darwin. It still seems to me that the closer biological metaphor for ideas, especially unconscious ones, would be with viruses ("iruses", "idearuses", "virideas"?). I argued this point to Mike Godwin, who thought I was making too much of a distinction between genes and viruses. I responded: Genes per se aren't "virulent" (except maybe a little bit in the very rare "horizontal transfers"). Viruses and genes have a strong material overlap but a weak functional overlap.

At any rate, it seems to me that memes are sometimes taken on as an almost religious banner by devotees, which is a credit to Richard's sense of poetry. I think it's important, however, to challenge ideas that are loudly adopted without careful and critical consideration.

I would disagree with your defense of Dan Dennett. Although he and I have gotten along very well when we've met and debated at conferences, I do think he goes too far in his characterizations of opponents. At times he feels to me more like an auto-immune disease than viral research. While Penrose might be wrong (and I think he IS wrong, by the way), he is certainly not arguing out of his unconscious.

What I've argued to Dan Dennett, and to Richard, is that when science is presented in a way that unnecessarily challenges people's sense of emotional, moral, or spiritual feeling, we are inviting a resurgence of creationists and other demons that should have long ago been put to rest. "Memes" and "AI" both suggest that human psychology is closer to being comprehended than it is, and we weaken ourselves when we give implicit support to exaggerated positions.

You're probably thinking to yourself, "Isn't this the guy who coined the term Virtual Reality? Who's he to point the finger for this sort of exaggeration?" Yes, that was me, and I am trying to learn from my mistakes. VR was a fantastic marketing term for a line of important research that would have been harder to fund without it. But I also had to work hard to achieve even meager progress in correcting the misunderstandings the term created in many circles, especially in pop culture.

I DO think the worldwide rise in fundamentalism has something to do with science being at the threshold of investigating intimate aspects of human identity. If the sciences can quell some of this reaction by finding a way to be more warm, humble, and clear, a great service will be performed. There are aspects of the attitude adopted by scientists that are neutral to the quality of scientific practice and discourse. It would not compromise science in the least to avoid unnecessary provocation of people's worst fears of being reduced to malleable data by super-nerds and gene hackers.



Copyright ©1997 by Edge Foundation, Inc.
