Edge 197 — November 20, 2006
(14,650 words)



BEYOND REDUCTIONISM
Reinventing The Sacred

By Stuart A. Kauffman

Two fine authors, Richard Dawkins and Daniel Dennett, have written recent books, The God Delusion and Breaking the Spell, arguing against religion. Their views are based on contemporary science. But the largest convictions of contemporary science remain based on reductionism.

I would like to begin a discussion about the first glimmerings of a new scientific world view — beyond reductionism to emergence and radical creativity in the biosphere and human world. This emerging view finds a natural scientific place for value and ethics, and places us as co-creators of the enormous web of emerging complexity that is the evolving biosphere and human economics and culture. In this scientific world view, we can ask: Is it more astonishing that a God created all that exists in six days, or that the natural processes of the creative universe have yielded galaxies, chemistry, life, agency, meaning, value, consciousness, and culture without a Creator? In my mind and heart, the overwhelming answer is that the truth as best we know it, that all arose with no Creator agent, all on its wondrous own, is so awesome and stunning that it is God enough for me and, I hope, much of humankind.

The Reality Club: Jaron Lanier

[...more below]



Saturday, November 18, 2006

I'm an atheist, BUT . . .
By Richard Dawkins

Of all the questions I fielded during the course of my recent book tour, the only ones that really depressed me were those that began "I'm an atheist, BUT . . ." What follows such an opening is nearly always unhelpful, nihilistic or – worse – suffused with a sort of exultant negativity. Notice, by the way, the distinction from another favourite genre: "I used to be an atheist, but . . ." That is one of the oldest tricks in the book, practised by, among many others, C S Lewis, Alister McGrath and Francis Collins. It is designed to gain street cred before the writer starts on about Jesus, and it is amazing how often it works. Look out for it, and be forewarned.

I've noticed five variants of I'm-an-atheist-buttery, and I'll list them in turn, in the hope that others will recognize them, be armed against them, and perhaps extend the list by contributing examples from their own experience.

1. I'm an atheist, but religion is here to stay. You think you can get rid of religion? Good luck to you! You want to get rid of religion? What planet are you living on? Religion is a fixture. Get over it!

I could bear any of these downers, if they were uttered in something approaching a tone of regret or concern. On the contrary. The tone of voice is almost always gleeful, and accompanied by a self-satisfied smirk. Anybody who opens with "I'm an atheist, BUT . . ." can be more or less guaranteed to be one of those religious fellow-travellers who, in Dan Dennett's wickedly perceptive phrase, believes in belief. They may not be religious themselves, but they love the idea that other people are religious. This brings me to my second category of naysayers.

2. I'm an atheist, but people need religion. What are you going to put in its place? How are you going to comfort the bereaved? How are you going to fill the need?

. . . Did you notice the patronizing condescension in the quotations I just listed? You and I, of course, are much too intelligent and well educated to need religion. But ordinary people, hoi polloi, the Orwellian proles, the Huxleian Deltas and Epsilon semi-morons, need religion. Well, I want to cultivate more respect for people than that. I suspect that the only reason many cling to religion is that they have been let down by our educational system and don't understand the options on offer. This is certainly true of most people who think they are creationists. They have simply not been taught the alternative. Probably the same is true of the belittling myth that people 'need' religion. On the contrary, I am tempted to say "I believe in people . . ." And this leads me to the next example. . . .

[...continue]



Friday, October 27, 2006 - Opinion

Less Faith, More Reason
By STEVEN PINKER
It is an American anachronism, I think, in an era in which the rest of the West is moving beyond it.

There is much to praise in the new Report of the Committee on General Education. It is original, thoughtful, and well-written, and reflects considerable work on the part of our colleagues on the Task Force on General Education. The entire Harvard community should be grateful for the progress they have made and the issues they have asked us to address.

...The report introduces scientific knowledge as follows: "Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment."

Well, yes, and I suppose one could say that architecture has produced both museums and gas chambers, that opera has both uplifted audiences and inspired the Nazis, and so on. It makes it sound as if the choice between science and technology on the one hand, and superstition and ignorance on the other, is a moral toss-up! Of course students should know about both the bad and good effects of technology. But this hardly seems like the best way for a great university to justify the teaching of science.

The report goes on to emphasize the relevance of science to current concerns like global warming and stem-cell research. It even mandates that courses which fulfill the Science and Technology requirement "frame this material in the context of social issues" (a stipulation that is absent from other requirements). But surely there is more to being knowledgeable in science than being able to follow the news. And surely our general science courses should aim to be more than semester-long versions of "An Inconvenient Truth."

... My second major reservation concerns the "Reason and Faith" requirement.

First, the word "faith" in this and many other contexts, is a euphemism for "religion." An egregious example is the current administration's "faith-based initiatives," so-named because it is more palatable than "religion-based initiatives." A university should not try to hide what it is studying in warm-and-fuzzy code words. Second, the juxtaposition of the two words makes it sound like "faith" and "reason" are parallel and equivalent ways of knowing, and we have to help students navigate between them. But universities are about reason, pure and simple. Faith—believing something without good reasons to do so—has no place in anything but a religious institution, and our society has no shortage of these. Imagine if we had a requirement for "Astronomy and Astrology" or "Psychology and Parapsychology." It may be true that more people are knowledgeable about astrology than about astronomy, and it may be true that astrology deserves study as a significant historical and sociological phenomenon. But it would be a terrible mistake to juxtapose it with astronomy, if only for the false appearance of symmetry. ...

[...continue]


MY GOD PROBLEM
By Natalie Angier

So, on the issue of mainstream monotheistic religions and the irrationality behind many of religion's core tenets, scientists often set aside their skewers, their snark, and their impatient demand for proof, and instead don the calming cardigan of a kiddie-show host on public television. They reassure the public that religion and science are not at odds with one another, but rather that they represent separate "magisteria," in the words of the formerly alive and even more formerly scrappy Stephen Jay Gould. Nobody is going to ask people to give up their faith, their belief in an everlasting soul accompanied by an immortal memory of every soccer game their kids won, every moment they spent playing fetch with the dog. Nobody is going to mock you for your religious beliefs. Well, we might if you base your life decisions on the advice of a Ouija board; but if you want to believe that someday you'll be seated at a celestial banquet with your long-dead father to your right and Jane Austen to your left—and that she'll want to talk to you for another hundred million years or more—that's your private reliquary, and we're not here to jimmy the lock.

[...more below]




November 19, 2006

The Galileo effect: dangerous ideas waiting to happen

A group of scientists has been given freedom to express heretical theories. Steve Farrar reports

Scientists and empirical thinkers have always generated dangerous ideas as they wrestle with evidence and theories that appear to contradict conventional wisdom and widely accepted social mores. Dawkins sees this as healthy for society. "Dangerous ideas are what has driven humanity onward, usually to the consternation of the majority in any particular age who thrive on familiarity and fear change," he says. "Yesterday's dangerous idea is today's orthodoxy and tomorrow's cliché." He adds, however, that it is patently not enough for an idea just to be dangerous. It must also be good.

It was, of course, a particularly good idea to bring this remarkable group of scientists and thinkers together. Few would have been capable of doing so. But not for nothing has Brockman been described by Dawkins as having "the most enviable address book in the English-speaking world". More than that, though, he has an insatiable hunger for ideas and intellectual debate. Back in the 1960s, when Brockman was working alongside the likes of Bob Dylan, Andy Warhol and Hunter S Thompson as an avant-garde arts promoter, he was invited regularly to dine and debate with John Cage, the composer and philosopher, and a small group of fiercely bright young artists and scientists. The experience had a profound impact on him. "Out of that I got an appreciation for almost the purity of ideas and the excitement of rubbing shoulders with people that could challenge you," he says.

When his friend, the late conceptual artist James Lee Byars, proposed getting together 100 of the world's greatest thinkers to debate with one another in a single room, Brockman shared his excitement at the prospect of an explosion of ideas. And although the project — the World Question Centre — never got off the ground, the concept lived on. Working with Heinz Pagels, the physicist, Brockman later founded the Reality Club so that top thinkers could spar with and inspire one another over dinner. In 1997 he took this informal conversation into cyberspace with the online magazine Edge. It is here that the intellectual elite that he has gathered now thrash out their often contrary views. And it is here that each year on January 1, Brockman posts the group's answers to a different, deceptively simple question. In 2005 it was: "What do you believe to be true, but cannot prove?" Last year it was: "What is your dangerous idea?"

The question was proposed by the psychologist Steven Pinker, a prominent member of the group. "I suggested to John Brockman that he devote his annual Edge question to dangerous ideas because I believe that they are likely to confront us at an increasing rate and that we are ill-equipped to deal with them," Pinker says. He notes that such ideas get loaded with ethical implications that in retrospect often seem ludicrous. The urge to suppress heretical views is, Pinker declares, a recurring human weakness.

[...continue]



November 19, 2006



Perils of Wisdom

We talk about thinking out of the box but some ideas don't even get off the ground because of cultural taboos or political correctness. Here, five experts – including Richard Dawkins – propose the unthinkable …

Today's most shocking proposals are those that provoke outrage: not among the religious or political establishments, but in the heart of every well-meaning, peace-loving, Make Poverty History-marching denizen of the world. Dangerous ideas, according to psychologist Steven Pinker, "are denounced not because they are self-evidently false, nor because they advocate harmful action, but because they are thought to corrode the prevailing moral order" and "challenge the collective decency of an age".

Are suicide bombers driven by sane, moral motives? Do African-American men tend to have higher levels of testosterone than whites? Could it be that some sexual abuse victims suffer no lifelong damage? Have religions caused more human suffering than the Nazis? Is homosexuality the symptom of an infectious disease? Pinker reels off a long list of suggestions that have caused "moral panics" during recent decades. Which of them makes your blood boil?

But hurt feelings are not a measure of the legitimacy of a scientific hypothesis, and Pinker's point is that in attempting to advance our understanding, progressive thinkers must be prepared to question sacred values and break the taboos of political correctness. Scientists, he adds, have always been heretics, and today, "the galloping advances in touchy areas like genetics, evolution and the environmental sciences are bound to throw unsettling possibilities at us. Moreover, the rise of globalisation and the internet are allowing heretics to find one another and work around the barriers of traditional media and academic journals."

The website, www.edge.org, founded by writer John Brockman, allows leading thinkers to engage in uncensored debate, by inviting responses to one provocative question each year. In 2006, Steven Pinker was asked to come up with a query designed to get their intellectual juices flowing. Pinker dared the Edge community to propose "an idea that is dangerous not because it is assumed to be false, but because it might be true". The responses are collected in a new book published this week. Overleaf, we present a selection of the most explosive ideas of our age.

[...continue]



November 17, 2006

The sexiest man living!

Forget that other list. We pick the men who really set our hearts aflame -- and there's nary a pretty-boy actor among them.



Who:
Richard Dawkins
Age: 65
Know him as: Evolutionary scientist and author, most recently of "The God Delusion."

Wonder is sexy. Knowledge is sexy. And embodying both as much as any man in the world today is a man in a tweed jacket riding his bike around the Oxford University campuses, the damp English breeze sweeping a curtain of silver hair from the delicate bones of his face. Yes, those cheekbones, those piercing eyes, that pursed bow of a mouth -- but that brain, oh that brain, oh, god, that brain -- is what makes Richard Dawkins, evolutionary biologist and the most famous atheist in the world, the sexiest man around.

Dawkins is the professor I never had an affair with, whose very sentence structure threatens to weaken my concentration on the content of his words. Call me deluded: I ache for his atheism; I reel from his reasoning. He is my James Bond, a well-attired, fearless seeker of truth in the face of nihilism.

I dream of his perfectly-accented voice -- Oxbridge softened by a childhood spent in, sigh, East Africa -- whispering to me from his latest book, "The God Delusion," a defense of endless curiosity in the face of omnipresent theism. "If the demise of god will leave a gap, different people will fill it in different ways. My way includes a good dose of science, the honest and systematic endeavor to find out the truth about the real world." Take me with you, Richard: You put the "sex" in sexagenarian. Let us clinch in a godless embrace, crying out to what we know does not exist, searching, searching evermore.

— Lauren Sandler

[...continue]



November 12, 2006

Things We Like
What's good out there? What have you read, heard or seen in the last month that really moved and informed you? Books, stories, movies, articles, TV, music, etc.? Send us your suggestions!

Book, nonfiction: "What We Believe but Cannot Prove," edited by John Brockman. The editor, who also runs the very influential Web site Edge (http://www.edge.org), asks some of the most brilliant people in the world one heck of a good question.

[...continue]



November 2006

The Evolution of Future Wealth
Technologies evolve much as species do, and that underappreciated fact is the key to growth

By Stuart A. Kauffman

As economics attempts to model increasingly complicated phenomena, however, it would do well to shift its attention from physics to biology, because the biosphere and the living things in it represent the most complex systems known in nature. In particular, a deeper understanding of how species adapt and evolve may bring profound--even revolutionary--insights into business adaptability and the engines of economic growth.

One of the key ideas in modern evolutionary theory is that of preadaptation. The term may sound oxymoronic, but its significance is perfectly logical: every feature of an organism, in addition to its obvious functional characteristics, has others that could become useful in totally novel ways under the right circumstances. The forerunners of air-breathing lungs, for example, were swim bladders with which fish maintained their equilibrium; as some fish began to move onto the margins of land, those bladders acquired a new utility as reservoirs of oxygen. Biologists say that those bladders were preadapted to become lungs. Evolution can thus innovate in ways that cannot be prestated, and is nonalgorithmic: it drafts and recombines existing entities for new purposes--shifting them from their existing function to some adjacent novel function--rather than inventing features from scratch.

[...continue]



Weekend Edition, November 11-12, 2006

Essays and Opinion

In the 15th century, an emerging middle class had the portrait as a means of public exposure, says Hubert Burda. Today, it's YouTube. Some things never change... more»



November 10, 2006

Losing Our Religion
A gathering of scientists and atheists explores whether faith in science can ever substitute for belief in God.

By Jerry Adler

The great Danish physicist Niels Bohr, it is said, had a good-luck horseshoe hanging in his office. "You don't believe in that nonsense, do you?" a visitor once asked, to which Bohr replied, "No, but they say it works whether you believe in it or not."

If one thing emerged from the "Beyond Belief" conference at the Salk Institute in La Jolla, Calif., it's that religion doesn't work the same way. Some 30 scientists—one of the greatest collections of religious skeptics ever assembled in one place since Voltaire dined alone—examined faith from the evolutionary, neurological and philosophical points of view, and they concluded that some things only work if you do believe in them. Richard Dawkins, the British evolutionary biologist and author of the best-selling book "The God Delusion," said he couldn't have a spiritual experience even when he tried. After another panelist, neuroscientist V.S. Ramachandran of the University of California, San Diego, explained that temporal-lobe seizures of the brain create profound spiritual and out-of-body experiences, Dawkins disclosed that he had participated in an experiment that was supposed to mimic such seizures—and even then he didn't feel a thing.

Dawkins obviously feels this loss is a small price to pay for freedom from superstition. But even physicist Steven Weinberg, a Nobel laureate and an outspoken atheist, acknowledged that science is a poor substitute for the role religion plays in most people's lives. It's hard, he said, to live in a world in which one's highest emotions can be understood in biochemical and evolutionary terms, rather than as a gift from God. Instead of the big, comforting certainties promoted by religion, science can offer only "a lot of little truths" and the austere pleasures of intellectual honesty. Much as Weinberg would like to see civilization emerge from the tyranny of religion, when it happens, "I think we will miss it, like a crazy old aunt who tells lies and causes us all kinds of trouble, but was beautiful once and was with us a long time."

To which Dawkins retorted, "I won't miss her at all." Only in the most extreme circumstances would he deign to take account of the consolations offered by religion. He would not, for instance, try to talk a Christian on his deathbed out of a belief in Heaven. He didn't say what he would do if he were the one near death, but it's unlikely he would be calling for a priest. The atheist philosopher Daniel Dennett had been expected to attend, but two weeks earlier had been rushed to the hospital with a near-fatal aortic rupture. At the conference, people handed around copies of Dennett's essay entitled "Thank Goodness," posted on the science Web site Edge.org, in which he described how annoying it was to hear from friends that they had been praying for his recovery. "I have resisted the temptation," he wrote, "to respond, 'Thanks, I appreciate it, but did you also sacrifice a goat?'"

[...continue]



BEYOND REDUCTIONISM
Reinventing The Sacred
By Stuart A. Kauffman

Introduction

Stuart A. Kauffman studies the origin of life and the origins of molecular organization. Thirty-five years ago, he developed the Kauffman models, which are random networks exhibiting a kind of self-organization that he terms "order for free." He asks a question that goes beyond those asked by other evolutionary theorists: if selection is operating all the time, how do we build a theory that combines self-organization (order for free) and selection? The answer lies in a "new" biology:

"While it may sound as if 'order for free' is a serious challenge to Darwinian evolution, it's not so much that I want to challenge Darwinism and say that Darwin was wrong. I don't think he was wrong at all. I have no doubt that natural selection is an overriding, brilliant idea and a major force in evolution, but there are parts of it that Darwin couldn't have gotten right. One is that if there is order for free — if you have complex systems with powerfully ordered properties — you have to ask a question that evolutionary theories have never asked: Granting that selection is operating all the time, how do we build a theory that combines self-organization of complex systems — that is, this order for free — and natural selection? There's no body of theory in science that does this. There's nothing in physics that does this, because there's no natural selection in physics — there's self organization. Biology hasn't done it, because although we have a theory of selection, we've never married it to ideas of self-organization. One thing we have to do is broaden evolutionary theory to describe what happens when selection acts on systems that already have robust self-organizing properties. This body of theory simply does not exist." (From "Order for Free", Chapter 20, The Third Culture, 1995)

In the following essay, Kauffman frames a new scientific world view of emergence and ceaseless creativity, which, he notes, is "awesome in what has come to pass in reality, and God enough for me and many, where God is the creativity of the universe, yielding a global ethics of respect for all life, the planet, awe, wonder and spirituality cut free from a transcendent God."

JB

STUART A. KAUFFMAN is a professor at the University of Calgary with a shared appointment between biological sciences and physics and astronomy. He is also the leader of the Institute for Biocomplexity and Informatics (IBI), which conducts leading-edge interdisciplinary research in systems biology.

Dr. Kauffman is also an emeritus professor of biochemistry at the University of Pennsylvania, a MacArthur Fellow and an external professor at the Santa Fe Institute. He is the author of The Origins of Order, At Home in the Universe: The Search for the Laws of Self-Organization, and Investigations.

Stuart A. Kauffman's Edge Bio Page

The Reality Club: Jaron Lanier


BEYOND REDUCTIONISM: REINVENTING THE SACRED

A great divide splits contemporary society between those who believe in a transcendent God, and those, including myself, who do not. In the West, and now throughout the world, the massive advances of science since Galileo and Newton have given birth to secular society. In the Christian and Jewish segments of the Abrahamic religions, the theistic God who intervened in the affairs of the world gave way in the Enlightenment to a Deistic God who wound up the universe, set the initial conditions, and allowed Newton's laws to carry on. This God no longer entered into the affairs of man. In the theistic tradition, God became either the God of the gaps, where science had yet to hold sway, or, contrary to science, God intervened in the running of the cosmos.

In the West, those who hold to a view of a theistic God, including the Christian fundamentalists of such power in the United States, find themselves in a cultural war with those who do not believe in a transcendent God, whether agnostic or atheistic. This war is evidenced by the fierce battle over Intelligent Design being waged politically and in the court systems of the United States. While the battleground is Darwinism, the deeply emotional issues are more fundamental. These include the belief of many religious people that without God's authority, morality has no basis. Literally, for those in the West who hold to these views, part of the passion underlying religious conviction is the fear that the very foundations of Western society will tumble if faith in a transcendent God is not upheld.

The majority of the Abrahamic peoples are Muslims. I know the Islamic world poorly, but I believe that their fundamentalism, too, lies in part in these moral issues.

Beyond that, reductionism, wrought by the successes of Galileo, Newton, Einstein, Planck, and Schrödinger, and all that has followed, preeminently in physics, has, as I will expand upon in a moment, left us in a world of fact — cold fact with no scientific place for value. "The more we know of the cosmos, the more meaningless it appears", said Steven Weinberg in Dreams of a Final Theory. For example, Wolfgang Köhler, one of the founders of Gestalt psychology, wrote a mid-20th-century book with the hopeful title The Place of Value in a World of Fact. And just a few days ago, a conversation with a humanist professor at the University of Pennsylvania astonished me with her account of how, in the postmodern world view rampant in the North American humanities, we again inhabit a meaningless world.

On the other side of this vast divide from those who hold to a transcendent God and His authority for meaning and values are the innumerable secular humanists, children of the Enlightenment and contemporary science, who hold firmly to reality as revealed by science and find values in their love for their families and friends, a general sense of fairness, and a morality that needs no basis in God's word. Yet we secular humanists have paid an unspoken price for our firm sense that (reductionist) science tells us what is real. First, we have no well-wrought scientific basis for our humanity — despite the interesting fact that quantum mechanics on the Copenhagen interpretation assumes free-willed physicists who choose what quantum features to measure and thereby change the physical world. The two cultures, science and the humanities, remain firmly un-united. And equally important, we have been subtly robbed of our deep capacity for spirituality. We have come to believe that spirituality is inherently co-localized with a belief in God, and that without such a belief, spirituality is inherently foolish, questionable, without foundation, wishful thinking, silly.

In turn, we lack a global ethic to constitute the transnational mythic value structure that can sustain the emerging global civilization. We tend to believe in the value of democracy and the free market. We are largely reduced to consumers. Here it is telling that Kenneth Arrow, a brilliant Nobel Laureate in economics and a friend, took part in a commission to "place a value" on preservation of National Parks and was stymied in his attempt to find a way to calculate that value based on utility to citizens. Thus, even in our enjoyment of the wild, we are reduced to consumers in our current Weltanschauung.

Two fine authors, Richard Dawkins and Daniel Dennett, have written recent books, The God Delusion and Breaking the Spell, arguing against religion. Their views are based on contemporary science. But the largest convictions of contemporary science remain based on reductionism.

I would like to begin a discussion about the first glimmerings of a new scientific world view — beyond reductionism to emergence and radical creativity in the biosphere and human world. This emerging view finds a natural scientific place for value and ethics, and places us as co-creators of the enormous web of emerging complexity that is the evolving biosphere and human economics and culture. In this scientific world view, we can ask: Is it more astonishing that a God created all that exists in six days, or that the natural processes of the creative universe have yielded galaxies, chemistry, life, agency, meaning, value, consciousness, and culture without a Creator? In my mind and heart, the overwhelming answer is that the truth as best we know it, that all arose with no Creator agent, all on its wondrous own, is so awesome and stunning that it is God enough for me and, I hope, much of humankind.

Thus, beyond the new science in which a new world view glimmers, we have a new view of God: not as transcendent, not as an agent, but as the very creativity of the universe itself. This God brings with it a sense of oneness, of unity, with all of life and our planet — it expands our consciousness and naturally seems to lead to an enhanced potential global ethic of wonder, awe, and responsibility, within the bounded limits of our capacity, for all of life and its home, the Earth, and beyond as we explore the Solar System.

Reductionism

Like any other world view, reductionism is hard to pin down. The modern world view of reductionism clearly grows from the success of modern physics, but finds its roots in ancient Greek philosophy: that all is made of earth, air, fire, and water, or of atoms. Roughly, reductionism is the view that, as Nobel Laureate Steven Weinberg eloquently puts it, the "explanatory arrows always point downward", from society to small groups to individuals to organs to cells to chemistry to physics and ultimately to something like Weinberg's Dreams of a Final Theory, a single set of laws, elegant in their form, like General Relativity, which, in Weinberg's sense, explains all. A large majority of contemporary scientists are reductionists. If pressed, most would roughly say that the behavior of complex wholes is nothing more than the laws governing the behaviors of the parts and their interactions. An example well known in physics is the purported successful reduction of classical thermodynamics to statistical mechanics. Here temperature is equated with the mean kinetic energy of particles, pressure with the energy transfer to bounding walls, and the famous second law of thermodynamics with a "flow" of an isolated thermodynamic system from less to more probable macrostates. I have used the caveat "purported" because — an issue too technical to go into here — the reduction requires the truth of the "ergodic hypothesis", and there is some evidence that it might be false.
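For readers who want the purported reduction spelled out, these are the standard textbook identifications the passage alludes to, written here for an ideal monatomic gas; they are well-established results, not claims specific to this essay.

```latex
% Temperature as mean kinetic energy per particle (ideal monatomic gas):
\left\langle E_{\mathrm{kin}} \right\rangle = \tfrac{3}{2}\, k_B T
% Pressure as momentum transfer to the bounding walls, giving the ideal gas law:
P V = N k_B T
% Boltzmann's entropy, with W the number of microstates realizing a macrostate:
S = k_B \ln W
% The second law as a "flow" toward more probable (larger-W) macrostates,
% so that for an isolated system
\Delta S \ge 0
```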

With reductionism comes the conviction that a court proceeding to try a man for murder is "really" nothing but the movement of atoms, electrons, and other particles in space, quantum and classical events, and ultimately to be explained by, say, string theory.

Beyond Reductionism

We begin with the growing doubt among many physicists themselves that reductionism itself suffices. Nobel Laureate Philip Anderson wrote a famous article, "More is Different", some decades ago, arguing that reductionism is wonderful, but not enough. A computer computing a complex algorithm can be made of transistors or water buckets — it is able to run on multiple physical platforms. Hence reducing the computer to any particular physical basis is insufficient to explain the computer. The drift away from reductionism among physicists is most pronounced among solid state physicists, who deal with such things as metals, glasses, spin glasses, and systems with many "broken symmetries". Robert Laughlin, solid state physicist and Nobel laureate, argues strenuously against the full efficacy of reductionism in A Different Universe. The physicists who hold out for a firm reductionism are, like Weinberg himself, largely high energy particle physicists, seeking that final theory — say string theory.

But it is precisely in the province of string theory itself that doubts are arising. The early hope was that a single string theory would be found that would explain quantum gravity and all the known particles and forces. Such a single string theory would be the answer to Weinberg's dream of a final theory. But at present, it appears that there are as many as 10 to the 500th power string theories. Hope for a single theory is fast fading, and a number of high energy physicists are abandoning reductionism in the sense of finding such a single theory. Thus, Leonard Susskind, in The Cosmic Landscape, suggests a multiverse of "pocket universes", each with a randomly chosen string theory, and a landscape over these pocket universes with respect to those whose laws are life-friendly. As a critical side note, part of Susskind's move is an attempt to explain the roughly 23 physical constants of physics, like the speed of light, the ratio of electron to proton mass, and so on. No one knows where these constants come from or how to explain them. Weinberg himself uttered the "A" word — anthropic. According to this idea, there are many universes, and only those with constants that support the evolution of intelligent life would have such life to wonder at the values of the constants.

In short, many, but not all, physicists are giving up on the adequacy of reductionism alone as a scientific principle to explain the properties of the world. In its stead a new scientific world view is just starting to come into view: emergence.

Emergence

Roughly speaking, emergence breaks into two sub-views: epistemological and ontological emergence. The former says that complex systems are too complex to be explained by reductionist practices, but that ontologically, reductionism holds. The ontological view is that new entities with their own properties and causal powers arise and are part of the furniture of the universe. I hold strongly to this view and will present a number of cases that appear to support it.

1) The origin of life and its non-reducibility to physics. We do not, in fact, know how or where life started, although most scientists believe that life started on earth some 3.8 billion years ago, shortly after the planet cooled enough for liquid water to form. As an alternative, life might have started elsewhere and arrived here through space, Crick's panspermia concept.

There are several alternative views about how life emerged on earth, none established. In short summary: The first view notes the remarkable properties of the DNA and RNA double helix, and hopes that a single strand of RNA can serve as a template primer, adding A, U, C and G nucleotides that Watson-Crick pair with those of the template and are ligated into proper 3'-5' phosphodiester bonds, thereby replicating the template; the two strands then melt apart, and the cycle repeats. Forty years of hard work have not succeeded, for good chemical reasons. Most now doubt that life started this way. The second view is the "RNA World" view. It was discovered that RNA molecules can not only carry genetic information, but act as enzymes, speeding chemical reactions. Work is underway to create an RNA enzyme, or ribozyme, that can copy any RNA molecule including itself. The probability that an RNA molecule can catalyze a given reaction is roughly 10 in 10 to the 15th power, or about one in 10 to the 14th. It is conceivable that such a molecule can arise by chance, but it faces the difficulty that were it to copy itself and make errors, those error copies would be more error prone than the initial copy, and a runaway error catastrophe might ensue.

In short, such a molecule might not be stable in evolution. The third view is the "lipid" view, in which hollow spheres of bilayered lipids, called liposomes, can grow and divide. This has been demonstrated experimentally. It may plausibly be part of the origin of life. The fourth view is my own and that of Freeman Dyson, and may also be part of the origin of life. I noted that cellular life is based on collective autocatalysis, where catalysis is the speeding up of a chemical reaction. Thus imagine two polymers, A and B, where each catalyzes the formation of the other out of fragments of the other. That is collective autocatalysis. No molecule catalyzes its own formation; rather, the set as a whole is collectively autocatalytic, and achieves catalytic closure. Cells are collectively autocatalytic today. Reza Ghadiri has made collectively autocatalytic small protein systems, and Günter von Kiedrowski has made collectively autocatalytic DNA systems. Thus self-reproduction of polymers has been achieved experimentally by good chemists in a lab.

My own theory starts by stating this as a possibility, then goes on to ask whether, in a large set of polymers that can act as substrates and products of reactions and also act as catalysts of those very reactions, one would expect such autocatalytic sets to arise "spontaneously". Strikingly, the answer can be yes, depending upon the ratio of reactions among the polymers in the system to the polymer diversity itself, and the distribution of catalytic capacities for those reactions among the same set of polymers. In simple models, as the diversity of polymers increases, so many reactions are catalyzed that autocatalytic sets form spontaneously with high probability. This part of the theory remains to be tested, but it can be, by use of libraries of random DNA, RNA and proteins. The fifth view is metabolism first. Morowitz believes that metabolism can form autocatalytic cycles on its own, and indeed it does, and that metabolism and autocatalysis arose first.
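Returning for a moment to the fourth, autocatalytic-set view: the claim that catalytic closure becomes likely as molecular diversity grows can be probed with a toy computation. The sketch below is in the spirit of the model just described, not Kauffman's published construction: polymers over a two-letter alphabet, ligation reactions, a random assignment of catalytic capacities, and a check for a reaction set that is both reachable from a "food set" of short polymers and entirely catalyzed by molecules the set itself makes available. All names and parameter values are illustrative assumptions.

```python
# Toy collectively autocatalytic set model (a sketch in the spirit of
# Kauffman's argument, not his published construction).
import itertools
import random

def polymers(max_len, alphabet="ab"):
    """All strings over the alphabet up to length max_len."""
    return ["".join(s) for n in range(1, max_len + 1)
            for s in itertools.product(alphabet, repeat=n)]

def ligations(mols):
    """Reactions x + y -> xy whose product is still inside the molecule set."""
    molset = set(mols)
    return [(x, y, x + y) for x in mols for y in mols if x + y in molset]

def autocatalytic_core(mols, reactions, p_cat, food_len=2, seed=0):
    """Prune reactions until every survivor has its substrates available and a
    catalyst available, where 'available' means food molecules plus products of
    surviving reactions. For ligation-only chemistries the fixed point is
    reachable from the food set, so it behaves like a collectively
    autocatalytic core."""
    rng = random.Random(seed)
    # Each molecule independently catalyzes each reaction with probability p_cat.
    catalysts = {r: {m for m in mols if rng.random() < p_cat} for r in reactions}
    food = {m for m in mols if len(m) <= food_len}
    active = set(reactions)
    while True:
        avail = food | {prod for (_, _, prod) in active}
        kept = {r for r in active
                if r[0] in avail and r[1] in avail and (catalysts[r] & avail)}
        if kept == active:
            return kept
        active = kept

if __name__ == "__main__":
    for max_len in (4, 6, 8):
        mols = polymers(max_len)
        rxns = ligations(mols)
        # Hold the average number of reactions each polymer catalyzes at ~2
        # while diversity grows, and watch the surviving core.
        core = autocatalytic_core(mols, rxns, p_cat=2.0 / len(rxns))
        print(f"max length {max_len}: {len(mols)} polymers, {len(rxns)} reactions,"
              f" surviving autocatalytic core: {len(core)} reactions")
```

The run probes how the size of the surviving core changes as polymer diversity grows with the per-polymer catalytic capacity held fixed, which is the qualitative question the model poses.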

Clearly none of the theories above is adequate. But one gets the firm sense that science is moving in on possible routes to the origin of life on earth. If some combination of the metabolism-first, polymer-autocatalysis, and lipid-first views can be formulated and tested in a new "Systems Chemistry", we may find the answers we seek.

Suppose we do. It will be a scientific triumph, of course. But if such self-reproducing and, via heritable variations, evolving systems are formed, are they ontologically emergent with respect to physics? I believe the answer is yes. Darwin taught us about natural selection and evolution. He did not know the basis for self-reproduction or heritable variation. But given these, evolution by natural selection follows. Such evolving life forms would be subject to Darwin's law, which arises only for entities capable of self-reproduction and heritable variation. This seems clearly to be ontological emergence, not reducible to physics. Like Anderson's computer able to run on transistors or buckets of water, Darwin's natural selection can run on multiple physical platforms, where the entities under selection have their own causal powers, and natural selection cannot be reduced to any specific physical platform.

Indeed, it is possible that minor changes in the physicists' constants would still yield universes in which life, heritable variation, and natural selection would obtain. Note that while the physicist might deduce that a specific set of molecules was self-reproducing, had heritable variations, and instantiated natural selection, one cannot deduce natural selection from the specific physics of any specific case(s), or even of this universe, alone. In short, Darwin's natural selection is a new law operating at the level of self-reproducing entities with heritable variation, regardless of the physical underpinning. In contrast to Weinberg's claim, here the explanatory arrows point upward, from molecules to the evolution of living systems of molecules via natural selection.

2) Agency: You are now reading this article, presumably on purpose. You are able to act on your own behalf. You are the clearest example we have of agency. It is utterly remarkable that agency has arisen in the universe — systems that are able to act on their own behalf, systems that modify the universe on their own behalf. Out of agency come value and meaning. This article either is or is not interesting to you, hence is or is not valuable. It may change your world view, hence have deep meaning.

It becomes interesting to ask what the minimal physical system is that can act as an agent. In Investigations, I sought to answer this, by proposing that a minimal molecular agent is a system which can reproduce itself and carry out at least one work cycle in the thermodynamic sense. I will not go into the ramifications of this, which are puzzling and I hope important. On this account, a bacterium, swimming up a glucose gradient, and performing work cycles, is an agent, and glucose has value and meaning for the bacterium, without assuming consciousness.

Of course it is natural selection that has achieved this coupling. But teleological language has to start somewhere, and I am willing to place it at the start of life. Either here, or later in the evolutionary pathways, meaning and value arise in the biosphere. They too are ontologically emergent. We have a natural place for value in a world of fact, for the world is not just fact: agents act on the world and actions are not just facts, for the action itself is a subset of the causal consequences of what occurs during an act, and that relevant subset cannot be deduced from physics.

3) We are, in fact, conscious. That is, we have experiences of the world. The philosophers call these "qualia". For years, philosophers of mind have tried to argue that such experiences are "ghosts in the machine". This is just false.

We are, in fact, conscious. Whatever explains consciousness, it is clearly ontologically emergent. There are three radically different views on the cause of consciousness, none known to be true.

The first, in the West, is that mind derives from a direct connection to the mind of God — St. Augustine's view, and, to my astonishment, not far from that of Schrödinger, one of the inventors of quantum mechanics. In Tibetan Buddhism, consciousness is continuous, and thus underwrites reincarnation. The second, the predominant view among cognitive scientists, is that consciousness arises when enough computational elements are networked together. In this view, a mind is a machine, and a complex set of buckets of water pouring water into one another would become conscious. I just cannot believe this. I cannot, however, disprove it, but I can offer arguments against it.

On this view, the mind is algorithmic. With Penrose, in The Emperor's New Mind, I believe that the mind is not algorithmic, although it can act algorithmically. If it is not algorithmic, then the mind is not a machine, and consciousness may not arise in a classical system — as opposed, possibly, to a quantum one. Penrose bases his argument on the claim that in seeking a proof a mathematician does not follow an algorithm himself. I think he is right, but the example is not felicitous, for the proof itself is patently an algorithm, and how do we know that the mathematician did not subconsciously follow that algorithm in finding the proof?

My arguments start from humbler conditions. Years ago my computer sat on my front table, plugged into a floor socket. I feared my family would bump into the cord and pull the computer off the table, breaking it. I now describe the table: 3 x 5 feet, three wooden boards on top, legs with certain carvings, chipped paint with the wood surface showing through with indefinitely many distances between points on the chipped flecks, two cracks, one crack seven feet from the fireplace, eleven feet from the kitchen, 365,000 miles from the moon, a broken leaf on the mid board of the top... You get the idea that there is no finite description of the table — assuming, for example, continuous spacetime.

So I invented a solution. I jammed the cord into one of the cracks and pulled it tight so that my family would not be able to pull the computer off the table. Now it seems to me that there is no way to turn this Herculean mental performance into an algorithm. How would one bound the features of the situation finitely? How would one even list the features of the table in a denumerably infinite list? One cannot. Thus it seems to me that no algorithm was performed. As a broader case, we are all familiar with struggling to formulate a problem. Do you remotely think that your struggle is an effective "mechanical" or algorithmic procedure? I do not. I also do not know how to prove that a given performance is not algorithmic. What would count as such a proof? So I must leave my conviction with you, unproven, but powerful I think. If true, then the mind is not a machine.

The third view of mind and consciousness, which I tentatively favor, is that it is related to quantum behavior. The standard physicist's answer is that quantum effects cannot occur at body temperature. Indeed, Schrödinger says this, then says of consciousness, "I am become God". However, recent theorems in quantum computing, and facts about cells, cast doubt on this conclusion. The theorems show that, if measurements are made and work is done on a quantum computer, its qubits can remain "quantum coherent" when they should "decohere" towards classical behavior. Thus, if work is done on a system, parts of it may remain quantum coherent at body temperature, in principle.

But cells do thermodynamic work, and might be able to carry out such measurements and work to maintain some variables quantum coherent. Second, cells are crowded by proteins and other molecules, and the water between these molecules is largely ordered, not like an ordinary liquid. This may permit quantum coherence physically in cells. No one knows. It seems worth investigation in its own right. Meanwhile, my approximate theory is that mind is acausal; that quantum mechanics is acausal on the familiar Born interpretation of the Schrödinger equation (to the grief of Einstein); that consciousness is due to a special state in which a system is persistently poised between quantum and classical behavior; that the emergence of classical behavior in the mind-brain system, perhaps by decoherence, is the mind "making something actual" happen in the physical world; and — big jump — that consciousness itself consists in this quantum coherent state as lived by the organism. This is a long jump, but not impossible. I don't even think it is stupider than other theories of consciousness, and it may be true. Whatever the case, consciousness is ontologically emergent in this universe.

The Biosphere and Human Culture are Ceaselessly Creative in Ways that Cannot be Foretold

The third, rather astonishing theme that is emerging in this new world view is that the biosphere and human culture are ceaselessly creative in ways that are fundamentally unpredictable and presumably not algorithmic or machine-like.

I begin with Darwinian adaptations and preadaptations. Were one to ask Darwin what the function of the heart is, he would have replied, "To pump blood". That is, the causal consequence of the heart by virtue of which it was selected by natural selection is pumping blood. But the heart also makes heart sounds. These are not the function of the heart. Thus, the function of the heart is a subset of its causal consequences and must be analyzed in the context of the whole organism in its selective environment. Again this says that biology cannot be reduced to physics, for while the string theorist might (actually could not) deduce all the properties of a given heart, he or she would have no way to pick out, as the relevant property, that of pumping blood. But it is that property that accounts for the existence of hearts in the biosphere.

Now a Darwinian preadaptation is a causal consequence of a part of an organism that has no selective significance in the normal environment, but which might be of use in some odd environment, and hence become the subject of natural selection. Here the organ was "preadapted" for this novel function in the biosphere. A fanciful example concerns the squirrel Gertrude, who happened to have a single dominant Mendelian mutation that gave her flaps of skin from arms to legs on both sides. (Darwinian preadaptations need not rely on new mutations in general, but I use one for my friend Gertrude, who lived 65,394,003 years ago in Guatemala.) Gertrude was so ugly that the rest of the squirrels would not play or eat with her. She was in a magnolia tree eating lunch, sadly and alone, when Bertha, an early owl in a neighboring pine, spied Gertrude, thought "Lunch", and dived towards Gertrude, horrid claws extended... Gertrude was terrified. Suddenly she jumped from the tree, arms and legs flung wide. "Ghaaaa!" cried Gertrude, then looked, incredulous, as she flew. And she escaped the befuddled Bertha. Well, Gertrude became a heroine in her clan, was married in a lovely civil ceremony to a handsome squirrel not a month later, and thanks to her dominant mutation, all their children had similar flaps of skin. And that is how flying squirrels came to exist in the biosphere. I like Gertrude a lot.

It is critical that virtually any extant feature of an organism can become the subject of natural selection in the appropriate environment, and typically, if selected, a novel functionality arises in the biosphere and universe. Now the critical question: Do you think you could say ahead of time, or finitely prestate, all possible Darwinian preadaptations of, say, the species alive now, or even of humans? I have not found anyone who thought the answer was yes. I do not know how to prove my claim that the answer is "No", but part of the problem is that we cannot finitely prestate the relevant features of all possible selective environments for all organisms with respect to all their features.

But the failure to prestate the possible preadaptations is not slowing down the evolution of the biosphere, where preadaptations are widespread. Thus, ever novel functionalities come to exist and proliferate in the biosphere. The fact that we cannot prestate them is essential, and an essential limitation on the way Newton taught us to do science: prestate the relevant variables, the forces acting among them, and the initial and boundary conditions, and calculate the future evolution of the system, say a projectile. But we cannot prestate the relevant causal features of organisms in the biosphere. We do not now know the relevant variables! Thus we cannot write down a set of equations for the temporal evolution of these variables. We are profoundly precluded from the Newtonian move. In short, the evolution of the biosphere is radically unknowable, not due to quantum throws of the dice, or deterministic chaos, but because we cannot prestate the macroscopic relevant features of organisms and environments that will lead to the emergence of novel functions in the biosphere, with their own causal properties, that in turn alter the future evolution of the biosphere. Thus, the evolution of the biosphere is radically creative, ceaselessly creative, in ways that cannot be foretold. I find this wonderful.
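For contrast, the Newtonian procedure just described can be carried through completely for the projectile example: prestate the variables, the force, and the initial conditions, and the whole future trajectory follows. This is the standard textbook case, added here only to make the contrast concrete.

```latex
% Variables: horizontal and vertical position x(t), y(t).
% Force: uniform gravity of magnitude g acting downward.
% Initial conditions: launch speed v_0 at angle \theta from the origin.
% Newton's second law then fixes the entire future evolution:
x(t) = v_0 \cos\theta \; t, \qquad
y(t) = v_0 \sin\theta \; t \;-\; \tfrac{1}{2} g t^{2}.
% No analogous prestatement of variables is available for the biosphere.
```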

I believe this fact means that the evolution of the biosphere is non-algorithmic. It cannot be simulated, certainly not if continuous spacetime and quantum mechanics play a role.

The same Darwinian preadaptations occur in the evolution of the economy. A story concerns the engineers trying to invent the tractor. They would need a massive engine block. They tried it on chassis after chassis, all of which broke. Finally one of the engineers said, "The engine block itself is so massive and rigid that we can use the engine block itself as the chassis." And that is how tractors are made. Now the rigidity of the engine block was a Darwinian preadaptation, a causal feature useful for a new function. Its discovery was a true invention. But this means that the technological evolution of the econosphere is also not finitely prestatable, nor presumably algorithmic. It too is ceaselessly creative, expanding from some 1,000 goods and services, say, 50,000 years ago to perhaps 10 billion today.

And human culture, in general, is ceaselessly creative as the biosphere and culture expand into what I call the Adjacent Possible. Here the point is that, at levels of complexity above the atom, the universe has not had time to make all possible complex objects, such as all proteins of length 200. The universe, at these levels of complexity, is on a unique trajectory. So when my friend Gertrude flew, she changed the material and behavioral features of the evolving universe. So did Picasso.
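A back-of-the-envelope calculation, using standard round numbers that are not given in the essay, shows the scale of the point about proteins of length 200:

```latex
% Number of possible proteins of length 200, with 20 amino acids per position:
20^{200} = 10^{\,200 \log_{10} 20} \approx 10^{260}.
% A generous upper bound on the trials the universe could have made: every
% particle in the observable universe (~10^{80}) producing one protein per
% Planck time (~10^{-43}\,\mathrm{s}) for the age of the universe
% (~10^{17}\,\mathrm{s}) gives
10^{80} \times \frac{10^{17}}{10^{-43}} = 10^{140} \ \text{trials} \ \ll\ 10^{260}.
```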

In short, in wondrous ways, our universe, biosphere, econosphere, and culture are ceaselessly creative and emergent. The two cultures, science and the humanities, stand united in this world view. Meaning and value have a scientific base. And ethics? At a recent meeting on science and religion on Star Island, we heard more than one lecture on animal emotions and the sense of fairness in chimpanzees. Group selection, we were told, is now making its way into evolutionary biology. With it, natural selection can get its grip on behaviors that are advantageous to the group, like fairness, and so fairness emerges. Far from evolution being anathema to ethics, evolution is the first source of human morality. But not the last, for we can argue whether we should want what we want.

God and a Global Ethic

God is the most powerful symbol we have created. The Spaniards in the New World built their churches on the holy sites of those they vanquished. Notre Dame sits on a Druid holy site. Shall we use the God word? It is our choice. Mine is a tentative "yes". I want God to mean the vast ceaseless creativity of the only universe we know of, ours. What do we gain by using the God word? I suspect a great deal, for the word carries with it awe and reverence. If we can transfer that awe and reverence, not to the transcendental Abrahamic God of my Israelite tribe long ago, but to the stunning reality that confronts us, we will grant permission for a renewed spirituality, and awe, reverence and responsibility for all that lives, for the planet.

Does one know that such a transformation of human sensibilities will happen? Of course not. But the sense of justice matured in the Abrahamic tradition from 10 eyes for an eye, to an eye for an eye, to love thine enemy as thyself. Then can a heightened consciousness bring about a global ethic? I believe so. I believe, I hope correctly, that what I have sketched above is true, points to a new vision of our co-creating reality, that it invites precisely an enhancement of our sense of spirituality, reverence, wonder, and responsibility, and can form the basis of a trans-national mythic structure for an emerging global civilization.

Co-Evolving Traditions

To ever succeed, this new view needs to be soft-spoken. You see, we can say: here is reality; is it not worthy of stunned wonder? What more could we want of a God? Yes, we give up a God who intervenes on our behalf. We give up heaven and hell. But we gain ourselves, responsibility, and maturity of spirit. I know that saying that ethics derives from evolution undercuts the authority of God as its source. But do we need such a God now? I think not. Nor do we need the spiritual wasteland that post-modernism has brought us. Beyond my admired friend Kenneth Arrow's attempt to calculate their utility, natural parks are valuable because life is valuable on its own, a wonder of emergence, evolution and creativity. Reality is truly stunning. So if you find this useful, let us go forth, as was said long ago, and invite consideration by others of this new vision of reality. With it, let us recreate spiritual community and membership. Let us go forth. Civilization needs to be changed.


References:

Anderson, P. 1972. More is different. Science 177: 393–396.
Dawkins, R. 2006. The God Delusion. Houghton Mifflin, Boston.
Dennett, D. 2006. Breaking the Spell: Religion as a Natural Phenomenon. Viking Adult, NYC.
Laughlin, R. 2005. A Different Universe: Reinventing Physics from the Bottom Down. Basic Books, NYC.
Kauffman, S. 2000. Investigations. Oxford University Press, Oxford.
Schrödinger, E. 2946. What is Life? Macmillan, NYC.
Susskind, L. 2006. The Cosmic Landscape: String Theory and the Illusion of Intelligent Design. Little, Brown and Company, NYC.
Weinberg, S. 1992. Dreams of  a Final Theory. Vintage Books, NYC.



MY GOD PROBLEM
By Natalie Angier

Introduction

In a talk in London a few months ago, Ian McEwan noted that looking back at the mid-70s, "none of us ...would have thought [that] we'd be devoting so much mental space now to confront religion. We thought that matter had long been closed." Indeed, earlier this year sixteen scientists, all Edge contributors, dropped everything to write authoritative essays for a book published on a crash schedule to rebut the hoax known as "Intelligent Design". One of the most perceptive comments about the book, Intelligent Thought: Science versus the Intelligent Design Movement, was in an Orlando Weekly review:

The worst kind of argument to have is one with someone who Just Doesn't Get It. The debates that find your well-reasoned points countered with the tautological equivalent of "nuh-uh" or "because, that's why" may not make you feel like you lost the argument, but you certainly don't feel like you won, either. Especially when the topic you're disagreeing on isn't even something that should be up for debate. ...

That's the overriding sense one suspects the writers of the essays in Intelligent Thought were experiencing when they put pen to paper. More than one of them, I'm sure, muttered to himself: "I can't fucking believe I'm having to write this".

("Science vs. Stupid" by Jason Ferguson)

We did not spend the time and take the trouble to publish the book to convert religious fundamentalists to science-based thinking, but simply to have a place marker to sit on the desks of educators, newspaper and magazine editors, and politicians, which could, at a minimum, serve as a talisman for rational thinking and ideas. In a nation where, during a single week, the President of the United States, the Majority Leader of the Senate, and the leading candidate for the Republican presidential nomination in 2008 (Senator John McCain) all endorsed teaching Christian fundamentalist religious dogma in public school science classes, business as usual is off the table.

And now, more than ever, it's time for extreme voices.

Natalie Angier, the Pulitzer Prize-winning New York Times science journalist, understands this. It was her ringing endorsement of atheism in her widely discussed 2004 review of Sam Harris's first book, The End of Faith, in The New York Times Book Review (see "'The End of Faith': Against Toleration") that, in part, set the stage for the current conversation about the recent books of Daniel C. Dennett, Richard Dawkins, and Sam Harris. This resulted in something unthinkable just a few years ago: the two leading national news magazines, Time and Newsweek, both running cover stories during the same week presenting the ideas of leading atheists to the American reading public.

"There is something to be said for a revival of pagan peevishness and outspokenness," Angier wrote in 2001. "It's not that I would presume to do something as foolish and insulting as try to convert a believer. Arguments over the question of whether God exists are ancient, recurring, sometimes stimulating but more often tedious. Arrogance and righteousness are nondenominational vices that entice the churched and unchurched alike. ... Still, the current climate of religiosity can be stifling to nonbelievers, and it helps now and then to cry foul. "

Read on.

NATALIE ANGIER won the Pulitzer Prize for beat reporting as a science writer for The New York Times. She is the author of Natural Obsessions, The Beauty of the Beastly, Woman: An Intimate Geography, and the forthcoming The Canon: A Whirligig Tour of the Beautiful Basics of Science.

Natalie Angier's Edge Bio Page


MY GOD PROBLEM

In the course of reporting a book on the scientific canon and pestering hundreds of researchers at the nation's great universities about what they see as the essential vitamins and minerals of literacy in their particular disciplines, I have been hammered into a kind of twinkle-eyed cartoon coma by one recurring message. Whether they are biologists, geologists, physicists, chemists, astronomers, or engineers, virtually all my sources topped their list of what they wish people understood about science with a plug for Darwin's dandy idea. Would you please tell the public, they implored, that evolution is for real? Would you please explain that the evidence for it is overwhelming and that an appreciation of evolution serves as the bedrock of our understanding of all life on this planet?

In other words, the scientists wanted me to do my bit to help fix the terrible little statistic they keep hearing about, the one indicating that many more Americans believe in angels, devils, and poltergeists than in evolution. According to recent polls, about 82 percent are convinced of the reality of heaven (and 63 percent think they're headed there after death); 51 percent believe in ghosts; but only 28 percent are swayed by the theory of evolution.

Scientists think this is terrible—the public's bizarre underappreciation of one of science's great and unshakable discoveries, how we and all we see came to be—and they're right. Yet I can't help feeling tetchy about the limits most of them put on their complaints. You see, they want to augment this particular figure—the number of people who believe in evolution—without bothering to confront a few other salient statistics that pollsters have revealed about America's religious cosmogony. Few scientists, for example, worry about the 77 percent of Americans who insist that Jesus was born to a virgin, an act of parthenogenesis that defies everything we know about mammalian genetics and reproduction. Nor do the researchers wring their hands over the 80 percent who believe in the resurrection of Jesus, the laws of thermodynamics be damned.

No, most scientists are not interested in taking on any of the mighty cornerstones of Christianity. They complain about irrational thinking, they despise creationist "science," they roll their eyes over America's infatuation with astrology, telekinesis, spoon bending, reincarnation, and UFOs, but toward the bulk of the magic acts that have won the imprimatur of inclusion in the Bible, they are tolerant, respectful, big of tent. Indeed, many are quick to point out that the Catholic Church has endorsed the theory of evolution and that it sees no conflict between a belief in God and the divinity of Jesus and the notion of evolution by natural selection. If the pope is buying it, the reason for most Americans' resistance to evolution must have less to do with religion than with a lousy advertising campaign.

So, on the issue of mainstream monotheistic religions and the irrationality behind many of religion's core tenets, scientists often set aside their skewers, their snark, and their impatient demand for proof, and instead don the calming cardigan of a kiddie-show host on public television. They reassure the public that religion and science are not at odds with one another, but rather that they represent separate "magisteria," in the words of the formerly alive and even more formerly scrappy Stephen Jay Gould. Nobody is going to ask people to give up their faith, their belief in an everlasting soul accompanied by an immortal memory of every soccer game their kids won, every moment they spent playing fetch with the dog. Nobody is going to mock you for your religious beliefs. Well, we might if you base your life decisions on the advice of a Ouija board; but if you want to believe that someday you'll be seated at a celestial banquet with your long-dead father to your right and Jane Austen to your left, and that she'll want to talk to you for another hundred million years or more—that's your private reliquary, and we're not here to jimmy the lock.

Consider the very different treatments accorded two questions presented to Cornell University's "Ask an Astronomer" Web site. To the query, "Do most astronomers believe in God, based on the available evidence?" the astronomer Dave Rothstein replies that, in his opinion, "modern science leaves plenty of room for the existence of God . . . places where people who do believe in God can fit their beliefs in the scientific framework without creating any contradictions." He cites the Big Bang as offering solace to those who want to believe in a Genesis equivalent and the probabilistic realms of quantum mechanics as raising the possibility of "God intervening every time a measurement occurs" before concluding that, ultimately, science can never prove or disprove the existence of a god, and religious belief doesn't—and shouldn't—"have anything to do with scientific reasoning."

How much less velveteen is the response to the reader asking whether astronomers believe in astrology. "No, astronomers do not believe in astrology," snarls Dave Kornreich. "It is considered to be a ludicrous scam. There is no evidence that it works, and plenty of evidence to the contrary." Dr. Kornreich ends his dismissal with the assertion that in science "one does not need a reason not to believe in something." Skepticism is "the default position" and "one requires proof if one is to be convinced of something's existence."

In other words, for horoscope fans, the burden of proof is entirely on them, the poor gullible gits; while for the multitudes who believe that, in one way or another, a divine intelligence guides the path of every leaping lepton, there is no demand for evidence, no skepticism to surmount, no need to worry. You, the religious believer, may well find subtle support for your faith in recent discoveries—that is, if you're willing to upgrade your metaphors and definitions as the latest data demand, seek out new niches of ignorance or ambiguity to fill with the goose down of faith, and accept that, certain passages of the Old Testament notwithstanding, the world is very old, not everything in nature was made in a week, and (can you turn up the mike here, please?) Evolution Happens.

And if you don't find substantiation for your preferred divinity or your most cherished rendering of the afterlife somewhere in the sprawling emporium of science, that's fine, too. No need to lose faith when you were looking in the wrong place to begin with. Science can't tell you whether God exists or where you go when you die. Science cannot definitively rule out the heaven option, with its helium balloons and Breck hair for all. Science in no way wants to be associated with terrifying thoughts, like the possibility that the pericentury of consciousness granted you by the convoluted, gelatinous, and transient organ in your skull just may be the whole story of you-dom. Science isn't arrogant. Science trades in the observable universe and testable hypotheses. Religion gets the midnight panic fêtes. But you've heard about evolution, right?

So why is it that most scientists avoid criticizing religion even as they decry the supernatural mind-set? For starters, some researchers are themselves traditionally devout, keeping a kosher kitchen or taking Communion each Sunday. I admit I'm surprised whenever I encounter a religious scientist. How can a bench-hazed Ph.D., who might in an afternoon deftly purée a colleague's PowerPoint presentation on the nematode genome into so much fish chow, then go home, read in a two-thousand-year-old chronicle, riddled with internal contradictions, of a meta-Nobel discovery like "Resurrection from the Dead," and say, gee, that sounds convincing? Doesn't the good doctor wonder what the control group looked like?

Scientists, however, are a far less religious lot than the American population, and, the higher you go on the cerebro-magisterium, the greater the proportion of atheists, agnostics, and assorted other paganites. According to a 1998 survey published in Nature, only 7 percent of members of the prestigious National Academy of Sciences professed a belief in a "personal God." (Interestingly, a slightly higher number, 7.9 percent, claimed to believe in "personal immortality," which may say as much about the robustness of the scientific ego as about anything else.) In other words, more than 90 percent of our elite scientists are unlikely to pray for divine favoritism, no matter how badly they want to beat a competitor to publication. Yet only a flaskful of the faithless have put their nonbelief on record or publicly criticized religion, the notable and voluble exceptions being Richard Dawkins of Oxford University and Daniel Dennett of Tufts University. Nor have Dawkins and Dennett earned much good will among their colleagues for their anticlerical views; one astronomer I spoke with said of Dawkins, "He's a really fine parish preacher of the fire-and-brimstone school, isn't he?"

So, what keeps most scientists quiet about religion? It's probably something close to that trusty old limbic reflex called "an instinct for self-preservation." For centuries, science has survived quite nicely by cultivating an image of reserve and objectivity, of being above religion, politics, business, table manners. Scientists want to be left alone to do their work, dazzle their peers, and hire grad students to wash the glassware. When it comes to extramural combat, scientists choose their crusades cautiously. Going after Uri Geller or the Raëlians is risk-free entertainment, easier than making fun of the sociology department. Battling the creationist camp has been a much harder and nastier fight, but those scientists who have taken it on feel they have a direct stake in the debate and are entitled to wage it, since the creationists, and more recently the promoters of "intelligent design" theory, claim to be as scientific in their methodology as are the scientists.

But when a teenager named Darrell Lambert was chucked out of the Boy Scouts for being an atheist, scientists suddenly remembered all those gels they had to run and dark matter they had to chase, and they kept quiet. Lambert had explained the reason why, despite a childhood spent in Bible classes and church youth groups, he had become an atheist. He took biology in ninth grade, and, rather than devoting himself to studying the bra outline of the girl sitting in front of him, he actually learned some biology. And what he learned in biology persuaded him that the Bible was full of . . . short stories. Some good, some inspiring, some even racy, but fiction nonetheless. For his incisive, reasoned, scientific look at life, and for refusing to cook the data and simply lie to the Boy Scouts about his thoughts on God—as some advised him to do—Darrell Lambert should have earned a standing ovation from the entire scientific community. Instead, he had to settle for an interview with Connie Chung, right after a report on the Gambino family.

Scientists have ample cause to feel they must avoid being viewed as irreligious, a prionic life-form bent on destroying the most sacred heifer in America. After all, academic researchers graze on taxpayer pastures. If they pay the slightest attention to the news, they've surely noticed the escalating readiness of conservative politicians and an array of highly motivated religious organizations to interfere with the nation's scientific enterprise—altering the consumer information Web site at the National Cancer Institute to make abortion look like a cause of breast cancer, which it is not, or stuffing scientific advisory panels with anti-abortion "faith healers."

Recently, an obscure little club called the Traditional Values Coalition began combing through descriptions of projects supported by the National Institutes of Health and complaining to sympathetic congressmen about those they deemed morally "rotten," most of them studies of sexual behavior and AIDS prevention. The congressmen in turn launched a series of hearings, calling in institute officials to inquire who in the Cotton-pickin' name of Mather cares about the perversions of Native American homosexuals, to which the researchers replied, um, the studies were approved by a panel of scientific experts, and, gee, the Native American community has been underserved and is having a real problem with AIDS these days. Thus far, the projects have escaped being nullified, but the raw display of pious dentition must surely give fright to even the most rakishly freethinking and comfortably tenured professor. It's one thing to monkey with descriptions of Darwinism in a high-school textbook. But to threaten to take away a peer-reviewed grant! That Dan Dennett; he is something of a pompous leafblower, isn't he?

Yet the result of wincing and capitulating is a fresh round of whacks. Now it's not enough for presidential aspirants to make passing reference to their "faith." Now a reporter from Newsweek sees it as his privilege, if not his duty, to demand of Howard Dean, "Do you see Jesus Christ as the son of God and believe in him as the route to salvation and eternal life?" In my personal fairy tale, Dean, who as a doctor fits somewhere in the phylum Scientificus, might have boomed, "Well, with his views on camels and rich people, he sure wouldn't vote Republican!" or maybe, "No, but I hear he has a Mel Gibson complex." Dr. Dean might have talked about patients of his who suffered strokes and lost the very fabric of themselves and how he has seen the centrality of the brain to the sense of being an individual. He might have expressed doubts that the self survives the brain, but, oh yes, life goes on, life is bigger, stronger, and better endowed than any Bush in a jumpsuit, and we are part of the wild, tumbling river of life, our molecules were the molecules of dinosaurs and before that of stars, and this is not Bulfinch mythology, this is corroborated reality.

Alas for my phantasm of fact, Howard Dean, M.D., had no choice but to chime, oh yes, he certainly sees Jesus as the son of God, though he at least dodged the eternal life clause with a humble mumble about his salvation not being up to him.

I may be an atheist, and I may be impressed that, through the stepwise rigor of science, its Spockian eyebrow of doubt always cocked, we have learned so much about the universe. Yet I recognize that, from there to here, and here to there, funny things are everywhere. Why is there so much dark matter and dark energy in the great Out There, and why couldn't cosmologists have given them different enough names so I could keep them straight? Why is there something rather than nothing, and why is so much of it on my desk? Not to mention the abiding mysteries of e-mail, like why I get exponentially more spam every day, nine-tenths of it invitations to enlarge an appendage I don't have.

I recognize that science doesn't have all the answers and doesn't pretend to, and that's one of the things I love about it. But it has a pretty good notion of what's probable or possible, and virgin births and carpenter rebirths just aren't on the list. Is there a divine intelligence, separate from the universe but somehow in charge of the universe, either in its inception or in twiddling its parameters? No evidence. Is the universe itself God? Is the universe aware of itself? We're here. We're aware. Does that make us God? Will my daughter have to attend a Quaker Friends school now?

I don't believe in life after death, but I'd like to believe in life before death. I'd like to think that one of these days we'll leave superstition and delusional thinking and Jerry Falwell behind. Scientists would like that, too. But for now, they like their grants even more.

[First published in The American Scholar 72, no. 2, Spring 2004]



Douglas Rushkoff

To me, the most remarkable result of Renaissance-era portraiture was how it coincided with the rebirth of the notion of an "individual" at all. Just as Augustus's Ancient Greek contemporaries invented the formal definition of the individual in the first place, the Renaissance — as its name implies — "rebirthed" this notion along with new styles and technologies of representation.

In the Renaissance, individuality was not limited to the art patron who could afford to have himself painted, but defined just as powerfully by the spectators to these paintings. The invention of "perspective" painting, itself, conveyed the importance of an individual's point-of-view of a given landscape, and served as a valuable allegory to the emerging notion of individual experience, opinion, and even rights.

And of course, the Renaissance was followed by the Baroque, where the uniqueness of each individual portrait would be even further celebrated by an ornate physical frame, and then the Enlightenment, where the idea of individuality would crescendo with the expression of every man's opinion in the form of a vote.

But, up to this point, the individual member of society-at-large is being defined more by his response than his expression. Right through the age of broadcast media, it is the elite who have the opportunity to get their image in a painting, on a movie screen, or in the TV set, and the vast majority of "individuals" who merely get to have an opinion on that person or image. And then, to voice that opinion through consumption or voting behavior.

As Burda suggests, the advent of interactivity and online publishing changes that equation. Individuality is no longer defined by what media we consume, but by what we produce. Today, the amateur blogger, podcaster, or YouTuber can potentially post his image just as far and wide as a king or media kingpin of ages past. The key distinction, of course, is that "potentially" part. The relative scarcity of imagery in a top-down media universe — from the era of the printing press and portraiture to broadcast television and Warholian silk screen — made it easier to guarantee reaching a wide audience.

Today, the relative scarcity of eyeball-hours compared to media being produced can lead, as Burda poses, to a sense of self-worth dependent on hit counts: "How often do I appear in the media, in which ones, and how often am I quoted?"

While I agree with this part of Burda's assessment, I'd like to suggest that this particular dynamic results from applying a Renaissance-era notion of individuality and worth to a very different era. While the original Renaissance celebrated the "individual," we may be moving into a cultural era that favors the collective or the network over individuality. No, we don't see a whole lot of evidence for this in the current, adolescent, exhibitionist culture of YouTube. But I do believe it is the logical next step for a generation growing up with fame and individual recognition as such clearly temporary and ethereal phenomena. LonelyGirl14 on MySpace, famous this year for her self-searching videos, will be forgotten by next year and replaced by LonelyGirl15.

And as the millions of former "individuals" reproducing their images online get used to this temporality, their attention will turn instead to the project they are building together. The network itself will become more interesting as a collaborative creation than any particular individual within it — just as museums became more interesting than any of their individual works. And it's then — during our own version of the Baroque era — that we'll find out what mass enlightenment might really be about.


Yossi Vardi

In his essay, Hubert Burda makes a very important and compelling observation which provides a new and better insight for understanding one of the most important driving forces of the Internet. Tens of millions of users are spending time creating content that describes themselves, their preferences, their appearance, their activities. You can find it in blogs, videos, sites, etc.

Burda makes the interesting analogy between the self-portraiture of earlier times and this sub-segment of self-publishing. His insight wraps MySpace, YouTube, blogs, and personal profiles under one new category: "self portrayal". We knew that User Generated Content is huge and self-publishing is huge. Now we understand that the biggest segment of these two phenomena is Self Portrayal.

The Burda insight can be summed up as follows: "I am self-portraying, therefore I am."


Jaron Lanier

Here is a story I've heard told many times among professional photographers that might not be entirely apocryphal: In the early years of photography, enterprising salesmen are said to have traveled around, especially in the American West, with pre-fabricated portraits of various iconic sorts of people. The macho guy with a beard, the dignified frontier businessman. These would be sold based on their similarity to an individual. So you'd buy a photo of a model who looked vaguely like yourself or a relative. It made economic sense. It was hard to travel around with a darkroom and photographs were wonderful and novel. (Maybe a historian will write in to resolve whether this really happened.)

I think photographers like the story because it suggests that one of the earliest instances of egalitarian photography transcended the function of realistic documentation. It seems to me something similar is going on with the Internet at this time. There is a cluster of portraits surrounding the typical user online, and yet each of them is highly specialized and ritualistic, and actually has very little to do with the person as a whole or as an individual. The underlying environment is the same as in the story about photography: we are experiencing a very early, awkward version of a new media technology.

One of the online pseudo-portraits is the attention-seeking self-portrayal now expected from all of us on MySpace. (I regularly get complaints that I haven't put one up.) Since there are only a few archetypes, ideals or icons to strive for in comparison to the vastnesses of instances of everything online, quirks and idiosyncrasies stand out better than grandeur in this new domain. I imagine Augustus' MySpace page would have pictured him picking his nose.

Then there's the involuntary portraiture of marketing profiles compiled automatically by companies like Google, credit rating agencies, and other firms that make their money by enumerating personal differences. These are portraits of ourselves we often aren't able to see, and many of them wouldn't be readable by a person anyway.

There is, however, an emerging class of digital self-representations that interests me a great deal and gives me hope. Consider Second Life, where people are represented as avatars. I have high hopes for avatars — these are puppets or digital costumes worn in virtual environments by the individuals who design them.

One interesting quality of avatars is that the design space is vast enough that an individual can avoid becoming lost in the clutter of other people's imaginations without an enormous amount of effort. Also, avatars aren't just a digital version of something that came before, like a photograph. This fact frames a rare instance of primal freedom for digital culture. Since there wasn't a flood of classical avatars from the 1960s or any other period, avatars aren't burdened by cultural legacy from the "old media." Although a lot of people on Second Life choose somewhat clichéd avatars, particularly that of the prostitute, those who want to stand out are still able to, despite the growth of the site. This is an important experiment. Maybe it is possible to avoid the commoditization of individuality that is common in sites like MySpace.

Douglas Rushkoff's response demonstrates why it is so important that the means of self-portrayal become as good and as rich as possible, as we climb up the fateful ramp of Moore's Law in the sphere of culture and public affairs. In retrospect, the historical periods so vividly described by Burda can be seen as kaleidoscopic structures in which the relatively small number of portraits that could be brought into existence had innumerable additional lives as reflected surrogate portraits, like those pseudo-portraits of the old West. The further back in time you go, the fewer widely portrayed people there were, going back to the singular instance of Augustus. But with each development in the art of portraiture over the centuries, the many un-portrayed individuals who comprised the masses were ever more shaped by the iconic or archetypal idea of the individual conveyed by the portraits at the center of the kaleidoscope.

In that last gasp of centralized media which we are experiencing now, it is still possible to watch grown men go into a theater and identify with, well, James Bond, Borat, or whoever, but there are a relatively small number of choices. The phenomenon of the hero (or anti-hero) has been encouraged at least in part by technological limitations — whether broadcast by Homeric bards in ancient times or more recently by cable TV — but in all cases from the center of the kaleidoscope.

When the Internet replaces all the mirrors in the kaleidoscope with display screens, there are still as many images. Now each image is actually a portrait of someone, but alas those many portraits have lost their potency. There is no more resonance — only, as Rushkoff and Andy Warhol have said, a skittering progression of attention to transient fragments of what might be people.

Unless self-portrayal gets good enough to sustain the individual against the onslaught of the digital collective, the individual will suffer and perhaps die. This is both an artistic and technological challenge. Rushkoff, it seems, and a number of other edge.org habitués celebrate the digital collective and look forward to it, because for them it provides a sense of destiny — a religion. As everyone around edge.org knows by now, I think the emerging digital collective is more likely to be cruel and idiotic and I prefer to celebrate individuals.

At any rate, the future of digital portraiture —in particular the question of how luscious and vastly varied it can be — will be central to the unfolding contest between individual, heroic people and digital triumphalism, and I am rooting for people.


Jaron Lanier responds to "Reinventing the Sacred" by Stuart Kauffman


Jaron Lanier

Am I getting old and jaded? I worry that in the matter of fundamental beliefs and worldviews, people as a whole, including me, are not to be trusted.

To be cynical, I could point out that starting a new religion is the biggest business of all — so Stuart Kauffman might make scientists everywhere rich.

The only problem is that people don't consistently have the discipline to avoid the slide into silly superstitions, no matter how carefully their underlying beliefs might have been constructed or vetted. Buddhism, which some claim to be the least dangerous religion, still served as the foundation of the beliefs of a group that released sarin gas in the Tokyo subways, and of course the major supposedly enlightened materialist movements of the 20th century (those referring to Marx) were moral disasters. Someone will always show up to promote nonsense for short-sighted gains.

The best strategy for us humans is to have an interminable and endlessly confusing ecology of conflicting metaphysical or underlying beliefs (different religions), so that no single belief can become a powerful monoculture of the kind that proves hopelessly seductive to maniacs.

How did the American founders manage to get that so right, so long ago?

It's important to acknowledge what one believes instead of pretending not to believe in anything, which is almost always a form of self-delusion. In that spirit, I'm ready to sign up for the beautiful new spirituality Kauffman is articulating, just so long as I'm not joined by absolutely everyone else, in which case I'd have to switch to believing in something else as a matter of survival for us all.

