Edge 194 — October 26, 2006
(9,900 words)



FESTIVAL DELLA SCIENZA 2006
Genoa, October 26 - November 7

It's that time of year and all roads lead to Genoa and the city-wide Festival della Scienza 2006, which opens today. Edge will be there once again, staging a panel discussion on "The Expanding Third Culture" (see essay below) with Seth Lloyd, Robert Trivers, and Gloria Origgi on Tuesday, October 31st, at 3:00 pm. Numerous other Edge contributors will also be present during the two-week festival.


Edge Panel at Festival della Scienza, Genoa
The Expanding Third Culture
Seth Lloyd, Gloria Origgi, Robert Trivers; moderator, John Brockman

Tuesday, October 31, 3:00pm
Palazzo Ducale - Sala del Maggior Consiglio
P.zza Matteotti 9
[click here]

[Download a pdf of the Festival della Scienza program]


THE EXPANDING THIRD CULTURE
By John Brockman

Just as science—that is, reliable methods for obtaining knowledge—has encroached on areas formerly considered to belong to the humanities (such as psychology), science is also encroaching on the social sciences, especially economics, geography, history, and political science. Not just the broad observation-based and statistical methods of the historical sciences but also detailed techniques of the conventional sciences (such as genetics and molecular biology and animal behavior) are proving essential for tackling problems in the social sciences. Science is the most accurate way of gaining knowledge about anything, whether it is the human spirit, the role of great men in history, or the structure of DNA. Humanities scholars and historians who spurn it condemn themselves to second-rate status and produce unreliable results.

[...more]


WHY THERE ALMOST CERTAINLY IS NO GOD
By Richard Dawkins

Either Jesus had a father or he didn't. The question is a scientific one, and scientific evidence, if any were available, would be used to settle it. The same is true of any miracle - and the deliberate and intentional creation of the universe would have to have been the mother and father of all miracles. Either it happened or it didn't. It is a fact, one way or the other, and in our state of uncertainty we can put a probability on it - an estimate that may change as more information comes in. Humanity's best estimate of the probability of divine creation dropped steeply in 1859 when The Origin of Species was published, and it has declined steadily during the subsequent decades, as evolution consolidated itself from plausible theory in the nineteenth century to established fact today.

The Chamberlain tactic of snuggling up to 'sensible' religion, in order to present a united front against ('intelligent design') creationists, is fine if your central concern is the battle for evolution. That is a valid central concern, and I salute those who press it, such as Eugenie Scott in Evolution versus Creationism. But if you are concerned with the stupendous scientific question of whether the universe was created by a supernatural intelligence or not, the lines are drawn completely differently. On this larger issue, fundamentalists are united with 'moderate' religion on one side, and I find myself on the other.

[...more]


THE UNIVERSE ON A STRING
By Brian Greene

...some have argued that if, after decades of research involving thousands of scientists, the theory is still a work in progress, it's time to give up. But to suggest dropping research on the most promising approach to unification because the work has failed to meet an arbitrary timetable for complete success is, well, silly.

I have worked on string theory for more than 20 years because I believe it provides the most powerful framework for constructing the long-sought unified theory. Nonetheless, should an inconsistency be found, or should future studies reveal an insuperable barrier to making contact with experimental data, or should new discoveries reveal a superior approach, I'd change my research focus, and I have little doubt that most string theorists would too.

But this hasn't happened.

[...more]




October 25, 2006

War & Peace
By Michael Shermer

Was Darwin's approach to science and religion healthy and logical? To answer that question I devised a three-tiered model of the relationship between science and religion.

1. CONFLICTING-WORLDS MODEL. This "warfare" model holds that science and religion are mutually exclusive ways of knowing, where one is right and the other is wrong. In this model, the findings of modern science are always a potential threat to one's faith and thus they must be carefully vetted against religious truths before acceptance; likewise, the tenets of religion are always a potential threat to science and thus they must be viewed skeptically.

2. SAME-WORLDS MODEL. More conciliatory in its nature, this position holds that science and religion are two ways of examining the same reality; as science progresses to a deeper understanding of the natural world it will reveal that many ancient religious tenets are true.

3. SEPARATE-WORLDS MODEL. On this tier science and religion are neither in conflict nor in agreement. Today it is the job of science to explain the natural world, making obsolete ancient religious sagas of origins and creation. Yet, religion thrives because it still serves a useful purpose as an institution for social cohesiveness and as a guide to finding personal meaning and spirituality.

[...continue]



October 18, 2006
PAGE ONE


Entrepreneur Puts Himself Up for Study In Genetic 'Tell-All'

Dr. Venter Wants to Be First To Have His DNA Mapped; Risk of Blindness Revealed


By ANTONIO REGALADO
Page A1

J. Craig Venter, a biologist and brash entrepreneur, started a recent day with a bowl of oatmeal and skim milk. Since he is genetically predisposed to heart disease, he added "just a little" brown sugar. By the end of the day, Dr. Venter was informed he's got a gene that quadruples his risk of going blind.

Life can be that way when you study your own DNA.

Dr. Venter, 60 years old, is best known for his role in the scientific fight to be the first to decipher the full sequence of the human genome, the billions of DNA letters, or chemical building blocks, that make up the average human's genetic code. In the late 1990s, he headed a private company, Celera Genomics, which tried to finish the task before the Human Genome Project, a public-sector effort paid for by the U.S. government and others. Both sides reached a negotiated "tie" announced by the White House in 2000.

After Dr. Venter was ousted by Celera, in a dispute over business strategy, he revealed a big secret. More than half the DNA decoded by Celera was his own. Now he heads up his own scientific center, the nonprofit J. Craig Venter Institute in Rockville, Md. One major activity over recent months: completing the decoding of Dr. Venter's genome.

[...subscription]



October 9, 2006

I’m a Celebrity, Get My Sequence!
By Kevin Davies

COMMENTARY | Two years ago, the X Prize Foundation awarded $10 million for the first sub-orbital spaceflight. Now, a new prize -- the Archon X Prize for Genomics -- has been formally established. The foundation will award a cool $10 million to the person or team that cracks the much-hyped “$1,000 Genome” threshold for affordable, personal DNA sequencing.

The prize, underwritten by a multi-million-dollar donation from Archon Minerals president Stewart Blusson, will go to the team that sequences the genomes of 100 people in 10 days, although unresolved for now is how complete those sequences should be. Will the bar be set at 90 percent, 99 percent, 99.9 percent, or what?...

...X Prize Foundation chairman and CEO Peter Diamandis said at a press conference that he wanted to make DNA relevant to people by finding "celebrities and leaders of industry willing to do this." Does this mean we can now expect a crush of celebrities lobbying to join the genome list? After all, who could resist the lure of their own personal genome, the ultimate 21st-century fashion accessory? Paris Hilton or Tom Cruise? David Beckham or Terrell Owens?

Diamandis says reassuringly that additional members of the Genome 100 will also include “ordinary people” – presumably he means paupers lacking multi-million dollar bank accounts -- with some chosen by medical charities such as the March of Dimes. They will join a select club of sequenced human genomes headed by Craig Venter, the former Celera chief who donated his own DNA during the initial genome assembly six years ago, and James Watson, who is having his DNA unraveled by 454.

Celebrity sequencing will attract a lot of publicity for the X Prize, but it risks trivializing the significance of genomic medicine. In only the rarest cases – such as certain forms of heart disease or cancer – will trawling through an individual sequence pinpoint flaws that underlie specific medical manifestations. The implications of personal genomics require a lot more public debate than they’ve been given so far.

There is one silver lining in the Genome 100, however – the urgency that it will lend the cause of genetic privacy. At Harvard Medical School, George Church has gone to great lengths to protect the anonymity of subjects volunteering for his personal genome project. By contrast, if Larry King finds his health insurance premiums soaring because of glitches in his sequence, he might have something to say about it. The latest effort to ban genetic discrimination passed the Senate unanimously, but remains tied up in the House of Representatives.

Celebrities have had a powerful influence in the halls of Congress in raising awareness of medical concerns such as breast cancer, AIDS, and stem cell research. Maybe the Genome 100 gimmick is just what proponents of genetic non-discrimination needed.

[...continued]



Oct 17, 2006

Stephen Colbert Interviews Richard Dawkins
Richard Dawkins, author of The God Delusion, argues that there is no God. He'll have an eternity in hell to prove it.

[...continue]



October 14, 2006

Entangled in the Matrix Net
DOROTHY WOODEND

YouTube is a conspiracy theorist's dream, as the many clips claiming the collapse of the World Trade Center was a setup attest. This democratization continues on Google Video (soon to swallow YouTube whole and complete its domination), which offers a number of feature documentaries, including one called The Net by German filmmaker Lutz Dammbeck. The Net recently screened at the Vancouver International Film Festival, but you can watch it free on the Web as many times as you would like.

This documentary explores the curious relationship between the development of the Internet and Ted Kaczynski (a.k.a. the Unabomber).

Mr. Dammbeck interviews several influential people, including John Brockman and Stewart Brand (old hippies turned founding members of the digerati); Robert Taylor, who helped to initiate the Arpanet (the precursor to the Internet); and the 90-year-old father of cybernetics, Heinz von Foerster, who offers up a few wry observations about the nature of reality itself.

Along the way, there are also traipses through Kurt Gödel's Incompleteness Theorem, the Macy Conferences, Theodor Adorno's Authoritarian Personality, the connection between the Massachusetts Institute of Technology and the military, Norbert Wiener and cybernetics, Henry A. Murray and the LSD experiments at Harvard, and crazy old Mr. Kaczynski with his terror of mind control and supercomputers.

Are you lost yet? I've watched the film a few times, and I'm still not quite sure what it all means, or if it means anything at all. Like the Internet itself, the bewildering density of information requires careful sorting.

But one idea does jump out. John Brockman paraphrases a quote from Doubt and Certainty in Science: A Biologist's Reflections on the Brain by J.Z. Young that states: "We create tools and then we mould ourselves through our use of them."

In the brave new world of Google Video, YouTube, MySpace, et al., what does this mean? If we create technology and then become what we have created, have we now succeeded in making Jackass World?...

...So, are you being controlled by an elite group of cyber-hippies and ex-CIA military types without even knowing it? Or, as Theodor Adorno believed, lulled into a state of passivity and pseudo-individualization by pop culture? Or are you part of what Marshall McLuhan heralded as the new dawn in which "we have extended our central nervous system itself in a global embrace, abolishing both space and time as far as our planet is concerned"?

[Ed. Note: See the trailer]

[...continue]



October 14, 2006

What I want for Christmas is...an anti-religion rant
By Ruth Gledhill, Religion Correspondent

A BOOK that rejects religion and argues for the non-existence of God is heading to be the No 1 bestseller for Christmas.

Richard Dawkins's The God Delusion is at the top of the bestseller chart of the online bookseller Amazon, and is climbing up The Times bestseller chart.

With Professor Dawkins about to travel to the US to publicise the book, sources in online sales say that his atheistic rant against all things religious is already trumping celebrity biographies and could take the top slot at the festival that celebrates the birth of the founder of Christianity.

Transworld, its publisher, has had to run several reprints since the book was published just over two weeks ago. More than 100,000 copies have now been printed, making it the year's top-selling science book.

An Oxford science professor, Dawkins, author of The Selfish Gene, uses The God Delusion to mount a bitter attack on religion in all its incarnations.

He argues that monotheism and polytheism are equally absurd and attempts to knock down the 13th-century "proofs" for the existence of God drawn up by Thomas Aquinas.

He attacks more modern concepts such as the "God of the gaps", condemns Creationism and blames religion itself rather than religious extremism for manifestations of fundamentalism, such as suicide bombers in Islam.

In the book he writes: "Some people have views of God that are so broad and flexible that it is inevitable that they will find God wherever they look for him. One hears it said that 'God is the ultimate' or 'God is our better nature' or 'God is the universe'.

"Of course, like any other word, the word 'God' can be given any meaning we like. If you want to say that 'God is energy', then you can find God in a lump of coal."

Rival science author Stephen Jones, Professor of Genetics at University College London, whose latest book The Single Helix is due to be published soon, said: "The polls tell us there could be 20 million Creationists in Britain.

"Twenty million people will not need a yule log this Christmas, they will be able to burn Dawkins's book instead. Personally, I do not care if they burn my own books, as long as they buy them first." ...

[...continue]



Just as science—that is, reliable methods for obtaining knowledge—has encroached on areas formerly considered to belong to the humanities (such as psychology), science is also encroaching on the social sciences, especially economics, geography, history, and political science. Not just the broad observation-based and statistical methods of the historical sciences but also detailed techniques of the conventional sciences (such as genetics and molecular biology and animal behavior) are proving essential for tackling problems in the social sciences. Science is the most accurate way of gaining knowledge about anything, whether it is the human spirit, the role of great men in history, or the structure of DNA. Humanities scholars and historians who spurn it condemn themselves to second-rate status and produce unreliable results.

THE EXPANDING THIRD CULTURE

By John Brockman

JOHN BROCKMAN is publisher and editor of Edge.

John Brockman's Edge Bio Page


THE EXPANDING THIRD CULTURE

Why does society benefit from an accurate representation of knowledge?

Many people, even many scientists, have a narrow view of science as controlled, replicated experiments performed in the laboratory—and as consisting quintessentially of physics, chemistry, and molecular biology.

The essence of science is conveyed by its Latin etymology: scientia, meaning knowledge. The scientific method is simply that body of practices best suited for obtaining reliable knowledge. The practices vary among fields: the controlled laboratory experiment is possible in molecular biology, physics, and chemistry, but it is either impossible, immoral, or illegal in many other fields customarily considered sciences, including all of the historical sciences: astronomy, epidemiology, evolutionary biology, most of the earth sciences, and paleontology. If the scientific method can be defined as those practices best suited for obtaining knowledge in a particular field, then science itself is simply the body of knowledge obtained by those practices.

Just as science—that is, reliable methods for obtaining knowledge—has encroached on areas formerly considered to belong to the humanities (such as psychology), science is also encroaching on the social sciences, especially economics, geography, history, and political science. Not just the broad observation-based and statistical methods of the historical sciences but also detailed techniques of the conventional sciences (such as genetics and molecular biology and animal behavior) are proving essential for tackling problems in the social sciences. Science is the most accurate way of gaining knowledge about anything, whether it is the human spirit, the role of great men in history, or the structure of DNA. Humanities scholars and historians who spurn it condemn themselves to second-rate status and produce unreliable results.

But this doesn't have to be the case. As I wrote in 1991 ("The Emerging Third Culture"):

The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.

There are encouraging signs that the third culture now includes scholars in the humanities who think the way scientists do. They believe that there is a real world and that their job is to understand it and explain it. They test their ideas in terms of logical coherence, explanatory power, and conformity with empirical facts. They do not defer to intellectual authorities: Anyone's ideas can be challenged, and understanding progresses and knowledge accumulates through such challenges. They are not reducing the humanities to biological and physical principles, but they do believe that art, literature, history, politics—a whole panoply of humanist concerns—need to take the sciences into account.

Connections do exist: our arts, our philosophies, our literature are the product of human minds interacting with one another, and the human mind is a product of the human brain, which is organized in part by the human genome and evolved by the physical processes of evolution. Like scientists, the science-based humanities scholars are intellectually eclectic, seeking ideas from a variety of sources and adopting the ones that prove their worth, rather than working within "systems" or "schools."

As such they are not Marxist scholars, or Freudian scholars, or Catholic scholars. They think like scientists, know science, and easily communicate with scientists; their principal difference from scientists is in the subject matter they write about, not their intellectual style. Science and science-based thinking among enlightened humanities scholars are now part of public culture.

And this is not a one-way street. Just as the science-based humanities scholars are learning from, and are influenced by science, scientists are gaining a broader understanding about the import of their own work through interactions with artists.

Something radically new is in the air: new ways of understanding physical systems, new ways of thinking about thinking that call into question many of our basic assumptions. A realistic biology of the mind, advances in physics, electricity, genetics, neurobiology, engineering, the chemistry of materials—all are challenging basic assumptions of who and what we are, of what it means to be human.

But evidently this information hasn't caught up to the editors at our most highly regarded newspapers and magazines. Rather than trusting scientists to review books by scientists, the best and the brightest at the elite publications often turn to literary critics. Confronted with ideas that upend the Freud, Marx, and modernism default, they pussyfoot around the challenge and the responsibility of presenting the public with an accurate representation of knowledge. Why learn about the human genome when you've already read Virginia Woolf? Why present informed articles and reviews to your readers when you can play the "isms" game, in which you can avoid intelligent discourse by the mere mention of useless terms such as "scientism" and "evolutionism"?

Not all intellectuals are of this frame of mind. One distinguished European novelist, who is also a publisher of literary novels and books by eminent scientists, threw up his hands as he exclaimed, "They don't know, they just don't know." To which might be added that a blissful state of ignorance is considered a credential in this world. Why else would reputable publications allow reviewers, ignorant in the sciences, to write about books by scientists?

What can we do about this situation? We can start by asking a question. In 1971, the artist James Lee Byars presented a conceptual piece entitled "The World Question Center", in which he suggested that to arrive at an axiology of the world's knowledge, it was not necessary to read the six million volumes in Harvard's Widener Library. His approach was to seek out the most complex and sophisticated minds, put them in a room together, and have them ask each other the questions they are asking themselves.

Here is my question, the question I am asking myself, a question we can ask each other:  

Why does society benefit from an accurate representation of knowledge?


Either Jesus had a father or he didn't. The question is a scientific one, and scientific evidence, if any were available, would be used to settle it. The same is true of any miracle — and the deliberate and intentional creation of the universe would have to have been the mother and father of all miracles. Either it happened or it didn't. It is a fact, one way or the other, and in our state of uncertainty we can put a probability on it — an estimate that may change as more information comes in. Humanity's best estimate of the probability of divine creation dropped steeply in 1859 when The Origin of Species was published, and it has declined steadily during the subsequent decades, as evolution consolidated itself from plausible theory in the nineteenth century to established fact today.

The Chamberlain tactic of snuggling up to 'sensible' religion, in order to present a united front against ('intelligent design') creationists, is fine if your central concern is the battle for evolution. That is a valid central concern, and I salute those who press it, such as Eugenie Scott in Evolution versus Creationism. But if you are concerned with the stupendous scientific question of whether the universe was created by a supernatural intelligence or not, the lines are drawn completely differently. On this larger issue, fundamentalists are united with 'moderate' religion on one side, and I find myself on the other.

WHY THERE ALMOST CERTAINLY IS NO GOD
By Richard Dawkins

RICHARD DAWKINS, an evolutionary biologist, is the Charles Simonyi Professor of the Public Understanding of Science at Oxford University. He is a Fellow of the Royal Society and the author of nine books, including The Selfish Gene, The Blind Watchmaker, The Ancestor's Tale, and the recently published The God Delusion. To coincide with publication of the book, his Foundation for Reason and Science has also launched a website (RichardDawkins.Net).

Richard Dawkins's Edge Bio Page


WHY THERE ALMOST CERTAINLY IS NO GOD

America, founded in secularism as a beacon of eighteenth century enlightenment, is becoming the victim of religious politics, a circumstance that would have horrified the Founding Fathers. The political ascendancy today values embryonic cells over adult people. It obsesses about gay marriage, ahead of genuinely important issues that actually make a difference to the world. It gains crucial electoral support from a religious constituency whose grip on reality is so tenuous that they expect to be 'raptured' up to heaven, leaving their clothes as empty as their minds. More extreme specimens actually long for a world war, which they identify as the 'Armageddon' that is to presage the Second Coming. Sam Harris, in his new short book, Letter to a Christian Nation, hits the bull's-eye as usual:

It is, therefore, not an exaggeration to say that if the city of New York were suddenly replaced by a ball of fire, some significant percentage of the American population would see a silver-lining in the subsequent mushroom cloud, as it would suggest to them that the best thing that is ever going to happen was about to happen: the return of Christ... Imagine the consequences if any significant component of the U.S. government actually believed that the world was about to end and that its ending would be glorious. The fact that nearly half of the American population apparently believes this, purely on the basis of religious dogma, should be considered a moral and intellectual emergency.

Does Bush check the Rapture Index daily, as Reagan did his stars? We don't know, but would anyone be surprised?

My scientific colleagues have additional reasons to declare emergency. Ignorant and absolutist attacks on stem cell research are just the tip of an iceberg. What we have here is nothing less than a global assault on rationality, and the Enlightenment values that inspired the founding of this first and greatest of secular republics. Science education — and hence the whole future of science in this country — is under threat. Temporarily beaten back in a Pennsylvania court, the 'breathtaking inanity' (Judge John Jones's immortal phrase) of 'intelligent design' continually flares up in local bush-fires. Dousing them is a time-consuming but important responsibility, and scientists are finally being jolted out of their complacency. For years they quietly got on with their science, lamentably underestimating the creationists who, being neither competent nor interested in science, attended to the serious political business of subverting local school boards. Scientists, and intellectuals generally, are now waking up to the threat from the American Taliban.

Scientists divide into two schools of thought over the best tactics with which to face the threat. The Neville Chamberlain 'appeasement' school focuses on the battle for evolution. Consequently, its members identify fundamentalism as the enemy, and they bend over backwards to appease 'moderate' or 'sensible' religion (not a difficult task, for bishops and theologians despise fundamentalists as much as scientists do). Scientists of the Winston Churchill school, by contrast, see the fight for evolution as only one battle in a larger war: a looming war between supernaturalism on the one side and rationality on the other. For them, bishops and theologians belong with creationists in the supernatural camp, and are not to be appeased.

The Chamberlain school accuses Churchillians of rocking the boat to the point of muddying the waters. The philosopher of science Michael Ruse wrote:

We who love science must realize that the enemy of our enemies is our friend. Too often evolutionists spend time insulting would-be allies. This is especially true of secular evolutionists. Atheists spend more time running down sympathetic Christians than they do countering creationists. When John Paul II wrote a letter endorsing Darwinism, Richard Dawkins's response was simply that the pope was a hypocrite, that he could not be genuine about science and that Dawkins himself simply preferred an honest fundamentalist.

A recent article in the New York Times by Cornelia Dean quotes the astronomer Owen Gingerich as saying that, by simultaneously advocating evolution and atheism, 'Dr Dawkins "probably single-handedly makes more converts to intelligent design than any of the leading intelligent design theorists".' This is not the first, not the second, not even the third time this plonkingly witless point has been made (and more than one reply has aptly cited Uncle Remus: "Oh please please Brer Fox, don't throw me in that awful briar patch").

Chamberlainites are apt to quote the late Stephen Jay Gould's 'NOMA' — 'non-overlapping magisteria'. Gould claimed that science and true religion never come into conflict because they exist in completely separate dimensions of discourse:

To say it for all my colleagues and for the umpteenth millionth time (from college bull sessions to learned treatises): science simply cannot (by its legitimate methods) adjudicate the issue of God's possible superintendence of nature. We neither affirm nor deny it; we simply can't comment on it as scientists.

This sounds terrific, right up until you give it a moment's thought. You then realize that the presence of a creative deity in the universe is clearly a scientific hypothesis. Indeed, it is hard to imagine a more momentous hypothesis in all of science. A universe with a god would be a completely different kind of universe from one without, and it would be a scientific difference. God could clinch the matter in his favour at any moment by staging a spectacular demonstration of his powers, one that would satisfy the exacting standards of science. Even the infamous Templeton Foundation recognized that God is a scientific hypothesis — by funding double-blind trials to test whether remote prayer would speed the recovery of heart patients. It didn't, of course, although a control group who knew they had been prayed for tended to get worse (how about a class-action suit against the Templeton Foundation?). Despite such well-financed efforts, no evidence for God's existence has yet appeared.

To see the disingenuous hypocrisy of religious people who embrace NOMA, imagine that forensic archeologists, by some unlikely set of circumstances, discovered DNA evidence demonstrating that Jesus was born of a virgin mother and had no father. If NOMA enthusiasts were sincere, they should dismiss the archeologists' DNA out of hand: "Irrelevant. Scientific evidence has no bearing on theological questions. Wrong magisterium." Does anyone seriously imagine that they would say anything remotely like that? You can bet your boots that not just the fundamentalists but every professor of theology and every bishop in the land would trumpet the archeological evidence to the skies.

Either Jesus had a father or he didn't. The question is a scientific one, and scientific evidence, if any were available, would be used to settle it. The same is true of any miracle — and the deliberate and intentional creation of the universe would have to have been the mother and father of all miracles. Either it happened or it didn't. It is a fact, one way or the other, and in our state of uncertainty we can put a probability on it — an estimate that may change as more information comes in. Humanity's best estimate of the probability of divine creation dropped steeply in 1859 when The Origin of Species was published, and it has declined steadily during the subsequent decades, as evolution consolidated itself from plausible theory in the nineteenth century to established fact today.

The Chamberlain tactic of snuggling up to 'sensible' religion, in order to present a united front against ('intelligent design') creationists, is fine if your central concern is the battle for evolution. That is a valid central concern, and I salute those who press it, such as Eugenie Scott in Evolution versus Creationism. But if you are concerned with the stupendous scientific question of whether the universe was created by a supernatural intelligence or not, the lines are drawn completely differently. On this larger issue, fundamentalists are united with 'moderate' religion on one side, and I find myself on the other.

Of course, this all presupposes that the God we are talking about is a personal intelligence such as Yahweh, Allah, Baal, Wotan, Zeus or Lord Krishna. If, by 'God', you mean love, nature, goodness, the universe, the laws of physics, the spirit of humanity, or Planck's constant, none of the above applies. An American student asked her professor whether he had a view about me. 'Sure,' he replied. 'He's positive science is incompatible with religion, but he waxes ecstatic about nature and the universe. To me, that is religion!' Well, if that's what you choose to mean by religion, fine, that makes me a religious man. But if your God is a being who designs universes, listens to prayers, forgives sins, wreaks miracles, reads your thoughts, cares about your welfare and raises you from the dead, you are unlikely to be satisfied. As the distinguished American physicist Steven Weinberg said, "If you want to say that 'God is energy,' then you can find God in a lump of coal." But don't expect congregations to flock to your church.

When Einstein said 'Did God have a choice in creating the Universe?' he meant 'Could the universe have begun in more than one way?' 'God does not play dice' was Einstein's poetic way of doubting Heisenberg's indeterminacy principle. Einstein was famously irritated when theists misunderstood him to mean a personal God. But what did he expect? The hunger to misunderstand should have been palpable to him. 'Religious' physicists usually turn out to be so only in the Einsteinian sense: they are atheists of a poetic disposition. So am I. But, given the widespread yearning for that great misunderstanding, deliberately to confuse Einsteinian pantheism with supernatural religion is an act of intellectual high treason.

Accepting, then, that the God Hypothesis is a proper scientific hypothesis whose truth or falsehood is hidden from us only by lack of evidence, what should be our best estimate of the probability that God exists, given the evidence now available? Pretty low I think, and here's why.

First, most of the traditional arguments for God's existence, from Aquinas on, are easily demolished. Several of them, such as the First Cause argument, work by setting up an infinite regress which God is wheeled out to terminate. But we are never told why God is magically able to terminate regresses while needing no explanation himself. To be sure, we do need some kind of explanation for the origin of all things. Physicists and cosmologists are hard at work on the problem. But whatever the answer — a random quantum fluctuation or a Hawking/Penrose singularity or whatever we end up calling it — it will be simple. Complex, statistically improbable things, by definition, don't just happen; they demand an explanation in their own right. They are impotent to terminate regresses, in a way that simple things are not. The first cause cannot have been an intelligence — let alone an intelligence that answers prayers and enjoys being worshipped. Intelligent, creative, complex, statistically improbable things come late into the universe, as the product of evolution or some other process of gradual escalation from simple beginnings. They come late into the universe and therefore cannot be responsible for designing it.

Another of Aquinas' efforts, the Argument from Degree, is worth spelling out, for it epitomises the characteristic flabbiness of theological reasoning. We notice degrees of, say, goodness or temperature, and we measure them, Aquinas said, by reference to a maximum:

Now the maximum in any genus is the cause of all in that genus, as fire, which is the maximum of heat, is the cause of all hot things . . . Therefore, there must also be something which is to all beings the cause of their being, goodness, and every other perfection; and this we call God.

That's an argument? You might as well say that people vary in smelliness but we can make the judgment only by reference to a perfect maximum of conceivable smelliness. Therefore there must exist a pre-eminently peerless stinker, and we call him God. Or substitute any dimension of comparison you like, and derive an equivalently fatuous conclusion. That's theology.

The only one of the traditional arguments for God that is widely used today is the teleological argument, sometimes called the Argument from Design although — since the name begs the question of its validity — it should better be called the Argument for Design. It is the familiar 'watchmaker' argument, which is surely one of the most superficially plausible bad arguments ever discovered — and it is rediscovered by just about everybody until they are taught the logical fallacy and Darwin's brilliant alternative.

In the familiar world of human artifacts, complicated things that look designed are designed. To naïve observers, it seems to follow that similarly complicated things in the natural world that look designed — things like eyes and hearts — are designed too. It isn't just an argument by analogy. There is a semblance of statistical reasoning here too — fallacious, but carrying an illusion of plausibility. If you randomly scramble the fragments of an eye or a leg or a heart a million times, you'd be lucky to hit even one combination that could see, walk or pump. This demonstrates that such devices could not have been put together by chance. And of course, no sensible scientist ever said they could. Lamentably, the scientific education of most British and American students omits all mention of Darwinism, and therefore the only alternative to chance that most people can imagine is design.

Even before Darwin's time, the illogicality was glaring: how could it ever have been a good idea to postulate, in explanation for the existence of improbable things, a designer who would have to be even more improbable? The entire argument is a logical non-starter, as David Hume realized before Darwin was born. What Hume didn't know was the supremely elegant alternative to both chance and design that Darwin was to give us. Natural selection is so stunningly powerful and elegant, it not only explains the whole of life, it raises our consciousness and boosts our confidence in science's future ability to explain everything else.

Natural selection is not just an alternative to chance. It is the only ultimate alternative ever suggested. Design is a workable explanation for organized complexity only in the short term. It is not an ultimate explanation, because designers themselves demand an explanation. If, as Francis Crick and Leslie Orgel once playfully speculated, life on this planet was deliberately seeded by a payload of bacteria in the nose cone of a rocket, we still need an explanation for the intelligent aliens who dispatched the rocket. Ultimately they must have evolved by gradual degrees from simpler beginnings. Only evolution, or some kind of gradualistic 'crane' (to use Daniel Dennett's neat term), is capable of terminating the regress. Natural selection is an anti-chance process, which gradually builds up complexity, step by tiny step. The end product of this ratcheting process is an eye, or a heart, or a brain — a device whose improbable complexity is utterly baffling until you spot the gentle ramp that leads up to it.

Whether or not my conjecture is right that evolution is the only explanation for life in the universe, there is no doubt that it is the explanation for life on this planet. Evolution is a fact, and it is among the more secure facts known to science. But it had to get started somehow. Natural selection cannot work its wonders until certain minimal conditions are in place, of which the most important is an accurate system of replication — DNA, or something that works like DNA.

The origin of life on this planet — which means the origin of the first self-replicating molecule — is hard to study, because it (probably) only happened once, 4 billion years ago and under very different conditions from those with which we are familiar. We may never know how it happened. Unlike the ordinary evolutionary events that followed, it must have been a genuinely very improbable — in the sense of unpredictable — event: too improbable, perhaps, for chemists to reproduce it in the laboratory or even devise a plausible theory for what happened. This weirdly paradoxical conclusion — that a chemical account of the origin of life, in order to be plausible, has to be implausible — would follow if it were the case that life is extremely rare in the universe. And indeed we have never encountered any hint of extraterrestrial life, not even by radio — the circumstance that prompted Enrico Fermi's cry: "Where is everybody?"

Suppose life's origin on a planet took place through a hugely improbable stroke of luck, so improbable that it happens on only one in a billion planets. The National Science Foundation would laugh at any chemist whose proposed research had only a one in a hundred chance of succeeding, let alone one in a billion. Yet, given that there are at least a billion billion planets in the universe, even such absurdly low odds as these will yield life on a billion planets. And — this is where the famous anthropic principle comes in — Earth has to be one of them, because here we are.

If you set out in a spaceship to find the one planet in the galaxy that has life, the odds against your finding it would be so great that the task would be indistinguishable, in practice, from impossible. But if you are alive (as you manifestly are if you are about to step into a spaceship) you needn't bother to go looking for that one planet because, by definition, you are already standing on it. The anthropic principle really is rather elegant. By the way, I don't actually think the origin of life was as improbable as all that. I think the galaxy has plenty of islands of life dotted about, even if the islands are too spaced out for any one to hope for a meeting with any other. My point is only that, given the number of planets in the universe, the origin of life could in theory be as lucky as a blindfolded golfer scoring a hole in one. The beauty of the anthropic principle is that, even in the teeth of such stupefying odds against, it still gives us a perfectly satisfying explanation for life's presence on our own planet.

The anthropic principle is usually applied not to planets but to universes. Physicists have suggested that the laws and constants of physics are too good — as if the universe were set up to favour our eventual evolution. It is as though there were, say, half a dozen dials representing the major constants of physics. Each of the dials could in principle be tuned to any of a wide range of values. Almost all of these knob-twiddlings would yield a universe in which life would be impossible. Some universes would fizzle out within the first picosecond. Others would contain no elements heavier than hydrogen and helium. In yet others, matter would never condense into stars (and you need stars in order to forge the elements of chemistry and hence life). You can estimate the very low odds against the six knobs all just happening to be correctly tuned, and conclude that a divine knob-twiddler must have been at work. But, as we have already seen, that explanation is vacuous because it begs the biggest question of all. The divine knob twiddler would himself have to have been at least as improbable as the settings of his knobs.

Again, the anthropic principle delivers its devastatingly neat solution. Physicists already have reason to suspect that our universe — everything we can see — is only one universe among perhaps billions. Some theorists postulate a multiverse of foam, where the universe we know is just one bubble. Each bubble has its own laws and constants. Our familiar laws of physics are parochial bylaws. Of all the universes in the foam, only a minority has what it takes to generate life. And, with anthropic hindsight, we obviously have to be sitting in a member of that minority, because, well, here we are, aren't we? As physicists have said, it is no accident that we see stars in our sky, for a universe without stars would also lack the chemical elements necessary for life. There may be universes whose skies have no stars: but they also have no inhabitants to notice the lack. Similarly, it is no accident that we see a rich diversity of living species: for an evolutionary process that is capable of yielding a species that can see things and reflect on them cannot help producing lots of other species at the same time. The reflective species must be surrounded by an ecosystem, as it must be surrounded by stars.

The anthropic principle entitles us to postulate a massive dose of luck in accounting for the existence of life on our planet. But there are limits. We are allowed one stroke of luck for the origin of evolution, and perhaps for a couple of other unique events like the origin of the eukaryotic cell and the origin of consciousness. But that's the end of our entitlement to large-scale luck. We emphatically cannot invoke major strokes of luck to account for the illusion of design that glows from each of the billion species of living creature that have ever lived on Earth. The evolution of life is a general and continuing process, producing essentially the same result in all species, however different the details.

Contrary to what is sometimes alleged, evolution is a predictive science. If you pick any hitherto unstudied species and subject it to minute scrutiny, any evolutionist will confidently predict that each individual will be observed to do everything in its power, in the particular way of the species — plant, herbivore, carnivore, nectivore or whatever it is — to survive and propagate the DNA that rides inside it. We won't be around long enough to test the prediction but we can say, with great confidence, that if a comet strikes Earth and wipes out the mammals, a new fauna will rise to fill their shoes, just as the mammals filled those of the dinosaurs 65 million years ago. And the range of parts played by the new cast of life's drama will be similar in broad outline, though not in detail, to the roles played by the mammals, and the dinosaurs before them, and the mammal-like reptiles before the dinosaurs. The same rules are predictably being followed, in millions of species all over the globe, and for hundreds of millions of years. Such a general observation requires an entirely different explanatory principle from the anthropic principle that explains one-off events like the origin of life, or the origin of the universe, by luck. That entirely different principle is natural selection.

We explain our existence by a combination of the anthropic principle and Darwin's principle of natural selection. That combination provides a complete and deeply satisfying explanation for everything that we see and know. Not only is the god hypothesis unnecessary. It is spectacularly unparsimonious. Not only do we need no God to explain the universe and life. God stands out in the universe as the most glaring of all superfluous sore thumbs. We cannot, of course, disprove God, just as we can't disprove Thor, fairies, leprechauns and the Flying Spaghetti Monster. But, like those other fantasies that we can't disprove, we can say that God is very very improbable.

[First published by Huffington Post, October 23, 2006.]


...some have argued that if, after decades of research involving thousands of scientists, the theory is still a work in progress, it's time to give up. But to suggest dropping research on the most promising approach to unification because the work has failed to meet an arbitrary timetable for complete success is, well, silly.

I have worked on string theory for more than 20 years because I believe it provides the most powerful framework for constructing the long-sought unified theory. Nonetheless, should an inconsistency be found, or should future studies reveal an insuperable barrier to making contact with experimental data, or should new discoveries reveal a superior approach, I'd change my research focus, and I have little doubt that most string theorists would too.

But this hasn't happened.

THE UNIVERSE ON A STRING
By Brian Greene

BRIAN GREENE, a professor of physics and mathematics at Columbia, is the author of The Elegant Universe and The Fabric of the Cosmos.

Brian Greene's Edge Bio Page


THE UNIVERSE ON A STRING

Seventy-five years ago this month, The New York Times reported that Albert Einstein had completed his unified field theory — a theory that promised to stitch all of nature's forces into a single, tightly woven mathematical tapestry. But as had happened before and would happen again, closer scrutiny revealed flaws that sent Einstein back to the drawing board. Nevertheless, Einstein's belief that he'd one day complete the unified theory rarely faltered. Even on his deathbed he scribbled equations in the desperate but fading hope that the theory would finally materialize. It didn't.

In the decades since, the urgency of finding a unified theory has only increased. Scientists have realized that without such a theory, critical questions can't be addressed, such as how the universe began or what lies at the heart of a black hole. These unresolved issues have inspired much progress, with the most recent advances coming from an approach called string theory. Lately, however, string theory has come in for considerable criticism. And so, this is an auspicious moment to reflect on the state of the art.

First, some context. For nearly 300 years, science has been on a path of consolidation. In the 17th century, Isaac Newton discovered laws of motion that apply equally to a planet moving through space and to an apple falling earthward, revealing that the physics of the heavens and the earth are one. Two hundred years later, Michael Faraday and James Clerk Maxwell showed that electric currents produce magnetic fields, and moving magnets can produce electric currents, establishing that these two forces are as united as Midas' touch and gold. And in the 20th century, Einstein's work proved that space, time and gravity are so entwined that you can't speak sensibly about one without the others.

This striking pattern of convergence, linking concepts once thought unrelated, inspired Einstein to dream of the next and possibly final move: merging gravity and electromagnetism into a single, overarching theory of nature's forces.

In hindsight, there was almost no way he could have succeeded. He was barely aware that there were two other forces he was neglecting — the strong and weak forces acting within atomic nuclei. Furthermore, he willfully ignored quantum mechanics, the new theory of the microworld that was receiving voluminous experimental support, but whose probabilistic framework struck him as deeply misguided. Einstein stayed the course, but by his final years he had drifted to the fringe of a subject he had once dominated.

After Einstein's death, the torch of unification passed to other hands. In the 1960's, the Nobel Prize-winning works of Sheldon Glashow, Abdus Salam and Steven Weinberg revealed that at high energies, the electromagnetic and weak nuclear forces seamlessly combine, much as heating a cold vat of chicken soup causes the floating layer of fat to combine with the liquid below, yielding a homogeneous broth. Subsequent work argued that at even higher energies the strong nuclear force would also meld into the soup, a proposed consolidation that has yet to be confirmed experimentally, but which has convinced many physicists that there is no fundamental obstacle to unifying three of nature's four forces.

For decades, however, the force of gravity stubbornly resisted joining the fold. The problem was the very one that so troubled Einstein: the disjunction between his own general relativity, most relevant for extremely massive objects like stars and galaxies, and quantum mechanics, the framework invoked by physics to deal with exceptionally small objects like molecules and atoms and their constituents.

Time and again, attempts to merge the two theories resulted in ill-defined mathematics, much like what happens on a calculator if you try to divide one by zero. The display will flash an error message, reprimanding you for misusing mathematics. The combined equations of general relativity and quantum mechanics yield similar problems. While the conflict rears its head only in environments that are both extremely massive and exceptionally tiny — black holes and the Big Bang being two primary examples — it tells of a fissure in the very foundations of physics.

Such was the case until the mid-1980's, when a new approach, string theory, burst onto the stage. Difficult and complex calculations by the physicists John Schwarz and Michael Green, who had been toiling for years in scientific obscurity, gave compelling evidence that this new approach not only unified gravity and quantum mechanics, as well as nature's other forces, but did so while sweeping aside previous mathematical problems. As word of the breakthrough spread, many physicists dropped what they were working on and joined a global effort to realize Einstein's unified vision of the cosmos.

String theory offers a new perspective on matter's fundamental constituents. Once viewed as point-like dots of virtually no size, particles in string theory are minuscule, vibrating, string-like filaments. And much as different vibrations of a violin string produce different musical notes, different vibrations of the theory's strings produce different kinds of particles. An electron is a tiny string vibrating in one pattern, a quark is a string vibrating in a different pattern. Particles like the photon that convey nature's forces in the quantum realm are strings vibrating in yet other patterns.
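The violin analogy rests on ordinary string acoustics: a stretched string supports a discrete family of standing-wave modes, f_n = n·v/(2L), and each mode is a different note. This sketch computes those modes for a real musical string (the length and wave speed are made-up illustrative values; this is classical acoustics, not string theory itself):

```python
# Standing-wave frequencies of an ordinary stretched string:
#   f_n = n * v / (2 * L)   for mode number n = 1, 2, 3, ...
# One string, many distinct vibration patterns, many distinct notes --
# the analogy the text draws on for how one kind of fundamental string
# can yield many kinds of particle. Length and wave speed below are
# hypothetical illustrative values.
L = 0.33      # string length in metres (hypothetical)
v = 290.4     # wave speed on the string in m/s (hypothetical)

modes = [n * v / (2 * L) for n in range(1, 5)]
# fundamental is about 440 Hz (concert A); overtones sit at 2x, 3x, 4x
print(modes)
```

In the analogy, swapping which mode is excited changes the note without changing the string — just as, in the theory, different vibration patterns of one kind of filament correspond to the electron, the quark, or the photon.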

Crucially, the early pioneers of string theory realized that one such vibration would produce the gravitational force, demonstrating that string theory embraces both gravity and quantum mechanics. In sharp contrast to previous proposals that cobbled gravity and quantum mechanics uneasily together, their unity here emerges from the theory itself.

While accessibility demands that I describe these developments using familiar words, beneath them lies a bedrock of rigorous analysis. We now have more than 20 years of painstaking research, filling tens of thousands of published pages of calculations, which attest to string theory's deep mathematical coherence. These calculations have given the theory countless opportunities to suffer the fate of previous proposals, but the fact is that every calculation that has ever been completed within string theory is free from mathematical contradictions.

Moreover, these works have also shown that many of the prized breakthroughs in fundamental physics, discovered over the past two centuries through arduous research using a wide range of approaches, can be found within string theory. It's as if one composer, working in isolation, produced the greatest hits of Beethoven, Count Basie and the Beatles. When you also consider that string theory has opened new areas of mathematical research, you can easily understand why it's captured the attention of so many leading scientists and mathematicians.

Nevertheless, mathematical rigor and elegance are not sufficient to demonstrate a theory's relevance. To be judged a correct description of the universe, a theory must make predictions that are confirmed by experiment. And as a small but vocal group of critics of string theory justly emphasize, string theory has yet to do so. This is a key point, so it's worth serious scrutiny.

We understand string theory much better now than we did 20 years ago. We've developed powerful techniques of mathematical analysis that have improved the accuracy of its calculations and provided invaluable insights into the theory's logical structure. Even so, researchers worldwide are still working toward an exact and tractable formulation of the theory's equations. And without that final formulation in hand, the kind of detailed, definitive predictions that would subject the theory to comprehensive experimental vetting remain beyond our reach.

There are, however, features of the theory that may be open to examination even with our incomplete understanding. We may be able to test the theory's predictions of particular new particle species, of dimensions of space beyond the three we can directly see, and even its prediction that microscopic black holes may be produced through highly energetic particle collisions. Without the exact equations, our ability to describe these attributes with precision is limited, but the theory gives enough direction for the Large Hadron Collider, a gigantic particle accelerator now being built in Geneva and scheduled to begin full operation in 2008, to search for supporting evidence by the end of the decade.

Research has also revealed a possibility that signatures of string theory are imprinted in the radiation left over from the Big Bang, as well as in gravitational waves rippling through space-time's fabric. In the coming years, a variety of experiments will seek such evidence with unprecedented observational fidelity. And in a recent, particularly intriguing development, data now emerging from the Relativistic Heavy Ion Collider at the Brookhaven National Laboratory appear to be more accurately described using string theory methods than with more traditional approaches.

To be sure, no single successful experiment would establish that string theory is right, but neither would the failure of all such experiments prove the theory wrong. If the accelerator experiments fail to turn up anything, it could be that we need more powerful machines; if the astronomical observations fail to turn up anything, it could mean the effects are too small to be seen. The bottom line is that it's hard to test a theory that not only taxes the capacity of today's technology, but is also still very much under development.

Some critics have taken this lack of definitive predictions to mean that string theory is a protean concept whose advocates seek to step outside the established scientific method. Nothing could be further from the truth. Certainly, we are feeling our way through a complex mathematical terrain, and no doubt have much ground yet to cover. But we will hold string theory to the usual scientific standard: to be accepted, it must make predictions that are verified.

Other detractors have seized on recent work suggesting that one of string theory's goals beyond unification of the forces — to provide an explanation for the values of nature's constants, like the mass of the electron and the strength of gravity — may be unreachable (because the theory may be compatible with those constants having a range of values). But even if this were to prove true, realizing Einstein's unified vision would surely be prize enough.

Finally, some have argued that if, after decades of research involving thousands of scientists, the theory is still a work in progress, it's time to give up. But to suggest dropping research on the most promising approach to unification because the work has failed to meet an arbitrary timetable for complete success is, well, silly.

I have worked on string theory for more than 20 years because I believe it provides the most powerful framework for constructing the long-sought unified theory. Nonetheless, should an inconsistency be found, or should future studies reveal an insuperable barrier to making contact with experimental data, or should new discoveries reveal a superior approach, I'd change my research focus, and I have little doubt that most string theorists would too.

But this hasn't happened.

String theory continues to offer profound breadth and enormous potential. It has the capacity to complete the Einsteinian revolution and could very well be the concluding chapter in our species' age-old quest to understand the deepest workings of the cosmos.

Will we ever reach that goal? I don't know. But that's both the wonder and the angst of a life in science. Exploring the unknown requires tolerating uncertainty.

[First published as an Op-Ed piece in The New York Times, October 20, 2006.]
