Edge 230 — December 6, 2007
(11,130 words)


J. Craig Venter

Lawrence M. Krauss


Jonathan Haidt
replies to David Sloan Wilson, Michael Shermer, Sam Harris, PZ Myers, Marc D. Hauser

Alan Sokal
on "Taking Science On Faith" By Paul Davies

Paul Davies
replies to Jerry Coyne, Nathan Myhrvold, Lawrence Krauss, Scott Atran,
Sean Carroll, Jeremy Bernstein, PZ Myers, Lee Smolin, John Horgan, Alan Sokal


Stocking-fillers: A seasonal run on the ideas bank


The 32nd Richard Dimbleby Lecture
Delivered by J. Craig Venter, BBC One, December 4, 2007

One of the principal scientists who decoded the human genome is about to create the first artificial life form on Earth. So what does the future hold in A DNA-Driven World?

J. CRAIG VENTER, a geneticist, is Founder and President of the J. Craig Venter Institute and the J. Craig Venter Science Foundation. He is the author of A Life Decoded.



Thank you for the kind introduction. It is a great honor to be presenting the 2007 Dimbleby Lecture as only the third American, and one of just a handful of scientists, among the 32 Dimbleby lecturers.

I have called this lecture 'A DNA-Driven World', because I believe that the future of our society relies at least in part on our understanding of biology and the molecules of life - DNA.  Every era is defined by its technologies.  The last century could be termed the nuclear age, and I propose that the century ahead will be fundamentally shaped by advances in biology and my field of genomics, which is the study of the complete genetic make-up of a species.

Our planet is facing almost insurmountable problems, problems that governments on their own clearly can't fix.  In order to survive, we need a scientifically literate society willing and able to embrace change - because our ability to provide life's essentials of food, water, shelter and energy for an expanding human population will require major advances in science and technology. 

In this lecture I will argue that the future of life depends not only on our ability to understand and use DNA, but also, perhaps, on creating new synthetic life forms: life forged not by Darwinian evolution but created by human intelligence.

To some this may be troubling, but part of the problem we face with scientific advancement is fear of the unknown, a fear that often leads to rejection.

Science is a topic that can cause people to turn off their brains. I contend that science has failed to excite more people for at least two reasons: it is frequently taught poorly, often as rote memorization of complex facts and data, and it is antithetical to the visceral, sense-driven way we live and interact with our world.

As a young student I was very turned off by the forced memorization of seemingly trivial facts, which came, I felt, at the expense of true understanding. Instead I was much more interested in discovering and living in my world: I caught frogs and snakes, built boats and explored my surroundings.

In the past, science and the world used to seem easier to understand when discovery was based directly on our human senses.  For example, when Darwin visited the Galapagos on his epic voyage he was able to see with his own eyes the flightless cormorants, the giant tortoises and the swimming and diving iguanas. From this sensory experience, he was able then to relate what he saw in the Galapagos to his other observations and develop a new context for understanding life by proposing the theory of evolution.

When Galileo developed the telescope, the wonders of the skies were truly illuminated for humans by expanding the capabilities of our visual system. Scientists have continued to extend our vision to glimpse distant galaxies too faint for our naked eyes to detect even as stars. Microscopes have helped us see further into the inner world of biology, first to cells and then to molecules, all advances taking us well beyond our own physiological capabilities.

Our abilities to see, hear, smell, taste, and feel the world around us are wonderful evolutionary developments on which we base our daily lives. We can recognize and respond to the minor facial differences among the 6.5 billion of us on Earth, and also to minute changes in facial expression indicating astonishment, pleasure, fear, love, and hate. We devote a substantial amount of modern human existence, and of our economy, to appealing to our love of visual and audio stimulation.

In addition to our obvious senses, we have other remarkable capabilities that most of us are not aware of but that affect our lives from minute to minute. For example, while we cannot see, taste or feel carbon dioxide, we are extraordinarily sensitive to minute changes in CO2 concentrations in our bodies. It is carbon dioxide, not oxygen, that controls our breathing.

But as science has advanced, it has gone far beyond the immediately sensed world. It is now a world filled with dark matter in space, x-rays, gamma-rays, ultraviolet light, DNA, genes, chromosomes, and bacteria that live in and around us in staggering numbers. We can't detect these directly, yet we feel the consequences of all of them. We are also now bombarded by information on wars, acts of terror, climate change and global warming, devastating storms, fuel shortages, emerging infections, flu pandemics, HIV, stem cells, animal cloning, genetically modified plants, and now the possibility of synthetic life forms, all while trying to cope with the complexities of our daily lives. It is no great surprise, then, that there is a global resurgence of fundamentalism, a desire to get back to what appeared to be a simpler time, when our primary senses and simple rules appeared to determine our life outcomes.

But I believe such a view is both simplistic and dangerous because it avoids the issues we need to face.

Our planet is in crisis, and we need to mobilize all of our intellectual forces to save it. One solution could lie in building a scientifically literate society in order to survive.

While we share most of our senses with the rest of the animal world, we have one uniquely exciting evolutionary development: our brain. It provides us the ability to think, to reason, to predict and to ponder the future. It enables us to ask questions, and it gives us the extraordinary capability to take over our own evolution by building complex tools that extend human capabilities millions of times further than even another billion years of evolution would.

To begin the process of change we need to start with our children, teaching them, in place of memorization, to explore, challenge, and solve problems in an attempt to understand the world around them, and most especially the world they cannot "see" or feel directly. Perhaps we can also start by changing the way we teach science in our schools.

Many studies continue to relay sobering facts about the state of science and math education in both the United States and the United Kingdom. A recent study compared the math and science scores of 12- and 13-year-olds in each US state with their counterparts in both the developed and developing world. While it conveyed some good news, namely that the US and UK are doing better than in previous years, it still showed that, compared with countries such as Singapore, Taiwan, Japan and China, even the best US states and England lag behind. The good news for England, however, is that you have outperformed the US in science scores. This might be due in part to the fact that half of all US citizens believe that humans coexisted with dinosaurs, that 25% don't know the Earth revolves around the sun, and that 58% cannot calculate a 10% tip on a restaurant bill. With this poor state of basic knowledge, how can we hope to survive the ever-growing complexities of modern life?

This lack of knowledge is only part of the issue. In the US only 16% of all postsecondary degrees are in math, science, or engineering, compared with 52% for China. And unfortunately those numbers are not all that different in the UK. If science and engineering are not national and global priorities, how can we expect to cope with the complexities ahead, or compete with nations that do value science?

So what can we do to change this situation?  One solution could be new teaching methods aimed at exciting students about discovery. 

I did not get excited about science until after I was drafted into the military during the Vietnam War and ended up in the medical corps. It was only there in the chaos of war that I learned firsthand that knowledge had real life and death consequences. While I went on to pursue a career in science after serving in Vietnam, I wish that my interest in science had been stimulated much sooner. We now know that to motivate students, particularly girls, for careers in  science we need to capture their attention early.

At the Venter Institute we have developed a mobile genomics laboratory that brings the science of genomics to 12- and 13-year-olds, exposing them to scientific problem solving and the excitement of science. We started out with the simple idea of outfitting a large bus as a research laboratory, and then, working with schools, we developed learning modules taught by enthusiastic, hands-on teachers. The results have been overwhelming. While many were at first skeptical of the program, because it was new and different from the standard lesson plans, we now have a waiting list of participants and constant calls and emails from parents and teachers who want the bus to come to their schools.

I think this program succeeds because in each lesson plan we convey the wonderment of discovery and problem solving. For example, one lesson involves solving a crime scene investigation using DNA analysis, much as is done in the popular TV program CSI. Had I been exposed to science in this real-world manner, I might have had a much better educational experience and forged a stronger interest in science at an earlier stage.

There are also science-intensive schools that are trying alternative teaching methods. One such school in Virginia is teaching students to be more like scientists: using inquiry-based learning and encouraging them to do experiments they design themselves rather than age-old textbook experiments and lessons heavy on memorization. These students are learning what I learned on my own while doing research as an advanced university student: that there is no greater intellectual joy than asking seemingly simple questions about life, designing an experiment to find answers, and uncovering something never before known. We need generations of children who are grounded in reality and who learn evidence-based decision making as a life-long philosophy. Teaching science as evidence-based decision making could have a profound impact on the pace of future discoveries and inventions. Simply asking what evidence lies behind any claim is a marked contrast to approaching life on a purely faith-based system.

Fostering such scientific literacy is crucial, because we and our planet are facing problems that, I believe, can only be solved by scientific advancement.

There are those who like to believe that the future of life on Earth will continue as it has in the past, but unfortunately for humanity, the natural world around us does not care what we believe. But believing that we can do something to change our situation using our knowledge can very much affect the environment in which we live.

Perhaps an even greater problem than scientific illiteracy is that almost every aspect of our modern society is geared toward dealing with problems only after they have occurred, rather than focusing on prevention. We have a visceral response to tragedies, to wars, floods, disease, and famine, because we can see the problem and see the need to correct it. A much more difficult approach for societies is to use our intellectual capacity to prevent problems: to avert wars through diplomacy rather than invasion, to repair infrastructure before bridges and dams fail, or to prevent diseases by changing our diet.

Medicine and health care are areas that desperately need to move toward a preventive philosophy.  We need to understand that it is far more cost effective, with better life outcomes to prevent diseases rather than treat them after they occur.

The cost of health care is one of our fastest growing expenses. In 2005 total US health expenditures rose 6.9 percent, twice the rate of inflation, and total spending was a staggering $2 trillion. US health care spending is expected to increase at similar rates for the next decade, reaching $4 trillion in 2015. That is 20 percent of GDP. But all this money does not seem to guarantee the highest quality health care. The World Health Organization in 2000 ranked the US health care system 1st by expenditure but only 72nd on health. In contrast, the UK was 26th by total expenditure and 24th on health.
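As a back-of-the-envelope check, compounding that 6.9 percent annual growth rate for a decade does indeed roughly double the bill; the figures are the ones above, and the arithmetic is only an illustrative sketch:

```python
# Rough check: does ~6.9% annual growth take $2 trillion (2005)
# to about $4 trillion by 2015? Figures are from the lecture.
spending_2005 = 2.0    # trillions of US dollars
annual_growth = 0.069  # 6.9% per year

spending_2015 = spending_2005 * (1 + annual_growth) ** 10
print(round(spending_2015, 2))  # ~3.9, close to the $4 trillion projection
```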

If we take a look at the cost burden of just one disease, diabetes, the figures are astounding. Diabetes is a disease that, when poorly managed, leads to serious complications such as heart disease, stroke, blindness, kidney failure, and nerve disease.

According to the US Centers for Disease Control, the total cost of diabetes to US society is $132 billion each year. The average annual health care cost for a person with diabetes is over five times that of someone without the disease. In the UK it is estimated that 9% of the annual NHS budget, over 5.2 billion pounds, goes to diabetes care. Many studies have shown that simple preventive measures, such as a healthier diet and moderate exercise like walking, can lead to dramatic reductions in the rate of disease onset and can eliminate or greatly reduce the incidence of complications.

Preventive medicine is the only way forward I see for lowering the cost of health care, other than the unacceptable approach of denying access. One of the keys to preventive medicine will be an understanding of our genetic risk for future diseases, along with a greater understanding of the corresponding environmental influences on disease.

Just three months ago, in September, we published the first complete human genome sequence, and it is now available to all on the internet. The human genome comprises all the genetic information that we inherit from our parents in the form of 46 chromosomes, 23 from each parent. Chromosomes are in turn long stretches of DNA, which is composed of four different chemical letters known simply as A, T, C and G. Our genome has six billion of these genetic letters. The genome we published contained both sets of chromosomes, one from each of my parents. I say my parents because it was my own genome that was sequenced and published.
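Because the genome is literally text over a four-letter alphabet, a computer can treat it as a string. A minimal sketch (the short sequence below is invented for illustration, not real human DNA):

```python
from collections import Counter

# A made-up fragment over the four-letter DNA alphabet.
sequence = "ATCGGATTACAGGTACCGTA"

# Tally how often each genetic letter appears in the fragment.
print(Counter(sequence))

# Each strand pairs with a complementary strand (A with T, C with G).
complement = str.maketrans("ATCG", "TAGC")
print(sequence.translate(complement))  # TAGCCTAATGTCCATGGCAT
```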

I chose to decode my own DNA because of the complex debate concerning deterministic views of genetic outcomes, and the fears that many have voiced about revealing all their genetic secrets. As a leader in this field, I wanted to show that we don't have to fear our genetic information. Our genetic code is not deterministic and will provide us with very few yes-no answers. It will, however, provide probabilities concerning outcomes that we will eventually be able to influence. It seemed far better to me to use my own genome than to try to convince anyone else that it was OK for them.

One of the more exciting findings from our study is that any two humans differ from each other by about 1 to 2 percent, not the 0.1 percent we thought was the case when we sequenced the first draft of the human genome earlier in the decade. These data are much more comforting, as it is clear to me that we are all much more individual than previously thought. One of the key questions I am frequently asked is: what have I learned from my genome, and is there information I can do something about?

Let me give you a few examples to illustrate some of what I have found. Like many people, I reach for my inhaler in smoggy conditions. Genetics contributes to this susceptibility, and researchers have focused on a certain family of enzymes that help detoxify everything from carcinogens to pharmaceuticals. There is a gene associated with the ability to degrade environmental toxins; however, nearly half of the Caucasian population lacks it. In my own genome I found only one copy, inherited from one parent and none from the other, so perhaps that is why I am more susceptible to environmental toxins.

As a depressing bonus, given the gene's detoxifying role, this deficiency may also make me more susceptible to particular chemical carcinogens, and there is an association with lung and colorectal cancers.

From my genome I also became aware of genes that confirmed my increased risk of heart disease. The most common cause of heart disease is atherosclerosis, in which calcium, along with fats and cholesterol, collects in the blood vessels to form plaques, which can trigger a heart attack or stroke. One gene, called APOE, is responsible for regulating levels of certain fats in the bloodstream. Variants here have been linked to heart disease and also to Alzheimer's disease. Both of these could be in the cards for me. Fortunately, by reading my own genome, I have a chance to overcome my genetics by making changes in my diet and exercise. I am also taking a statin, a fat-lowering drug, as part of my preventive medicine paradigm. Statins also show some hints of preventing Alzheimer's disease.

Hundreds more genes are linked with coronary disease, from heart attacks to high blood pressure and narrowing of blood vessels. My genome carries lower-risk versions of some genes and higher-risk versions of others, but it will take time for us to understand the complicated way they interact with each other and how to predict a true risk profile.

However, one genetic change that probably lowers my risk of a heart attack is associated with my body's ability to rapidly metabolize caffeine. I drink many cups of coffee per day, but fortunately I carry the rapid-metabolizing version of the gene. Some genes only become harmful in combination with a certain lifestyle, such as drinking coffee, tea or other drinks with caffeine. Some individuals carry a mutation that slows down caffeine metabolism and, as a result, increases an individual's risk of having a heart attack on drinking tea or coffee. A study of around 4,000 people showed that the risk of heart attack increased 64 percent with four or more cups of coffee per day, compared with patients who drank less than one cup per day. However, the corresponding increase was less than 1 percent for individuals who, like me, have two copies of the rapid-metabolizing version of the gene. These genetic differences may explain why many studies of the association between caffeine consumption and heart attack risk have been inconclusive: we are not genetically identical and do not all respond in the same way.

These are just a handful of illustrations that hint at the type of information that will be possible for all of us in the near future. 

At my institute we are now scaling up to sequence the genomes from 10,000 people. This will provide a massive and powerful database, particularly when linked with clinical records and life outcomes.  At that stage, we will have a much clearer view of the genetic basis of humanity. 

I feel that new laws are needed to prevent an individual's genetic code from being used as a basis of discrimination in education, employment or access to health care.  The genetic code will give us probabilities about disease risk and the ability to understand environmental factors linked to genetics.  Will governments, businesses and insurance companies pay the smaller amount in advance to prevent disease? Or will we be locked into the current system of treating only what we can see?

Being an optimist, I believe that we can ultimately solve the health care issue. But the fundamental problem facing our planet, climate change, is far graver. In fact, unless we tackle it head on, health care could be the least of our worries.

There has been much debate about climate change perhaps because we cannot see carbon dioxide when we exhale, or when we burn oil and coal to heat our homes, or use petrol to power our cars or fly planes. We do, however, have scientific instruments that can accurately measure what we humans produce and the increasing amount of carbon that we are adding to our environment. 

The data are irrefutable: carbon dioxide concentrations have been steadily increasing in our atmosphere as a result of human activity since the earliest measurements began. We know that on the order of 4.1 billion tons of carbon are being added to, and staying in, our atmosphere each year. We know that burning fossil fuels and deforestation are the principal contributors to the increasing carbon dioxide concentrations in our atmosphere. And we know that increasing CO2 concentrations have the same effect as the glass walls and roof of a greenhouse: they let the energy from the sun easily penetrate but limit its escape, hence the term greenhouse gas.

Observational and modeling studies have confirmed the association of increasing CO2 concentrations with the change in average global temperatures over the last 120 years.  Between 1906 and 2005 the average global temperature has increased 0.74 degrees C. This may not seem like very much, but it can have profound effects on the strength of storms and the survival of species including coral reefs.

Eleven of the last twelve years rank among the warmest since 1850. While no one knows for certain the consequences of continuing unchecked warming, some have argued it could result in catastrophic changes, such as the disruption of the Gulf Stream, which keeps the UK out of an ice age, or even the possibility of the Greenland ice sheet sliding into the Atlantic Ocean. Whether or not these devastating changes occur, we are conducting a dangerous experiment with our planet, one we need to stop.

The developed world, including the United States, England and Europe, contributes disproportionately to environmental carbon, but the developing world is rapidly catching up. As the world population increases from 6.5 billion people to 9 billion over the next 45 years, and countries like India and China continue to industrialize, some estimates indicate that we will be adding over 20 billion tons of carbon a year to the atmosphere. Continued greenhouse gas emissions at or above current rates would cause further warming and induce many changes to the global climate more extreme than those observed to date. This means we can expect more climate change: more ice-cap melt, rising sea levels, warmer oceans and therefore greater storms, as well as more droughts and floods, all of which compromise food and fresh water production.

The increase in population coupled with climate change will tax every aspect of our lives. In a world already struggling to keep up with demand, will we be able to provide the basics of food, clean water, shelter and fuel to these new citizens of Earth? And will governments be able to cope with new emerging infections, storms, wildfires, and global conflicts?

So is there any way to prevent these apocalyptic visions of the future from coming true? Many have argued that we simply need to conserve, to alter and regress our standard of living, and to block the industrialization of developing countries. In my view this is extremely naive thinking. Furthermore, even the most optimistic models of climate change show a dramatically altered planet Earth going forward, even if we embrace all the alternative options such as wind and solar energy and electric cars. Our entire world economy, and the ability of modern society to provide life's basics, depend on the very industrialization that contributes to our possible demise.

Yet, sadly, very little of the thinking, planning or projection about how to cope with the carbon problem and climate change has taken into account the capabilities of modern science to produce what we have long needed to help solve these global threats.

It is clear to me that we need more approaches and creative solutions. We need new disruptive ideas and technologies to solve these critical global issues.  This is where, I believe, biology and genomics, come in.

Wikipedia defines a disruptive technology or disruptive innovation as "a technological innovation, product, or service that eventually overturns the existing dominant technology or status quo product in the market." Well known examples of disruptive innovations include telephones replacing telegraphs, cell phones replacing land lines, automobiles replacing horses and carriages, and digital photography replacing film. We are clearly in need of a multitude of disruptive inventions to change our approach to energy and the challenges ahead of us.

Creating new technology is something my team and I have some familiarity with. When we joined the race to sequence the human genome in 1998 we did so with a completely new and relatively untried technique. I was called many things - audacious, arrogant, rebellious, and maverick - but the most flattering would have been disruptive. Few people thought our method would work but we proved them wrong. And within two years the first draft of the human genome  was laid out for all to see.

Since then the field has advanced beyond all expectation. Utilizing biology, we have the ability to address every area of our lives, from medical treatment to renewable sources of fuel. Plastics, carpets, clothing, medicines, and motor oil: all of these things can be created by biological organisms, and in an environmentally sustainable manner.

The skeptic's argument concerning future inventions is: how can we count on new technologies that don't yet exist? Some look at the past and see no change for the future, while others extrapolate forward in a linear manner. However, there are some fields where predicting and counting on exponential change has become reasonable and reliable. For example, Gordon Moore, a founder of the computer chip giant Intel, predicted that the density of transistors on integrated circuits would double every two years, a prediction that came to be known as Moore's Law. This rough rule of exponential change has now been applied to the electronics industry as a whole, and specifically to computer memory and digital cameras. There is another version, Butters' Law of Photonics, which predicts that the amount of data transmitted over an optical fiber will double every nine months and that, as a result, the cost of transmitting data will halve every nine months. We see the results of these predictions in ever faster, smaller and cheaper computers and faster data transmission, and in digital cameras whose small memory cards exceed the capacity of whole computers on the market barely a decade ago.
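The arithmetic behind such doubling laws is simple to sketch; the doubling periods below are the idealized rates just quoted, not measured data:

```python
def growth_factor(years, doubling_period):
    """How much a quantity grows if it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Moore's Law: transistor density doubling every 2 years gives 32x in a decade.
print(growth_factor(10, 2))
# Butters' Law: doubling every 9 months gives roughly 10,000x in a decade.
print(growth_factor(10, 0.75))
```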

This kind of exponential growth is what has happened with our human population. It required close to 100,000 years for the human population to reach 1 billion people, in 1804. In 1960 the world population passed 3 billion, and now we are likely to go from 6.5 billion to 9 billion over the next 45 years. I was born in 1946, when there were only about 2.4 billion of us on the planet; today there are almost three people for each one of us in 1946, and there will soon be four.
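Those ratios are easy to verify with the population figures quoted above (a quick illustrative calculation, nothing more):

```python
pop_1946 = 2.4e9  # world population in 1946, per the lecture
pop_now = 6.5e9   # roughly today's population
pop_2052 = 9.0e9  # projected population in about 45 years

print(round(pop_now / pop_1946, 2))   # 2.71: "almost three for each one of us"
print(round(pop_2052 / pop_1946, 2))  # 3.75: "there will soon be four"
```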

If such predictions of exponential change have come true for the electronics industry, and the population, then isn't it possible the same could hold true for changing education, medicine, replacing the petrochemical industry, and saving the environment?

Similar exponential growth is seen in genomics, a term that did not even exist prior to the 1980s. While the initial discoveries came slowly, they were followed by an ever increasing pace of change. In 1955 Fred Sanger at Cambridge determined the sequence of the protein insulin, the first protein ever to be sequenced. Twenty-one years later, in 1976 and 1977, the first two viral genomes were decoded. It would be 18 more years, until 1995, before my team used disruptive techniques to decode the first genome of a living organism, Haemophilus influenzae, a bacterium that causes ear infections and meningitis in children. This genome has 1.8 million letters of genetic code, making it 300 times the size of the first viral genomes.

Armed with this new method, only five years later we increased the scale of what we did 100 times by determining the first insect genome, that of the fruit fly, which had 180 million letters of genetic code. We followed this one year later with the 3 billion base pair haploid human genome, which was equivalent to over 600,000 viral genomes and over 1,600 bacterial genomes.
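The scale-up follows directly from the genome sizes quoted above, as a quick arithmetic sketch shows:

```python
h_influenzae = 1_800_000     # letters: first bacterial genome, 1995
fruit_fly = 180_000_000      # letters: first insect genome, 2000
human_haploid = 3_000_000_000  # letters: haploid human genome, 2001

print(fruit_fly // h_influenzae)      # 100: the hundredfold scale-up
print(human_haploid // h_influenzae)  # 1666: "over 1,600 bacterial genomes"
```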

So over a short period of time, genome projects that 10 years ago required several years to complete now take only days. Within five years it will be commonplace to have your own genome sequenced, something that just a decade ago required billions of pounds and was considered a monumental achievement. Our ability to read the genetic code is changing even faster than the changes predicted by Moore's Law.

Genomics has also rapidly accelerated the discovery of new species. Earlier this year, from my institute's Sorcerer II Expedition, which included a sailing circumnavigation on my 95-foot yacht Sorcerer II, we took the tools we developed for decoding the human genome and used them to decode the DNA of the world's oceans. We published a single scientific paper describing over six million new genes. This one study more than doubled the number of genes known to the scientific community, and the number is likely to double again in the next year.

We are now using similar approaches to identify the microbes that live inside us. We have found that the microbes in our guts outnumber the 100 trillion human cells in our bodies. We have also catalogued the tens of thousands of microbes and viruses in the air we breathe.

These modern tools of genomics and DNA sequencing are rapidly revealing the incredible world of microbes that we exist within and that exists within us.

Young students of science can today make more discoveries in one year than major institutions or countries could make in a decade just a short while ago.

So, what is the value of these discoveries? The answer is many things but one of the most important is a better understanding of life and its evolution on Earth. And what can we do with all this new information that is coming at an exponential pace? We can use these millions of newly discovered organisms and genes to tell us how the environment is changing as a result of human activities.

But above all, I believe the best examples of disruptive technologies that could change our future lie in the new fields of synthetic biology, synthetic genomics, and metabolic engineering. These fields can change the way we think about life by showing that we can use living systems to increase our chances of survival as a species. Simply put: these areas of research will enable us to create new fuels to replace oil and coal.

Imagine scientists in the near future sitting at their computers and designing the chromosome of a new organism, an organism that perhaps could produce fuels biologically, fuels like octane, diesel fuel, jet fuel even hydrogen all from sugar or even sunlight with the carbon coming from carbon dioxide.

Imagine that after designing the new chromosome, the computer directed a robot to chemically make the DNA strand encoding all that information, and that once constructed, the new chromosome would be inserted into a bacterial cell where it becomes activated causing the cell to turn into the species that the scientist designed.  And now imagine that new species in a bioreactor making millions of copies of itself and each copy is producing a new fuel from only renewable sources.  Sounds like science fiction right? Not to me, because I believe this is the future.

For the past 15 years at ever faster rates we have been digitizing biology.  By that I mean going from the analog world of biology through DNA sequencing into the digital world of the computer. I also refer to this as reading the genetic code.  The human genome is perhaps the best example of digitizing biology. Our computer databases are growing faster per day than they did during the first 10 years of DNA sequencing.  The databases have been filling even faster with the results of our global ocean sequencing project.  As a result we now have over 10 million genes in the public databases, the majority of which have been contributed by my teams.

We and others have been working for the past several years on the ability to go from reading the genetic code to learning how to write it.  It is now possible to design in the computer, and then chemically make in the laboratory, very large DNA molecules. A few months ago we published a scientific study in the journal Science where we described the ability to take a chromosome from one bacterium and place it into a second bacterial cell.  The result was astonishing - the new DNA that we added changed the species completely from the original one into the species defined by the added DNA. You could describe this as the ultimate in identity theft.

Again, maybe this sounds like science fiction, but I think it is actually a key mechanism of evolution, one that could be largely responsible for the wide range of diversity that we see.  Instead of evolution happening only through random mutations that survive selective pressure, we can see how, when chromosomes are added to or exchanged between species, thousands of changes can happen in an instant.

Now they can happen not just by random chance but by deliberate human design and selection.  Human thought and design and specific selection is now replacing Darwinian evolution.

One of the most significant and unique features of our research in synthetic genomics that often gets overlooked by the news media, is the long history, starting from the beginning of this work in 1995 and continuing today, of ethical review.  As with the past 30 years of molecular biology, the organisms being designed cannot survive outside of the laboratory and are subject to strict containment.  While we don't want students doing this work in their basements, this new field is stimulating an exciting new interest in biological studies. 

Right now extensively modified bacteria are being used to make food additives and industrial chemicals.  DuPont has a plant in the US state of Tennessee with four very large silos where they are using metabolically engineered bacteria to convert sugar into a new polymer, propanediol, which is the key component in their stain-resistant carpets and clothing.  Several teams, including my own, are modifying bacteria to make the next generation of biofuels.  For example, my team has a new fuel chemical made from sugars as a starting material that has the potential to be one of the first green jet fuels.

But we don't always have to modify bacteria or design new ones.  What has occurred on Earth from Darwinian evolution is pretty amazing in that the unique metabolism of these microbial powerhouses can often provide exactly what we need.  For instance, we have a team at my institute headed by Ken Nealson that has developed microbial fuel cells using naturally occurring bacteria.  These organisms can process human and animal waste to produce electricity and/or clean water.

At my company Synthetic Genomics, we have a major program underway in collaboration with BP to see if we can use naturally occurring microbes to metabolize coal into methane, which can then be harvested as natural gas.  While not a renewable source of carbon, it could provide as much as a tenfold improvement over mining and burning coal.  We also have organisms that can convert CO2 into methane, thereby providing a renewable source of fuel.

The biggest question in my mind is the one of scale.  Last year we consumed more than 83 million barrels of oil per day, or 30 billion barrels during the year.  In addition we used over 3 billion tons of coal.  These are mind-boggling numbers, and the only way that I can see replacing oil and coal is through a widely distributed system.  If there were one million bio-refineries around the globe, each one would still need to produce 17,000 liters per day. For the UK my vision would entail thousands of bio-refineries distributed around the country, near where the fuel would be consumed and where the starting raw material, such as cellulose, would be available.  On a global scale there will be millions of new fuel producers, perhaps favoring the agriculturally rich developing world.  This could be the ultimate disruptive model, changing the entire infrastructure for energy production and consumption and helping us toward a carbon-neutral world.
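As a rough sanity check on the scale figures above, the arithmetic can be sketched as follows (an illustrative back-of-envelope calculation, not part of the lecture; oil alone works out to roughly 13,200 liters per day per refinery, so the 17,000-liter figure presumably also folds in the energy now supplied by coal):

```python
# Back-of-envelope check of the lecture's distributed-refinery figures.
LITERS_PER_BARREL = 159            # approximate volume of a standard oil barrel
barrels_per_day = 83_000_000       # global oil consumption cited in the lecture
refineries = 1_000_000             # hypothetical number of bio-refineries

liters_per_day = barrels_per_day * LITERS_PER_BARREL   # ~13.2 billion liters/day
per_refinery = liters_per_day / refineries             # liters/day each must produce

print(f"{per_refinery:,.0f} liters/day per bio-refinery")  # ~13,197 for oil alone
```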

In closing:

It is my hope that we can embrace, not fear, the necessary science to help our planet.

I feel it is imperative that we begin to find ways to adapt to climate change, while at the same time working to mitigate it. Unfortunately we are already on a path toward significant change, but if we apply ourselves I believe we can find ways to create alternatives to burning oil and coal.  We need multiple simultaneous approaches to solve this problem, with the goal of net zero carbon emissions to stabilize atmospheric concentrations and ensure our survival.

These are massive challenges for each and every one of us.  For our children's future and for the future of our species and our planet I hope that we can rise to the challenge.

Thank you very much.


By Lawrence M. Krauss

LAWRENCE M. KRAUSS, professor of physics and astronomy at Case Western Reserve University and chair of the Physics Section of the American Association for the Advancement of Science, is on the steering committee of ScienceDebate2008. His most recent book is Hiding in the Mirror.

Lawrence Krauss' Edge Bio Page


The day before the most recent Democratic presidential debate, the media reported a new study demonstrating that U.S. middle-school students, even in poorly performing states, do better on math and science tests than many of their peers in Europe. The bad news is that students in Asian countries, who are likely to be our chief economic competitors in the 21st century, significantly outperform all U.S. students, even those in the highest-achieving states.

While these figures were not raised in recent Democratic or Republican debates, they reflect a major challenge for the next president: the need to guide both the public and Congress to address the problems that have produced this "science gap," as well as the serious consequences that may result from it.

America's current economic strength derives from the investments in fundamental research and technology made a generation ago. Future strength will depend upon research being done today. One might argue that many key discoveries occurred as a result of importing scientific talent. But as foreign educational systems and economies flourish, our ability to attract and keep new talent could easily erode. Even with a continued foreign influx of scientific talent, it would be foolish to expect that we can maintain our technological leadership without a solid domestic workforce as well.

Almost all of the major challenges we will face as a nation in this new century, from the environment, national security and economic competitiveness to energy strategies, have a scientific or technological basis. Can a president who is not comfortable thinking about science hope to lead instead of follow? Earlier Republican debates underscored this problem. In May, when candidates were asked if they believed in the theory of evolution, three candidates said no. In the next debate Mike Huckabee explained that he was running for president of the U.S., not writing the curriculum for an eighth-grade science book, and therefore the issue was unimportant.

Apparently many Americans agreed with him, according to polls taken shortly after the debate. But lack of interest in the scientific literacy of our next president does not mean that the issue is irrelevant. Popular ambivalence may rather reflect the fact that most Americans are scientifically illiterate. A 2006 National Science Foundation survey found that 25% of Americans did not know the earth goes around the sun.

Our president will thus have to act in part as an "educator in chief" as well as commander in chief. Someone who is not scientifically literate will find it difficult to fill this role.

This summer in Aspen, Colo., a group of scientists, journalists and business people convened at a "science summit" to discuss ways to build a growing awareness of the importance of scientific issues in government. A working group was convened to explore ways that the scientific and business communities might work together to ensure that science becomes an issue in the 2008 campaign.

This coming week another group I am a part of, ScienceDebate2008, is issuing a public call for a U.S. presidential debate devoted to science and technology. Eight Nobel Laureates, the heads of several major scientific societies, several university presidents, the chairman emeritus of Lockheed Martin and several congresspeople have already signed on to call for the debate, which would cover three broad categories: the environment, health and medicine, and science and technology policy.

Even if the American public is not currently focused on these concerns, decisions made by the next U.S. president on issues such as climate change, energy research, stem cells and nuclear proliferation will have a global impact. We owe it to the next generation to take ownership of these issues now. In spite of the ambivalence reflected in some polls, there is a popular understanding that science and technology will be essential to meet the challenges we face as a society. When reports began to surface warning that the avian flu might become a threat to humans, for example, everyone from the president down called for studies to determine how quickly the virus might mutate from birds to human beings. No one suggested that "intelligent design," for example, could provide answers.

We as a nation desperately need a more scientifically literate electorate and leadership, and a presidential debate on these subjects would be a good first step in this direction.

[First published as an OpEd piece by The Wall Street Journal, December 6, 2007]


Jonathan Haidt replies to David Sloan Wilson, Michael Shermer, Sam Harris, PZ Myers, Marc D. Hauser
on "Moral Psychology and the Misunderstanding of Religion" By Jonathan Haidt

A few weeks after the comments of Wilson, Shermer, Harris, Myers, and Hauser were posted, I had the great fortune to attend a conference at the Salk Institute with four of them (all but Hauser), and with Dan Dennett too. The conference, "Beyond Belief 2," had a provocative subtitle: "Enlightenment 2.0." The theme was that Enlightenment 1.0, which threw off the mental shackles of religion and launched the scientific revolution, was a good start. But in true enlightenment spirit, if we think well, draw on the best available research, and place no idea off limits, we can make it better. We can re-invent and re-invigorate the Enlightenment for the difficult and still-religious century we now face.

Because the Enlightenment is defined by its rejection of religious authority, religion has always had a special place in the hearts of Enlightenmenters. The evils and stupidities of religion are our raison d'être, and our raison d'être raisonnable. But if we hope to update the Enlightenment and increase its appeal in a world where religion still holds a bigger market share, then we must do more than examine religion rationally and scientifically, as was done in Enlightenment 1.0. For Enlightenment 2.0 we must also examine ourselves examining religion, and we must lay bare our own motives and biases. People are extraordinarily good at reasoning their way to any conclusion they want to reach, so long as there is some ambiguity in the evidence. And when we want to reach a conclusion for moral reasons — when we are analyzing people or institutions that we think are evil — we are likely to conduct biased reviews of the evidence and reach incorrect conclusions about the motives and methods of our opponents. The commentators seemed to accept my portrayal of moral psychology as a generally passionate affair in which reasoning often follows intuition, and so I take it that we all agree that those who write about religion while angry about religion should have their work checked carefully by others.

It is now clear to me that we all agree on these major points as well:

1) The New Atheists take as a primary goal the debunking of the historical and cosmological claims of the major religions.

2) The historical and cosmological claims of the major religions are in fact almost all false (as far as we can tell from historical and scientific research).

3) The New Atheists have primary goals beyond debunking; they also want to show that religion is pernicious and that its net effects on human welfare are overwhelmingly negative. (As Hitchens's subtitle puts it, "religion poisons everything.")

4) Religions do in fact have many pernicious effects on human welfare, particularly when they foster cross-group conflict and a willingness to kill (as in Harris's examples of human sacrifice).

5) The explanation for widespread human religiosity lies partly in the biological evolution of mental and emotional mechanisms that get activated by culturally evolved religious practices and institutions.

I listed these five points to make it clear that I do have some idea what the New Atheists are about, so my ignorance cannot be "absolute," as Myers charged. I also want to make it clear that I am not an apologist for religion. I used to dislike all religions, back when I thought of them as systems of belief that helped individuals understand the world and cope with the unknown. After reading Durkheim and D. S. Wilson I now think of religions first and foremost as coordination devices that bind people together into moral communities with effects that are mostly good for the members, although sometimes terrible for deviants and for neighboring groups (as Shermer and Harris noted). Whether the net effects of religion for humanity are good or bad is a complex empirical question, the answer to which varies by religion, by era, and by what terms we include in our cost/benefit analysis. (This is exactly the sort of ambiguous dataset from which it is so easy to cherry-pick evidence in favor of one's desired conclusion.) I am motivated neither to convict nor to acquit, but if religion is to be subject to trial by science, I want the trial to be fair. Until we acknowledge a latent prejudice, however, we will have trouble understanding the accused.

The social sciences have spent far too long under the spell of a belief system that helped us for a while to understand the world and cope with the unknown: methodological individualism. As Don Campbell wrote in 1994, in a critique of psychology: "Methodological individualism dominates our neighboring fields of economics, much of sociology, and all of psychology's excursions into organizational theory. This is the dogma that all human social group processes are to be explained by laws of individual behavior." I believe, with Campbell, that it is high time we broke this spell and allowed social scientists and evolutionary theorists to start looking again at groups as emergent entities that have unique properties and regulatory mechanisms.

In reading the five commentaries on my essay, it seems to me that most of the major points of disagreement go back to this difference in paradigm: Myers, Harris, and Hauser are all committed to methodological individualism, whereas Wilson and I are committed to multi-level analyses of social phenomena. Shermer, as I found out upon meeting him, is committed to nothing in advance of a full hearing. (He is the most open minded person I have ever met.) Rather than responding point by point to all five reviewers, I will raise three questions on which we seem to disagree because of our differing paradigms.

1) Can group-level adaptations evolve?

We all agree (with Dawkins and George C. Williams) that multi-level selection is possible in principle. Genes can spread either because they help individuals outcompete their within-group neighbors, or because they help groups outcompete other groups. The question is whether group-level selection ever happens in fact. Williams considered dozens of putative cases among a variety of animal species and concluded that it does not. A fleet herd of deer is really just a herd of fleet deer. Fast-runners outcompeted their slower neighbors; fast herds did not outcompete slow herds. Is the situation the same with humans? Are cooperative groups really just groups of cooperators, who inherited genes and cultural variants that let them beat out their less cooperative neighbors? Or was it generally the case, in our long tribal past, that cooperative groups were on average more successful economically, militarily, politically, and reproductively, than less cooperative groups? Stated in this way it is obvious that cooperative tribes are very different from herds of fleet deer, and that tribes really do compete. But if that was my whole argument for group-level selection then Hauser would be right to join Gould and Lewontin in their ridicule of "just-so" stories. I need to add two additional claims: 1) the variation in cooperation must be heritable (Hauser is right that I should have said this) and 2) some mechanism must exist for solving the free-rider problem — for suppressing the emergence of uncooperative variants within cooperative groups.

I follow Wilson in believing that religiosity, which makes little sense as individual-level adaptation for outcompeting one's less-religious neighbors, makes a lot of sense as a group-level adaptation for binding individuals together, solving the free-rider problem, and outcompeting less cohesive groups. And religiosity is indeed heritable. Dean Hamer may have gone too far in titling his book The God Gene, but twin studies clearly show that something in our genome strongly affects whether or not one will believe in God as an adult. It hardly seems absurd, loose, or "just so" to posit that genes that gave rise to more religiously-inclined minds co-evolved with cultural variations in beliefs, institutions, and practices that we now call religion. When new evidence (such as the heritability of religiosity) and powerful tools (such as our new ability to conceptualize gene-culture co-evolution) come along, it would be unscientific to say "oh, but the experts rejected this idea 30 years ago, let's not reconsider it now."

Hauser's main reason for rejecting group-level selection is that the gene-centered and individual-centered views have been so productive. There are "thousands upon thousands of confirmatory papers," whereas there is little empirical evidence, at present, on phenomena best explained by group-level selection. I don't doubt Hauser's numbers, but I find a close parallel to the situation in economics 20 years ago, when Robert Frank and others were trying to argue that human beings were not always selfish utility maximizers. Neo-classical economists mounted the same defense: that thousands upon thousands of studies generated by neo-classical economists supported the claim of neo-classical economics that people are rational agents who act to maximize their individual utility. But when Frank, Kahneman, Tversky, Thaler and others started looking in the right places (e.g., bargaining and ultimatum games), they found that people had moral motives as well as monetary motives. We now have a thriving field of behavioral economics. As we now start to look in the right places for group-level effects (e.g., mechanisms that bind people together and suppress free-riders), evidence for group-level selection is beginning to emerge. (This evidence is reviewed in D. S. Wilson and E. O. Wilson's recent paper in the Quarterly Review of Biology; see also recent work by Peter Turchin on historical dynamics, and by Martin Nowak on cooperation as one of the three basic forces in evolution, alongside mutation and selection.)

Myers says that "we don't have any evidence that religion is adaptive in any way." But even for a methodological individualist, who insists that all selection occurs within groups, this is an odd claim. Religious people live longer, healthier lives and have more children than do non-religious people. What more direct evidence could there be of Darwinian adaptation? (I don't know that religious people have always been more fertile, but the modern case must count as some evidence.) And if we drop the individualist requirement and allow ourselves to look at groups as entities, then it's hard to find anything more adaptive than religion at binding large groups together and suppressing individual selfishness for the good of the group.

2) Are religious people really happier and more charitable?

Myers objects to my claim that religious people are happier and more generous than secular folk. He points out that, given the recent domination of America by the religious right, he would be "gobsmacked" if surveys found secular folk to be happier than believers. But surveys show that it doesn't matter which party is in power; conservatives and religious believers have been happier for as long as the surveys have been done. To the extent that circumstances matter for happiness, they are local circumstances mostly involving relationships, not geopolitical ones.

Myers further objects that atheists should not be expected to be as generous because it is hard for them to find non-religious charities. Most of the appeals he gets come from religious groups that he does not trust. He and I seem to have gotten ourselves on radically different mailing lists — I get no such appeals. But even if most atheists receive mostly religious appeals, as long as they also get a few each year from secular groups (Oxfam, United Way, SPCA) and have access to the internet, the lack of opportunities for charity cannot be used as an excuse for not giving.

Hauser joins Myers in questioning the interpretation of the religion-charity relationship. Hauser resists giving moral credit to religious givers: "Perhaps they are better rule followers, worried about God's lightning bolt! Perhaps they are better conditioned by their religious institution." For a methodological and moral individualist, these are valid concerns. In a contractually-based morality the individual is the only real unit of value, and moral credit for charity accrues to individuals primarily when they act 1) freely, with no social pressure 2) to relieve the suffering or oppression 3) of strangers. With these criteria, Hauser and Myers are right that most of the charity work of religious people should be excluded. But the really stunning point of Brooks's book is that even when we exclude all religious giving, religious believers still give more money to secular charities than do atheists. (I note that Brooks excludes giving to political causes, where perhaps atheists give more.)

But what if we drop the methodological individualist criteria and just ask about the degree to which religion makes people divert their time, money, and attention away from themselves? The charitable imbalance between atheists and believers now becomes enormous (according to Brooks), and the analogy I and others have made between religious communities and beehives becomes more useful. All that time and money given to one's own church is like the "altruism" of bees who toil to build their common hive; all that time and money given to build churches in faraway lands is like the efforts of bees to found new colonies. Religions, generally speaking, work to suppress our inner chimp and bring out our inner bee. But methodological individualists, who deny group-level selection and shun group-level analyses, find it hard to believe that people could be happier or more generous when they live in bee-like ways than when they live on their own, outside of any hive.

3) Are beehive (binding) moralities good?

My academic path began in high school when, as a young atheist, I read Waiting for Godot and plunged into an existential depression. If there really was no God, then our lives seemed to me as meaningless as those of Vladimir and Estragon.

I have gained new respect for religion as I came to see it as a complex of co-evolved genes and cultural innovations for binding people together and imbuing them with a sense of community and collective purpose, immune to the sense of pointlessness and isolation that engulfed me in high school. To the extent that religions really accomplish this goal they are good, at least from a straight utilitarian human-welfare perspective. But whenever I speak or write about these good effects of religion, it is useful to have Sam Harris reminding us of the costs, which I cannot deny. If religion is in part an adaptation for successful intergroup competition, then the suppression of selfishness within groups is purchased by the increased likelihood of righteous nastiness across groups and toward internal deviants. Harris asks the important question: "Are certain conceptions of morality especially good at binding community together, but incompatible with modernity?" I agree with him that the answer is yes.

An important distinction I should have made in my essay is between fundamentalist and non-fundamentalist religions. Fundamentalism as I see it is about making one principle or small set of principles fundamental and sacred, and then applying these principles in an absolute, uncompromising, inflexible, I'm-sure-I'm-right and consequences-be-damned way to a complex world. On this definition the religious right and radical Islam are fundamentalist movements, and they are incompatible with modernity, democracy, and the ineradicable diversity of all Western societies. If I could wave a magic wand and have all fundamentalists converted into non-fundamentalists overnight, I would do so and be confident the world would become a better place.

But if I could wave that magic wand a second time and have all believers converted into atheists, would that be a good thing? The New Atheists say yes, and they hope that their books will be that magic wand. I'm more cautious. I used to wish that all fraternities and major sports teams would disappear from my university — I thought of them as tribal institutions that brought out the ugly and sometimes violent side of young people. But after talking with athletes, fraternity members, and fundraisers I realize that these institutions create powerful feelings of belonging which have enormous benefits for the participants while making them fiercely loyal and extraordinarily generous later on to the University of Virginia. Fraternities and sports teams contribute greatly to the strong school spirit at UVA, and to our rapidly growing endowment. All students benefit from these externalities.

My new view, drawing on work in cultural psychology, is that there are three basic ways of being and living: dependent, independent, and interdependent. We all agree that being chronically dependent brings out the worst in people — laziness, passivity, and hopelessness. But is it better to be independent or interdependent? I think we educated, mobile cosmopolitans idealize independence, which maximizes our freedom and creativity. We raise our kids to be as self-sufficient as possible. But when you don't expect to need others, you are less likely to be generous to others.

Religious communities, in contrast, idealize interdependence and try to raise their kids that way. They want them to be enmeshed in extended kin networks and congregations where everyone can ask for help from anyone, and everyone is expected to give such help. I believe this is why religious people are so much more generous than secular folk. Interdependence demands greater openness to others, greater willingness to put your own projects on hold and divert your efforts toward others. When Hurricane Katrina struck, religious groups across the country organized quickly to send volunteers and supplies. Like fraternities, religions may generate many positive externalities, including charity, social capital (based on shared trust), and even team spirit (patriotism). If all religious people lost their faith overnight and abandoned their congregations, I think the net results would probably be bad, at least in America where (in contrast to European nations) our enormous size, short history, great diversity, and high mobility make it harder for us to overcome individualism and feel that we are all part of one community.

In conclusion, I believe that Enlightenment 2.0 requires Morality 2.0: more cognizant of the limitations of reason, more open to multilevel approaches in which groups are sometimes units of analysis, and more humble in its assertion that the individualist and contractualist morality of the scientific community is right, and is right for everyone.


On "Taking Science on Faith" By Paul C. Davies

Here is my letter to the editor to the New York Times about Davies' op-ed, which they chose not to print, perhaps because it was too blunt for their taste:

To the editor:

Paul Davies' claim (op-ed, Nov. 24) that "both religion and science are founded on faith" is based on astoundingly sloppy reasoning. Science is, indeed, founded on the working hypothesis—one amply borne out by four centuries of scientific practice—that the world, or at least some aspects of it, is ordered in a stable and intelligible way. But that tentative and partly testable working hypothesis is a far cry from religions' reliance on sacred texts and personal revelations. To characterize these radically dissimilar endeavors as both based on "faith" is to point out a superficial commonality while obscuring the fundamental difference. And at a time when humanity is wracked by conflict between incompatible versions of faith—in the genuine sense of the term—to muddy the distinction between religion and science is worse than philosophically misguided: it is irresponsible.


Paul Davies responds to Jerry Coyne, Nathan Myhrvold, Lawrence Krauss, Scott Atran, Sean Carroll, Jeremy Bernstein, PZ Myers, Lee Smolin, John Horgan, Alan Sokal On "Taking Science on Faith" By Paul C. Davies

I was dismayed at how many of my detractors completely misunderstood what I had written. Indeed, their responses bore the hallmarks of a superficial knee-jerk reaction to the sight of the words "science" and "faith" juxtaposed.

The most common trap my critics have fallen into is conflating the explanation of natural phenomena using the laws of physics with an explanation of the laws themselves. I am not suggesting that the application of science is a matter of faith. Doing science involves employing testable hypotheses, refining theories and conducting experiments — in stark contrast to the practice of religion. The scientific method is the most reliable path to truth we know, and there is no more committed or passionate a scientist than I. Yes, "science works" as John Horgan points out. It is tested again and again as a description of nature. We are all agreed on that point. But it isn't the point I was trying to make. My argument refers, not to the scientific method, but to the underlying lawfulness of the universe itself, which raises questions such as where the laws come from, why they have the form that they do, and whether there is anything peculiar about the actual laws of the universe (such as being "fine-tuned" for life), as opposed to other possible laws. The orthodox position (and the one I set out to challenge in my book) is that the universe is governed by a fixed set of laws in the form of infinitely precise mathematical relationships imprinted on the universe from its birth. In addition, it is assumed that the physical world is affected by the laws, but the laws are completely impervious to what happens in the universe — they are immutable. It is not hard to see where this picture comes from: it is inherited from monotheism, which asserts that a rational being designed the universe according to a set of perfect laws. And the asymmetry between immutable laws and contingent states mirrors the asymmetry between God and nature: the universe depends utterly on God for its existence whereas God's existence does not depend on the universe.
Historians of science are well aware that Newton and his contemporaries believed that in doing science they were uncovering the divine plan for the universe in the form of its underlying mathematical order. I am depressed that reminding scientists of this well-known historical fact should elicit such a shock-horror response. As Scott Atran points out, the argument that science is based on faith is not new. Evidently Western society is so steeped in monotheism that the monotheistic world view, which was appropriated by science, is now regarded as "obvious" and "natural." As a result, many scientists are unaware of its theological origin. Nor do they stop to think about the sweeping hidden assumptions they adopt when they subscribe to that scientific/theological world view, assumptions that are in fact not shared by most other cultures.

Not all scientists envisage the laws of nature in the theological manner I have described, however. One person who evidently doesn't is P.Z. Myers, who declares his lack of faith in science and simply takes science "as it comes." I have found that his is a familiar position among biologists, for whom contingency as opposed to law looms so large in explanation. Unfortunately, Myers goes on to attribute to me precisely the point of view I am seeking to refute: "That Davies seems to believe that order must rule everywhere and at every level is a stronger presupposition than is warranted by a scientific approach, and sounds remarkably theological." Well, yes, that's the whole point of my article! It is theological — but it is nevertheless the orthodox view among theoretical physicists, especially those working on the search for a unified theory. Such physicists believe there are perfect laws "out there", existing in some Platonic realm, even if the laws we find in our textbooks today are merely approximations to what Steven Weinberg calls "the final theory". And that is the position that, contrary to Myers' statement, I seek to challenge in my book. In doing so, I encountered fierce opposition from my physics colleagues. For example, when I suggested in my book that infinitely precise mathematical laws might be an unjustified idealization, i.e. that there might be an intrinsic uncertainty or flexibility in the laws, many of my physics colleagues were aghast at this heresy. Jerry Coyne, in his response to my article, asks, "What do we [orthodox scientists] believe to be true without evidence?" Well, how about belief in infinitely precise laws which incorporate real numbers and differentiability? Show me the evidence for that. Or, to take another well-known example, laws that transcend the physical universe and exist in some sense prior to it, because the said laws are intended to explain the origin of the universe.
Many cosmologists believe in such laws, which must be accepted without explanation or testability, as the basis of a scientific theory of cosmogenesis.

My article pointed out that the widespread belief in immutable perfect transcendent prior laws underpinning the physical universe, while not necessarily wrong, is nevertheless held as an act of faith, similar in character to belief in an all-perfect divine lawgiver. Let me be clear about the sense in which I am using the word faith here. Obviously faith in the laws of physics isn't on a par with "faith" in the popular religious sense (such as belief in miracles, prophecy, the Bible as historical fact, etc., all of which I personally regard as completely ridiculous). Rather, in using the word faith I refer to the metaphysical framework, shared by monotheism and science (but not by many other cultures), of a rational ground that underpins physical existence. It is the shared faith that we live in a universe that is coherent, a universe that manifests a specific mathematical scheme of things, a universe that is, at least in part, intelligible to sentient mortals. These tacit assumptions running through science, which stem from monotheism, can all be challenged. The universe doesn't have to be that way! But most scientists believe it is that way.

Because the monotheistic world view pervading science is so deeply entrenched, asking where the laws of physics come from or why they have the form they do is frowned upon. Many respondents to my article ticked me off for venturing into such murky waters, or for expecting there to even exist an answer. I am grateful to Sean Carroll for so cogently expressing the orthodox view among physicists that the laws of physics must simply be accepted as a brute fact — that is, they exist without explanation, for no reason. "That's just how things are," writes Carroll. "There is a chain of explanations concerning things that happen in the universe, which ultimately reaches to the fundamental laws of nature and stops." For Carroll, as for many scientists, unexplained laws are thus the starting point of scientific reasoning, the levitating superturtle at the bottom that holds up the whole tower, just as God is the levitating superturtle that holds up physical existence in monotheism. After 30 years of listening to sterile bickering in the science/religion debate I am utterly bored with the refrain from each side that, in effect, "my superturtle is better than your superturtle." So I have tried to elevate the level of discussion and move on.

To achieve progress, I set out to see how far we can go in describing the deepest properties of the physical universe without appealing to anything outside it — such as an unexplained transcendent god, an unexplained set of magically-imposed Platonic mathematical laws, or infinitely many unseen alternative universes. I concede that, so long as we are stuck with human modes of thought, we will ultimately have to accept something on faith, but I see no reason to stop with the laws of physics. So I question the idealized concept of immutable perfect laws that must simply be accepted as a brute fact — "on faith" (since we can test the laws only to finite precision). The time has come to seek a theory of the laws, to bring the laws of physics within the scope of scientific inquiry, and if possible to explain their intelligibility, their "unreasonable" mathematical efficacy and their celebrated (and baffling) bio-friendliness. A possible way to formulate a theory of laws was mooted thirty years ago by John Wheeler, who abandoned the traditional theological notion of immutable laws "cast in tablets of stone from everlasting to everlasting."  In my book I have sought to extend Wheeler's ideas in the light of recent work in the foundations of quantum mechanics, the theory of computation and holographic cosmology. Another possible approach to a theory of laws has been developed by Lee Smolin, and is mentioned briefly in his response to my article.

My interest in pursuing this project is to critically examine ultimate explanations of existence, for which there is a long tradition within religion, and a rather short one within science. I plead guilty to Lawrence Krauss' complaint that I am sidestepping some hugely important issues, such as the moral dimension of religious faith, the tragedy of human existence and suffering, and the question of purpose in the universe. My concern is admittedly with a restricted physics/cosmology agenda, as that is the only area in which I can claim some modest authority. However, the conceptual framework I am developing can accommodate a universe with something like "purpose," albeit one that is inherent in, and emergent with, the universe, rather than imposed upon it from without.

30 November 2007

Stocking-fillers: A seasonal run on the ideas bank
By Boyd Tonkin

One of the best jokes in this year's crop of upmarket stocking-filler titles is a wholly inadvertent one. In the sparky and provoking What Are You Optimistic About? (Simon & Schuster, £12.99), John Brockman — literary agent to the planet's biggest brains and guv'nor of the ever-stimulating Edge website — asks almost 150 scientists, seers and other gurus (from Steven Pinker to Brian Eno) about their reasons to be cheerful. And what subject strikes hope into the heart of Old Etonian zoologist and (now retired) amateur banker Matt Ridley, who as chairman of the board oversaw the Northern Rock train-wreck? "The future. That's what I'm optimistic about." Thank you, the Hon Matt, and I hope you enjoyed the £24bn that our little Christmas whip-round raised for you.

Ridley aside, Brockman's compilation radiates bright ideas. Let's hope that the various upbeat views on halting climate change prevail soon enough to justify Walter Isaacson's faith in the prospects of "print as a technology". If not, then we may not see many more seasons of Nordic forests felled to manufacture loo-bound volumes stuffed with short-breathed snippets. ...


Paperback—UK £8.99, 352 pp
Free Press, UK

November 5, 2007

Paperback — US
$14.95 400 pp
Harper Perennial
November 1, 2007

WHAT ARE YOU OPTIMISTIC ABOUT?: Today's Leading Thinkers on Why Things Are Good and Getting Better With an Introduction by Daniel C. Dennett, Edited By John Brockman

"Danger – brilliant minds at work...A brilliant book: exhilarating, hilarious, and chilling." The Evening Standard (London)

Paperback—UK £8.99, 352 pp
Free Press, UK

Paperback — US
$13.95, 336 pp
Harper Perennial

WHAT IS YOUR DANGEROUS IDEA? Today's Leading Thinkers on the Unthinkable With an Introduction by STEVEN PINKER and an Afterword by RICHARD DAWKINS Edited By JOHN BROCKMAN

"A selection of the most explosive ideas of our age." Sunday Herald
"Provocative" The Independent
"Challenging notions put forward by some of the world's sharpest minds" Sunday Times
"A titillating compilation" The Guardian
"Reads like an intriguing dinner party conversation among great minds in science" Discover

Edge Foundation, Inc. is a nonprofit private operating foundation under Section 501(c)(3) of the Internal Revenue Code.

John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

contact: [email protected]
Copyright © 2007 By
Edge Foundation, Inc
All Rights Reserved.