

2006

"WHAT IS YOUR DANGEROUS IDEA?"



CONTRIBUTORS


Alun Anderson

Philip W. Anderson

Scott Atran

Mahzarin Banaji

Simon Baron-Cohen

Samuel Barondes

Gregory Benford

Jesse Bering

Jeremy Bernstein

Jamshed Bharucha

Susan Blackmore

Paul Bloom

David Bodanis

Stewart Brand

Rodney Brooks

David Buss

Philip Campbell

Leo Chalupa

Andy Clark

Gregory Cochran
Jerry Coyne

M. Csikszentmihalyi

Richard Dawkins

Paul Davies

Stanislas Dehaene

Daniel C. Dennett
Keith Devlin
Jared Diamond
Denis Dutton
Freeman Dyson
George Dyson
Juan Enriquez

Paul Ewald

Todd Feinberg

Eric Fischl

Helen Fisher

Richard Foreman

Howard Gardner

Joel Garreau

David Gelernter

Neil Gershenfeld

Daniel Gilbert

Marcelo Gleiser

Daniel Goleman

Brian Goodwin

Alison Gopnik

April Gornik

John Gottman

Brian Greene

Diane F. Halpern

Haim Harari

Judith Rich Harris

Sam Harris

Marc D. Hauser

W. Daniel Hillis

Donald Hoffman

Gerald Holton
John Horgan

Nicholas Humphrey

Piet Hut

Marco Iacoboni

Eric R. Kandel

Kevin Kelly

Bart Kosko

Stephen Kosslyn
Kai Krause
Lawrence Krauss

Ray Kurzweil

Jaron Lanier

David Lykken

Gary Marcus
Lynn Margulis
Thomas Metzinger
Geoffrey Miller

Oliver Morton

David G. Myers

Michael Nesmith

Randolph Nesse

Richard E. Nisbett

Tor Nørretranders

James O'Donnell

John Allen Paulos

Irene Pepperberg

Clifford Pickover

Steven Pinker

David Pizarro

Jordan Pollack

Ernst Pöppel

Carolyn Porco

Robert Provine

VS Ramachandran

Martin Rees

Matt Ridley

Carlo Rovelli

Rudy Rucker

Douglas Rushkoff

Karl Sabbagh

Roger Schank

Scott Sampson

Charles Seife

Terrence Sejnowski

Martin Seligman

Robert Shapiro
Rupert Sheldrake

Michael Shermer

Clay Shirky

Barry Smith

Lee Smolin

Dan Sperber

Paul Steinhardt

Steven Strogatz
Leonard Susskind

Timothy Taylor

Frank Tipler

Arnold Trehub

Sherry Turkle

J. Craig Venter

Philip Zimbardo

SCOTT ATRAN
Anthropologist, University of Michigan; Author, In Gods We Trust

Science encourages religion in the long run (and vice versa)

Ever since Edward Gibbon's Decline and Fall of the Roman Empire, scientists and secularly-minded scholars have been predicting the ultimate demise of religion. But, if anything, religious fervor is increasing across the world, including in the United States, the world's most economically powerful and scientifically advanced society. An underlying reason is that science treats humans and intentions only as incidental elements in the universe, whereas for religion they are central. Science is not particularly well-suited to deal with people's existential anxieties, including death, deception, sudden catastrophe, loneliness or longing for love or justice. It cannot tell us what we ought to do, only what we can do. Religion thrives because it addresses people's deepest emotional yearnings and society's foundational moral needs, perhaps even more so in complex and mobile societies that are increasingly divorced from nurturing family settings and long familiar environments.

From a scientific perspective of the overall structure and design of the physical universe:

1. Human beings are accidental and incidental products of the material development of the universe, almost wholly irrelevant and readily ignored in any general description of its functioning.

Beyond Earth, there is no intelligence — however alien or like our own — that is watching out for us or cares. We are alone.

2. Human intelligence and reason, which search for the hidden traps and causes in our surroundings, evolved and will always remain leashed to our animal passions — in the struggle for survival, the quest for love, the yearning for social standing and belonging.

This intelligence does not easily suffer loneliness, any more than it abides the looming prospect of death, whether individual or collective.

Religion is the hope that science is missing (something more in the endeavor to miss nothing).

But doesn't religion impede science, and vice versa? Not necessarily. Leaving aside the sociopolitical stakes in the opposition between science and religion (which vary widely and are not constitutive of science or religion per se — Calvin considered obedience to tyrants as exhibiting trust in God, Franklin wanted the motto of the American Republic to be "rebellion against tyranny is obedience to God"), a crucial difference between science and religion is that factual knowledge as such is not a principal aim of religious devotion, but plays only a supporting role. Only in the last decade has the Catholic Church reluctantly acknowledged the factual plausibility of Copernicus, Galileo and Darwin. Earlier religious rejection of their theories stemmed from challenges posed to a cosmic order unifying the moral and material worlds. Separating out the core of the material world would be like draining the pond where a water lily grows. A long lag time was necessary to refurbish and remake the moral and material connections in such a way that would permit faith in a unified cosmology to survive.


SAM HARRIS
Neuroscience Researcher; Author, The End of Faith


Science Must Destroy Religion

Most people believe that the Creator of the universe wrote (or dictated) one of their books. Unfortunately, there are many books that pretend to divine authorship, and each makes incompatible claims about how we all must live. Despite the ecumenical efforts of many well-intentioned people, these irreconcilable religious commitments still inspire an appalling amount of human conflict.

In response to this situation, most sensible people advocate something called "religious tolerance." While religious tolerance is surely better than religious war, tolerance is not without its liabilities. Our fear of provoking religious hatred has rendered us incapable of criticizing ideas that are now patently absurd and increasingly maladaptive. It has also obliged us to lie to ourselves — repeatedly and at the highest levels — about the compatibility between religious faith and scientific rationality.

The conflict between religion and science is inherent and (very nearly) zero-sum. The success of science often comes at the expense of religious dogma; the maintenance of religious dogma always comes at the expense of science. It is time we conceded a basic fact of human discourse: either a person has good reasons for what he believes, or he does not. When a person has good reasons, his beliefs contribute to our growing understanding of the world. We need not distinguish between "hard" and "soft" science here, or between science and other evidence-based disciplines like history. There happen to be very good reasons to believe that the Japanese bombed Pearl Harbor on December 7th, 1941. Consequently, the idea that the Egyptians actually did it lacks credibility. Every sane human being recognizes that to rely merely upon "faith" to decide specific questions of historical fact would be both idiotic and grotesque — that is, until the conversation turns to the origin of books like the Bible and the Koran, to the resurrection of Jesus, to Muhammad's conversation with the angel Gabriel, or to any of the other hallowed travesties that still crowd the altar of human ignorance.

Science, in the broadest sense, includes all reasonable claims to knowledge about ourselves and the world. If there were good reasons to believe that Jesus was born of a virgin, or that Muhammad flew to heaven on a winged horse, these beliefs would necessarily form part of our rational description of the universe. Faith is nothing more than the license that religious people give one another to believe such propositions when reasons fail. The difference between science and religion is the difference between a willingness to dispassionately consider new evidence and new arguments, and a passionate unwillingness to do so. The distinction could not be more obvious, or more consequential, and yet it is everywhere elided, even in the ivory tower.

Religion is fast growing incompatible with the emergence of a global, civil society. Religious faith — faith that there is a God who cares what name he is called, that one of our books is infallible, that Jesus is coming back to earth to judge the living and the dead, that Muslim martyrs go straight to Paradise, etc. — is on the wrong side of an escalating war of ideas. The difference between science and religion is the difference between a genuine openness to the fruits of human inquiry in the 21st century, and a premature closure to such inquiry as a matter of principle. I believe that the antagonism between reason and faith will only grow more pervasive and intractable in the coming years. Iron Age beliefs — about God, the soul, sin, free will, etc. — continue to impede medical research and distort public policy. The possibility that we could elect a U.S. President who takes biblical prophecy seriously is real and terrifying; the likelihood that we will one day confront Islamists armed with nuclear or biological weapons is also terrifying, and growing more probable by the day. We are doing very little, at the level of our intellectual discourse, to prevent such possibilities.

In the spirit of religious tolerance, most scientists are keeping silent when they should be blasting the hideous fantasies of a prior age with all the facts at their disposal.

To win this war of ideas, scientists and other rational people will need to find new ways of talking about ethics and spiritual experience. The distinction between science and religion is not a matter of excluding our ethical intuitions and non-ordinary states of consciousness from our conversation about the world; it is a matter of our being rigorous about what is reasonable to conclude on their basis. We must find ways of meeting our emotional needs that do not require the abject embrace of the preposterous. We must learn to invoke the power of ritual and to mark those transitions in every human life that demand profundity — birth, marriage, death, etc. — without lying to ourselves about the nature of reality.

I am hopeful that the necessary transformation in our thinking will come about as our scientific understanding of ourselves matures. When we find reliable ways to make human beings more loving, less fearful, and genuinely enraptured by the fact of our appearance in the cosmos, we will have no need for divisive religious myths. Only then will the practice of raising our children to believe that they are Christian, Jewish, Muslim, or Hindu be broadly recognized as the ludicrous obscenity that it is. And only then will we stand a chance of healing the deepest and most dangerous fractures in our world.


PAUL STEINHARDT
Albert Einstein Professor of Science, Princeton University


It's a matter of time


For decades, the commonly held view among scientists has been that space and time first emerged about fourteen billion years ago in a big bang. According to this picture, the cosmos transformed from a nearly uniform gas of elementary particles to its current complex hierarchy of structure, ranging from quarks to galaxy superclusters, through an evolutionary process governed by simple, universal physical laws. In the past few years, though, confidence in this point of view has been shaken as physicists have discovered finely tuned features of our universe that seem to defy natural explanation.

The prime culprit is the cosmological constant, which astronomers have measured to be exponentially smaller than naïve estimates would predict. On the one hand, it is crucial that the cosmological constant be so small or else it would cause space to expand so rapidly that galaxies and stars would never form. On the other hand, no theoretical mechanism has been found within the standard Big Bang picture that would explain the tiny value.

Desperation has led to a "dangerous" idea: perhaps we live in an anthropically selected universe. According to this view, we live in a multiverse (a multitude of universes) in which the cosmological constant varies randomly from one universe to the next. In most universes, the value is incompatible with the formation of galaxies, planets, and stars. The reason why our cosmological constant has the value it does is that it is one of the rare examples in which the value happens to lie in the narrow range compatible with life.

This is the ultimate example of "unintelligent design": the multiverse tries every possibility with reckless abandon and only very rarely gets things "right;" that is, consistent with everything we actually observe. It suggests that the creation of unimaginably enormous volumes of uninhabitable space is essential to obtain a few rare habitable spaces.

I consider this approach to be extremely dangerous for two reasons. First, it relies on complex assumptions about physical conditions far beyond the range of conceivable observation, so it is not scientifically verifiable. Second, I think it leads inevitably to a depressing end to science. What is the point of exploring further the randomly chosen physical properties in our tiny corner of the multiverse if most of the multiverse is so different? I think it is far too early to be so desperate. This is a dangerous idea that I am simply unwilling to contemplate.

My own "dangerous" idea is more optimistic but precarious because it bucks the current trends in cosmological thinking. I believe that the finely tuned features may be naturally explained by supposing that our universe is much older than we have imagined. With more time, a new possibility emerges. The cosmological "constant" may not be constant after all. Perhaps it is varying so slowly that it only appears to be constant. Originally it had the much larger value that we would naturally estimate, but the universe is so old that its value has had a chance to relax to the tiny value measured today. Furthermore, in several concrete examples, one finds that the evolution of the cosmological constant slows down as its value approaches zero, so most of the history of the universe transpires when its value is tiny, just as we find today.

This idea that the cosmological constant is decreasing has been considered in the past. In fact, physically plausible slow-relaxation mechanisms have been identified. But the timing was thought to be impossible. If the cosmological constant decreases very slowly, it causes the expansion rate to accelerate too early and galaxies never form. If it decreases too quickly, the expansion rate never accelerates, which is inconsistent with recent observations. As long as the cosmological constant has only 14 billion years to evolve, there is no feasible solution.

But, recently, some cosmologists have been exploring the possibility that the universe is exponentially older. In this picture, the evolution of the universe is cyclic. The Big Bang is not the beginning of space and time but, rather, a sudden creation of hot matter and radiation that marks the transition from one period of expansion and cooling to the next cycle of evolution. Each cycle might last a trillion years, say. Fourteen billion years marks the time since the last infusion of matter and radiation, but this is brief compared to the total age of the universe. The number of cycles in the past may have been ten to the googol power or more!

Then, using the slow relaxation mechanisms considered previously, it becomes possible that the cosmological constant decreases steadily from one cycle to the next. Since the number of cycles is likely to be enormous, there is enough time for the cosmological constant to shrink by an exponential factor, even though the decrease over the course of any one cycle is too small to be detected. Because the evolution slows down as the cosmological constant decreases, this is the period when most of the cycles take place. There is no multiverse and there is nothing special about our region of space — we live in a typical region at a typical time.
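To see why a relaxation that slows near zero concentrates history at tiny values, here is a toy numerical sketch. The quadratic relaxation law, the starting value, and the cycle count below are illustrative assumptions only, not the actual mechanisms cosmologists study:

```python
# Toy model only: a "constant" that shrinks a little each cycle, with steps
# that get smaller as the value approaches zero (assumed law, for illustration).
lam = 1.0                      # hypothetical cosmological "constant", arbitrary units
history = []
for cycle in range(10_000):    # stand-in for an enormous number of cycles
    history.append(lam)
    lam -= 0.01 * lam ** 2     # relaxation step slows as lam approaches zero

tiny = sum(1 for value in history if value < 0.05 * history[0])
print(f"cycles spent below 5% of the starting value: {tiny / len(history):.0%}")
# prints roughly 81%: most of the history unfolds after the value is already tiny
```

With vastly more cycles and a far gentler per-cycle decrease, the same qualitative behavior holds: almost all cycles occur once the constant has become small, which is why a typical observer would measure a tiny value.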

Remarkably, this idea is scientifically testable. The picture makes explicit predictions about the distribution of primordial gravitational waves and variations in temperature and density. Also, if the cosmological constant is evolving at the slow rate suggested, then ongoing attempts to detect a temporal variation should find no change. So, we may enjoy speculating now about which dangerous ideas we prefer, but ultimately it is Nature that will decide if any of them is right. It is just a matter of time.


NEIL GERSHENFELD
Physicist; Director, Center for Bits and Atoms, MIT; Author, Fab


Democratizing access to the means of invention

The elite temples of research (of the kind I've happily spent my career in) may be becoming intellectual dinosaurs as a result of the digitization and personalization of fabrication.

Today, with about $20k in equipment it's possible to make and measure things from microns and microseconds on up, and that boundary is quickly receding. When I came to MIT that was hard to do. If it's no longer necessary to go to MIT for its facilities, then surely the intellectual community is its real resource? But my colleagues (and I) are always either traveling or over-scheduled; the best way for us to see each other is to go somewhere else. Like many people, my closest collaborators are in fact distributed around the world.

The ultimate consequence of the digitization of first communications, then computation, and now fabrication, is to democratize access to the means of invention. The third world can skip over the first and second cultures and go right to developing a third culture. Rather than today's model of researchers researching for researchees, the result of all that discovery has been to enable a planet of creators rather than consumers.


W. DANIEL HILLIS
Physicist, Computer Scientist; Chairman, Applied Minds, Inc.; Author, The Pattern on the Stone


The idea that we should all share our most dangerous ideas

I don't share my most dangerous ideas. Ideas are the most powerful forces that we can unleash upon the world, and they should not be let loose without careful consideration of their consequences. Some ideas are dangerous because they are false, like an idea that one race of humans is more worthy than another, or that one religion has a monopoly on the truth. False ideas like these spread like wildfire, and have caused immeasurable harm. They still do. Such false ideas should obviously not be spread or encouraged, but there are also plenty of true ideas that should not be spread: ideas about how to cause terror and pain and chaos, ideas of how to better convince people of things that are not true.

I have often seen otherwise thoughtful people so caught up in such an idea that they seem unable to resist sharing it. To me, the idea that we should all share our most dangerous ideas is, itself, a very dangerous idea. I just hope that it never catches on.


JARON LANIER
Computer Scientist and Musician

Homuncular Flexibility

The homunculus is an approximate mapping of the human body in the cortex. It is often visualized as a distorted human body stretched along the top of the human brain. The tongue, thumbs, and other body parts with extra-rich brain connections are enlarged in the homunculus, giving it a vaguely obscene, impish character.

Long ago, in the 1980s, my colleagues and I at VPL Research built virtual worlds in which more than one person at a time could be present. People in a shared virtual world must be able to see each other, as well as use their bodies together, as when two people lift a large virtual object or ride a tandem virtual bicycle. None of this would be possible without virtual bodies.

It was a self-evident and inviting challenge to attempt to create the most accurate possible bodies, given the crude state of the technology at the time. To do this, we developed full body suits covered in sensors. A measurement made on the body of someone wearing one of these suits, such as an aspect of the flex of a wrist, would be applied to control a corresponding change in a virtual body. Before long, people were dancing and otherwise goofing around in virtual reality.

Of course there were bugs. I distinctly remember a wonderful bug that caused my hand to become enormous, like a web of flying skyscrapers. As is often the case, this accident led to an interesting discovery.

It turned out that people could quickly learn to inhabit strange and different bodies and still interact with the virtual world. I became curious how weird the body could get before the mind would become disoriented. I played around with elongated limb segments, and strange limb placement. The most curious experiment involved a virtual lobster (which was lovingly modeled by Ann Lasko.) A lobster has a trio of little midriff arms on each side of its body. If physical human bodies sprouted corresponding limbs, we would have measured them with an appropriate body suit and that would have been that.

I assume it will not come as a surprise to the reader that the human body does not include these little arms, so the question arose of how to control them. The answer was to extract a little influence from each of many parts of the physical body and merge these data streams into a single control signal for a given joint in the extra lobster limbs. A touch of human elbow twist, a dash of human knee flex; a dozen such movements might be mixed to control the middle joint of little left limb #3. The result was that the principal elbows and knees could still control their virtual counterparts roughly as before, while still contributing to the control of additional limbs.
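A minimal sketch of this kind of mixing, assuming a simple weighted blend; the joint names and weights below are invented for illustration and do not reproduce the original VPL code:

```python
# Illustrative only: drive one joint of an extra virtual limb from a weighted
# blend of measured human joint angles (a touch of elbow, a dash of knee).

def extra_joint_angle(frame, mix):
    """Return the virtual joint angle as a weighted sum of physical joint angles."""
    return sum(weight * frame[joint] for joint, weight in mix.items())

# one frame of (made-up) body-suit measurements, in degrees
frame = {"left_elbow_flex": 42.0, "right_knee_flex": 10.0, "left_wrist_twist": -15.0}

# assumed mixing weights for the middle joint of "little left limb #3"
mix = {"left_elbow_flex": 0.15, "right_knee_flex": 0.10, "left_wrist_twist": 0.05}

print(extra_joint_angle(frame, mix))   # angle sent to the extra virtual limb
```

Because the weights are small, each physical joint still drives its own virtual counterpart roughly as before while contributing a little to the extra limb.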

Yes, it turns out people can learn to control bodies with extra limbs!

The biologist Jim Bower, when considering this phenomenon, commented that the human nervous system evolved through all the creatures that preceded us in our long evolutionary line, which included some pretty strange creatures, if you go back far enough. Why wouldn't we retain some homuncular flexibility with a pedigree like that?

The original experiments of the 1980s were not carried out formally, but recently it has become possible to explore the phenomenon in a far more rigorous way. Jeremy Bailenson at Stanford has created a marvelous new lab for studying multiple human subjects in high-definition shared virtual worlds, and we are now planning to repeat, improve, and extend these experiments. The most interesting questions still concern the limits to homuncular flexibility. We are only beginning the project of mapping how far it can go.

Why is homuncular flexibility a dangerous idea? Because the more flexible the human brain turns out to be when it comes to adapting to weirdness, the weirder a ride it will be able to keep up with as technology changes in the coming decades and centuries.

Will kids in the future grow up with the experience of living in four spatial dimensions as well as three? That would be a world with a fun elementary school math curriculum! If you're most interested in raw accumulation of technological power, then you might not find this so interesting, but if you think in terms of how human experience can change, then this is the most fascinating stuff there is.

Homuncular flexibility isn't the only source of hints about how weird human experience might get in the future. There are also questions related to language, memory, and other aspects of cognition, as well as hypothetical prospects for engineering changes in the brain. But in this one area, there's an indication of high weirdness to come, and I find that prospect dangerous, but in a beautiful and seductive way. "Thrilling" might be a better word.


GARY MARCUS
Psychologist, New York University; Author, The Birth of the Mind

Minds, genes, and machines

Brains exist primarily to do two things: to communicate (transfer information) and to compute. This is true in every creature with a nervous system, and no less true in the human brain. In short, the brain is a machine. And the basic structure of that brain, the biological substrate of all things mental, is guided in no small part by information carried in the DNA.

In the twenty-first century, these claims should no longer be controversial. With each passing day, techniques like magnetic resonance imaging and electrophysiological recordings from individual neurons make it clearer that the business of the brain is information processing, while new fields like comparative genomics and developmental neuroembryology remove any possible doubt that genes significantly influence both behavior and brain.

Yet there are many people, scientists and lay persons alike, who fear or wish to deny these notions, to doubt or even reject the idea that the mind is a machine, and that it is significantly (though of course not exclusively) shaped by genes. Even as the religious right prays for Intelligent Design, the academic left insinuates that merely discussing the idea of innateness is dangerous, as in a prominent child development manifesto that concluded:

If scientists use words like "instinct" and "innateness" in reference to human abilities, then we have a moral responsibility to be very clear and explicit about what we mean. If our careless, underspecified choice of words inadvertently does damage to future generations of children, we cannot turn with innocent outrage to the judge and say "But your Honor, I didn't realize the word was loaded."

A new academic journal called "Metascience" focuses on when extra-scientific considerations influence the process of science. Sadly, the twin questions of whether we are machines, and whether we are constrained significantly by our biology, very much fall into this category, questions where members of the academy (not to mention fans of Intelligent Design) close their minds.

Copernicus put us in our place, so to speak, by showing that our planet is not at the center of the universe; advances in biology are putting us further in our place by showing that our brains are as much a product of biology as any other part of our body, and by showing that our (human) brains are built by the very same processes as other creatures. Just as the earth is just one planet among many, from the perspective of the toolkit of developmental biology, our brain is just one more arrangement of molecules.


DIANE F. HALPERN
Professor of Psychology, Claremont McKenna College; Past-president (2005), the American Psychological Association; Author, Thought and Knowledge

Choosing the sex of one's child

For an idea to be truly dangerous, it needs to have a strong and near universal appeal. The idea of being able to choose the sex of one's own baby is just such an idea.

Anyone who has a deep-seated and profound preference for a son or daughter knows that this preference may not be rational and that it may represent a prejudice about themselves better left unacknowledged. It is easy to dismiss the ability to decide the sex of one's baby as inconsequential. It is already medically feasible for a woman or couple to choose the sex of a baby that has not yet been conceived. There are a variety of safe methods available, such as Preimplantation Genetic Diagnosis (PGD), which was originally designed for couples with fertility problems, not for the purpose of selecting the sex of one's next child. With PGD, embryos are created in a Petri dish, tested for gender, and then implanted into the womb, so that the baby-to-be is already identified as female or male before implantation. The pro argument is simple: If the parents-to-be are adults, why not? People have always wanted to be able to choose the sex of their children. There are ancient records of medicine men and wizened women with various herbs and assorted advice about what to do to (usually) have a son. So, what should it matter if modern medicine can finally deliver what old wives' tales have promised for countless generations? Couples won't have to have a "wasted" child, such as a second child the same sex as the first one, when they really wanted "one of each." If a society has too many boys for a while, who cares? The shortage of females will make females more valuable and the market economy will even out in time. In the meantime, families will "balance out," each one the ideal composition as desired by the adults in the family.

Every year for the last two decades I have asked students in my college classes to write down the number of children they would like to have and the order in which they ideally want to have girls and boys. I have taught in several different countries (e.g., Turkey, Russia, and Mexico) and types of universities, but despite large differences, the modal response is 2 children, first a boy, then a girl. If students reply that they want one child, it is most often a boy; if it is 3 children, they are most likely to want a boy, then a girl, then a boy. The students in my classes are not a random sample of the population: they are well educated and more likely to hold egalitarian attitudes than the general population. Yet, if they acted on their stated intentions, even they would have an excess of first-borns who are male, and an excess of males overall. In a short time, those personality characteristics associated with being either an only-child or first-born and those associated with being male would be so confounded, it would be difficult to separate them.            
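As a rough illustration of why acting on such preferences skews the population, here is a small arithmetic sketch; the family-size shares are invented for illustration, and only the preferred birth orders (boy-girl, boy alone, boy-girl-boy) come from the survey responses described above:

```python
# Hypothetical illustration of the point above: the shares below are invented;
# only the preferred birth orders come from the stated survey responses.
preferences = {                  # preferred birth order -> assumed share of respondents
    ("boy", "girl"): 0.70,
    ("boy",): 0.15,
    ("boy", "girl", "boy"): 0.15,
}

boys = sum(share * order.count("boy") for order, share in preferences.items())
children = sum(share * len(order) for order, share in preferences.items())
firstborn_boys = sum(share for order, share in preferences.items() if order[0] == "boy")

print(f"boys among all children: {boys / children:.1%}")   # more than half: excess of males
print(f"firstborns who are boys: {firstborn_boys:.0%}")    # every firstborn is a boy here
```

Under any mix of these stated preferences, every firstborn is a boy and boys outnumber girls overall, which is exactly the confounding of birth order and sex described above.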

The excess of males that would result from allowing every mother or couple to choose the sex of their next baby would not correct itself at the societal level because at the individual level, the preference for sons is stronger than the market forces of supply and demand. The evidence for this conclusion comes from many sources, including regions of the world where the ratio of young women to men is so low that it could only be caused by selective abortion and female infanticide (UNICEF and other sources). In some regions of rural China there are so few women that wives are imported from the Philippines and men move to far cities to find women to marry. In response, the Chinese government is now offering a variety of education and cash incentives to families with multiple daughters. There are still few daughters being born in these rural areas where prejudice against girls is stronger than government incentives and mandates. In India, the number of abortions of female fetuses has increased since sex-selective abortion was made illegal in 1994. The desire for sons is even stronger than the threat of legal action.            

In the United States, the data that show preferences for sons are more subtle than the disparate ratios of females and males found in other parts of the world, but the preference for sons is still strong. Because of space limitations, I list only a few of the many indicators that parents in the United States prefer sons: families with 2 daughters are more likely to have a third child than families with 2 sons; unmarried pregnant women who undergo ultrasound to determine the sex of the yet unborn child are less likely to be married at the time of the child's birth when the child is a girl than when it is a boy; and divorced women with a son are more likely to remarry than divorced women with a daughter.

Perhaps the only ideas more dangerous than that of choosing the sex of one's child would be trying to stop medical science from making advances that allow such choices or allowing the government to control the choices we can make as citizens. There are many important questions to ponder, including how to find creative ways to reduce or avoid negative consequences from even more dangerous alternatives. Consider, for example, what would our world be like if there were substantially more men than women? What if only the rich or only those who live in "rich countries" were able to choose the sex of their children? Is it likely that an approximately equal number of boys and girls would be or could be selected? If not, could a society or should a society make equal numbers of girls and boys a goal?

I am guessing that many readers of child-bearing age want to choose the sex of their (as yet) unconceived children and can reason that there is no harm in this practice. And, if you could also choose intelligence, height, and hair color, would you add that too?  But then, there are few things in life that are as appealing as the possibility of a perfectly balanced family, which according to the modal response means an older son and younger daughter, looking just like an improved version of you.


THOMAS METZINGER
Frankfurt Institute for Advanced Studies; Johannes Gutenberg-Universität Mainz; President, German Cognitive Science Society; Author, Being No One

The Forbidden Fruit Intuition

We all would like to believe that, ultimately, intellectual honesty is not only an expression of, but also good for, your mental health. My dangerous question is whether one can be intellectually honest about the issue of free will and preserve one's mental health at the same time. Behind this question lies what I call the "Forbidden Fruit Intuition": Is there a set of questions which are dangerous not on grounds of ideology or political correctness, but because the most obvious answers to them could ultimately make our conscious self-models disintegrate? Can one really believe in determinism without going insane?

For middle-sized objects at 37° C like the human brain and the human body, determinism is obviously true. The next state of the physical universe is always determined by the previous state. And given a certain brain-state plus an environment you could never have acted otherwise — a surprisingly large majority of experts in the free-will debate today accept this obvious fact. Although your future is open, this probably also means that for every single future thought you will have and for every single decision you will make, it is true that it was determined by your previous brain state.

As a scientifically well-informed person you believe in this theory, you endorse it. As an open-minded person you find that you are also interested in modern philosophy of mind, and you might hear a story much like the following one. Yes, you are a physically determined system. But this is not a big problem, because, under certain conditions, we may still continue to say that you are "free": all that matters is that your actions are caused by the right kinds of brain processes and that they originate in you. A physically determined system can well be sensitive to reasons and to rational arguments, to moral considerations, to questions of value and ethics, as long as all of this is appropriately wired into its brain. You can be rational, and you can be moral, as long as your brain is physically determined in the right way. You like this basic idea: physical determinism is compatible with being a free agent. You endorse a materialist philosophy of freedom as well. An intellectually honest person open to empirical data, you simply believe that something along these lines must be true.

Now you try to feel that it is true. You try to consciously experience the fact that at any given moment of your life, you could not have acted otherwise. You try to experience the fact that even your thoughts, however rational and moral, are predetermined — by something unconscious, by something you can not see. And in doing so, you start fooling around with the conscious self-model Mother Nature evolved for you with so much care and precision over millions of years: You are scratching at the user-surface of your own brain, tweaking the mouse-pointer, introspectively trying to penetrate into the operating system, attempting to make the invisible visible. You are challenging the integrity of your phenomenal self by trying to integrate your new beliefs, the neuroscientific image of man, with your most intimate, inner way of experiencing yourself. How does it feel?

I think that the irritation and deep sense of resentment surrounding public debates on the freedom of the will actually has nothing much to do with the actual options on the table. It has to do with the — perfectly sensible — intuition that our presently obvious answer will not only be emotionally disturbing, but ultimately impossible to integrate into our conscious self-models.

Or our societies: The robust conscious experience of free will also is a social institution, because the attribution of accountability, responsibility, etc. are the decisive building blocks for modern, open societies. And the currently obvious answer might be interpreted by many as having clearly anti-democratic implications: Making a complex society work implies controlling the behavior of millions of people; if individual human beings can control their own behavior to a much lesser degree than we have thought in the past, if bottom-up doesn't work, then it becomes tempting to control it top-down, by the state. And this is the second way in which enlightenment could devour its own children. Yes, free will truly is a dangerous question, but for different reasons than most people think.


LYNN MARGULIS
Biologist, University of Massachusetts, Amherst; Coauthor (with Dorion Sagan), Acquiring Genomes: A Theory of the Origins of Species


Bacteria are us

What is my dangerous idea? Although arcane, evidence for this dangerous concept is overwhelming; I have collected clues from many sources. Reminiscent of Oscar Wilde's claim that "even true things can be proved," I predict that the scientific gatekeepers in academia eventually will be forced to permit this dangerous idea to become widely accepted. What is it?

Our sensibilities, our perceptions that register through our sense organ cells, evolved directly from our bacterial ancestors. Signals in the environment: light impinging on the eye's retina, taste on the buds of the tongue, odor through the nose, sound in the ear, are translated to nervous impulses by extensions of sensory cells called cilia. We, like all other mammals, including our apish brothers, have taste-bud cilia, inner ear cilia, and nasal passage cilia that detect odors. We distinguish savory from sweet, birdsong from whalesong, drumbeats from thunder. With our eyes closed, we detect the light of the rising sun and feel the vibrations of the drums. These abilities to sense our surroundings, a heritage that preceded the evolution of all primates, indeed, all animals, by use of specialized cilia at the tips of sensory cells, and the existence of the cilia in the tails of sperm, come from one kind of our bacterial ancestors. Which? Those of our bacterial ancestors that became cilia. We owe our sensitivity to a loving touch, the scent of lavender, the taste of a salted nut or vinaigrette, a police-cruiser siren, or a glimpse of brilliant starlight to our sensory cells. We owe the chemical attraction of the sperm as its tail impels it to swim toward the egg, even the moss plant sperm, to its cilia. The dangerous idea is that the cilia evolved from hyperactive bacteria. Bacterial ancestors swam toward food and away from noxious gases; they moved up to the well-lit waters at the surface of the pond. They were startled when, in a crowd, some relative bumped them. These bacterial ancestors, which never slept, avoided water too hot or too salty. They still do.

Why is the concept that our sensitivities evolved directly from swimming bacterial ancestors of the sensory cilia so dangerous?

Several reasons: we would be forced to admit that bacteria are conscious, that they are sensitive to stimuli in their environment and behave accordingly. We would have to accept that bacteria, touted to be our enemies, are not merely neutral or friendly but that they are us. They are direct ancestors of our most sensitive body parts. Our culture's terminology about bacteria is that of warfare: they are germs to be destroyed and forever vanquished; bacterial enemies make toxins that poison us. We load our soaps with antibacterials that kill on contact; stomach ulcers are now agreed to be caused by bacterial infection. Even if some admit the existence of "good" bacteria in soil or probiotic food like yogurt, few of us tolerate the dangerous notion that human sperm tails and the sensitive cells of nasal passages lined with waving cilia are former bacteria. If this dangerous idea becomes widespread, it follows that we humans must agree that even before our evolution as animals we have hated and tried to kill our own ancestors. Again, we have seen the enemy, indeed, and, as usual, it is us. Social interactions of sensitive bacteria, then, not God, made us who we are today.




John Brockman, Editor and Publisher
Russell Weinberger, Associate Publisher

contact: [email protected]
Copyright © 2006 by Edge Foundation, Inc. All Rights Reserved.
