THE WORLD QUESTION CENTER 2001

What Questions Have Disappeared?


Judy Harris

"Do genes influence human behavior?"

This question bit the dust after a brief but busy life; it is entirely a second-half-of-the-20th-century question. Had it been asked before the 20th century, it would have been phrased differently: "heredity" instead of "genes." But it wasn't asked back then, because the answer was obvious to everyone. Unfortunately, the answer everyone gave — yes! — was based on erroneous reasoning about ambiguous evidence: the difference in behavior between the pauper and the prince was attributed entirely to heredity. The fact that the two had been reared in very different circumstances, and hence had had very different experiences, was overlooked.

Around the middle of the 20th century, it became politically incorrect and academically unpopular to use the word "heredity"; if the topic came up at all, a euphemism, "nature," was used in its place. The fact that the pauper and the prince had been reared in very different circumstances now came to the fore, and the behavioral differences between them were now attributed entirely to the differences in their experiences. The observation that the prince had many of the same quirks as the king was now blamed entirely on his upbringing. Unfortunately, this answer, too, was based on erroneous reasoning about ambiguous evidence.

That children tend to resemble their biological parents is ambiguous evidence; the fact that such evidence is plentiful — agreeable parents tend to have agreeable kids, aggressive parents tend to have aggressive kids, and so on — does not make it any less ambiguous. The problem is that most kids are reared by their biological parents. The parents have provided both the genes and the home environment, so the kids' heredity and environment are correlated. The prince has inherited not only his father's genes but also his father's palace, his father's footmen, and his father's Lord High Executioner (no reference to living political figures is intended).

To disambiguate the evidence, special techniques are required — ways of teasing apart heredity and environment by controlling the one and varying the other. Such techniques didn't begin to be widely used until the 1970s; their results didn't become widely known and widely accepted until the 1990s. By then so much evidence had piled up that the conclusion (which should have been obvious all along) was incontrovertible: yes, genes do influence human behavior, and so do the experiences children have while growing up.

(I should point out, in response to David Deutsch's contribution to the World Question Center, that no one study, and no one method, can provide an answer to a question of this sort. In the case of genetic influences on behavior, we have converging evidence — studies using a variety of methods all led to the same conclusion and even agreed pretty well on the quantitative details.)

Though the question has been answered, it has left behind a cloud of confusion that might not disappear for some time. The biases of the second half of the 20th century persist: when "dysfunctional" parents are found to have dysfunctional kids, the tendency is still to blame the environment provided by the parents and to overlook the fact that the parents also provided the genes.

Some would argue that this bias makes sense. After all, they say, we know how the environment influences behavior. How the genes influence behavior is still a mystery — a question for the 21st century to solve. But they are wrong. They know much less than they think they know about how the environment influences behavior.

The 21st century has two important questions to answer. How do genes influence human behavior? How is human behavior influenced by the experiences a child has while growing up?

JUDITH RICH HARRIS is a writer and developmental psychologist; co-author of The Child: A Contemporary View Of Development; winner of the 1997 George A. Miller Award for an outstanding article in general psychology, and author of The Nurture Assumption: Why Children Turn Out The Way They Do.


Philip W. Anderson

"A question no longer: what is the Theory of Every Thing?"

My colleagues in the fashionable fields of string theory and quantum gravity advertise themselves as searching desperately for the "Theory of Everything", while their experimental colleagues are gravid with the "God Particle", the marvelous Higgson, which is the somewhat misattributed source of all mass. (They are also after an understanding of the earliest few microseconds of the Big Bang.) As Bill Clinton might remark, it depends on what the meaning of "everything" is. To these savants, "everything" means a list of some two dozen numbers which are the parameters of the Standard Model. This is a set of equations which already exists and does describe very well what you and I would be willing to settle for as "everything". This is why, following Bob Laughlin, I make the distinction between "everything" and "every thing". Every thing that you and I have encountered in our real lives, or are likely to interact with in the future, is no longer outside the realm of a physics which is transparent to us: relativity, special and general; electromagnetism; the quantum theory of ordinary, usually condensed, matter; and, for a few remote phenomena, hopefully rare here on earth, our almost equally cut-and-dried understanding of nuclear physics. [Two parenthetic remarks: 1) I don't mention statistical mechanics only because it is a powerful technique, not a body of facts; 2) our colleagues have done only a sloppy job so far of deriving nuclear physics from the Standard Model, but no one really doubts that they can.]

I am not arguing that the search for the meaning of those two dozen parameters isn't exciting, interesting, and worthwhile: yes, it's not boring to wonder why the electron is so much lighter than the proton, or why the proton is stable for at least another 35 powers of ten years, or whether quintessence exists. But learning why can have no real effect on our lives, spiritually inspiring as it would indeed be, even to a hardened old atheist like myself.

When I was learning physics, half a century ago, the motivation for much of what was being done was still "is quantum theory really right?" Not just QED, though the solution of that was important, but there were still great mysteries in the behavior of ordinary matter — like superconductivity, for instance. It was only some twenty years later that I woke up to the fact that the battle had been won, probably long before, and that my motivation was no longer to test the underlying equations and ideas, but to understand what is going on. Within the same few years, the molecular biology pioneers convinced us we needed no mysterious "life force" to bring all of life under the same umbrella. Revolutions in geology, in astrophysics, and the remarkable success of the Standard Model in sorting out the fundamental forces and fields, leave us in the enviable position I described above: given any problematic phenomenon, we know where to start, at least. And nothing uncovered in string theory or quantum gravity will make any difference to that starting point.

Is this Horgan's End of Science? Absolutely not. It's just that the most exciting frontier of science no longer lies at the somewhat sophomoric — or quasi-religious — level of the most "fundamental" questions of "what are we made of?" and the like; what needs to be asked is "how did all this delightful complexity arise from the stark simplicity of the fundamental theory?" We have the theory of every thing in any field of science you care to name, and that's about as far as it gets us. If you like, science is now almost universally at the "software" level; the fundamental physicists have given us all the hardware we need, but that doesn't solve the problem, in physics as in every other field. It's a different game, probably a much harder one in fact, as it has often been in the past; but the game is only begun.

PHILIP W. ANDERSON is a Nobel laureate physicist at Princeton and one of the leading theorists on superconductivity. He is the author of A Career in Theoretical Physics, and Economy as a Complex Evolving System.


Dan Sperber

"Are women and men equal?"


No doubt, there are differences between women and men, some obvious and others more contentious. But arguments for inequality of worth or rights between the sexes have wholly lost intellectual respectability. Why? Because they were grounded in biologically evolved dispositions and culturally transmitted prejudices that, however strongly entrenched, could not withstand the kind of rational scrutiny to which they have been submitted in the past two centuries. Also because, more recently, the Feminist movement has given so many of us the motivation and the means to look into ourselves and recognize and fight lingering biases. Still, the battle against sexism is not over — and it may never be.

DAN SPERBER is a social and cognitive scientist at the French Centre National de la Recherche Scientifique (CNRS) in Paris. His books include Rethinking Symbolism, On Anthropological Knowledge, Explaining Culture: A Naturalistic Approach, and, with Deirdre Wilson, Relevance: Communication and Cognition.


Michael Shermer

"Can science answer moral and ethical questions?"

From the time of the Enlightenment, philosophers have speculated that the remarkable advances of science would one day spill over into the realm of moral philosophy, and that scientists would be able to discover answers to previously insoluble moral dilemmas and ethical conundrums. One of the reasons Ed Wilson's book Consilience was so successful was that he attempted to revive this Enlightenment dream. Alas, we seem no closer than we were when Voltaire, Diderot, and company first encouraged scientists to go after moral and ethical questions. Are such matters truly insoluble and thus out of the realm of science (since, as Peter Medawar noted, "science is the art of the soluble")? Should we abandon Ed Wilson's Enlightenment dream of applying evolutionary biology to the moral realm? Most scientists agree that moral questions are scientifically insoluble, and they have abandoned the Enlightenment dream. But not all. We shall see.

MICHAEL SHERMER is the founding publisher of Skeptic magazine, the host of the acclaimed public science lecture series at Caltech, and a monthly columnist for Scientific American. His books include Why People Believe Weird Things, How We Believe, and Denying History.


Lee Smolin

"What is the next step in the evolution of democracy?"

A question no longer being asked is how to make the next step in the evolution of a democratic society. Until very recently it was widely understood that democracy was a project with many steps, whose goal was the eventual construction of a perfectly just and egalitarian society. But recently, with the well-deserved collapse of Marxism, it has begun to seem that the highest stage of civilization we humans can aspire to is global capitalism leavened by some version of a bureaucratic welfare state, all governed badly by an unwieldy and corrupt representative democracy. This is better than many of the alternatives, but it is hardly egalitarian and often unjust; those of us who care about these values must hope that human ingenuity is up to the task of inventing something still better.

It is proper that the nineteenth-century idea of utopia has finally been put to rest, for it was based on a paradox: any predetermined blueprint for an ideal society could only be imposed by force. It is now almost universally acknowledged that there is no workable alternative to the democratic ideal that governments get their authority by winning the consent of the governed. This means that if we are to change society, it must be by a process of evolution rather than revolution. But why should this mean that big changes are impossible? What is missing are new ideas, and a context in which to debate them.

There are at least four issues facing the future of the democratic project. First, while democracy in the world's most powerful country is perceived by many of its citizens as corrupted, there is little prospect for serious reform. The result is alienation so severe that around half of our citizens do not participate in politics. At what point, we may ask, will so few vote that the government of the United States ceases to have a valid claim to have won the consent of the governed? As the political and journalistic classes have largely lost the trust of the population, where will the leadership to begin the reform that is so obviously needed come from?

A second point of crisis and opportunity is in the newly democratized states. In many of these countries intellectuals played a major role in the recent establishment of democracy. These people are not likely to go to sleep and let the World Bank tell them what democracy is.

The third opportunity is in Europe, where a rather successful integration of capitalism and democratic socialism has been achieved. These societies suffer much less from poverty and the other social and economic ills that appear so unsolvable in the US context. (And it is not coincidental that the major means of funding political campaigns in the US are illegal in most of Europe.) Walking the streets in Denmark or Holland, it is possible to wonder what a democratic society that evolved beyond social democracy might look like. European integration may be only the first step toward a new kind of nation-state, one which will give up much of its sovereignty to multinational entities: a kind of nation-as-local-government.

Another challenge for democracy is the spread of the bureaucratic mode of organization, which in most countries has taken over the administration of education, science, health, and other vital areas of public interest. As anyone who works for a modern university or hospital can attest, bureaucratic organizations are inherently undemocratic. Debate amongst knowledgeable, responsible individuals is replaced by the management of perceptions and the manipulation of supposedly objective indices. As the politics of the academy begins to look more like nineteenth-century Russia than fifth-century BC Athens, we intellectuals need to do some serious work to invent more democratic modes of organization for ourselves and for others who work in the public interest.

Is it not then time we "third culture intellectuals" began to attack the problem of democracy, both in our workplaces and in our societies? Perhaps, with all of our independence, creativity, intelligence, and edginess, we may find we really have something of value to contribute.

LEE SMOLIN is a theoretical physicist; professor of physics and member of the Center for Gravitational Physics and Geometry at Pennsylvania State University; author of The Life of The Cosmos.


Rodney A. Brooks

"What is it that makes something alive?"

With the success of molecular biology in explaining the mechanisms of life, we have lost sight of the question one level up. We do not have any good answer, at the systems level, to what it takes for something to be alive. We can list general necessities for a system to be alive, but we cannot predict whether a given configuration of molecules will be alive or not. The clearest evidence that we really do not understand what it takes for something to be alive is that we have not been able to build machines that are alive.

Everything else that we understand leads to machines that capitalize on that understanding — machines that fly, machines that run, machines that calculate, machines that make polymers, machines that communicate, machines that listen, machines that play games. We have not built any machines that live.

RODNEY A. BROOKS is director of the MIT Artificial Intelligence Laboratory and Chairman of iRobot Corporation. He builds robots.


Roger Schank

"Why Teach Mathematics?"

Some questions are so rarely asked that we are astonished anyone would ask them at all. The entire world seems to agree that knowing mathematics is the key to something important; they just forget what. Benjamin Franklin asked this question in 1749 while thinking about what American schools should be like, and concluded that only practical mathematics should be taught. The famous mathematician G. H. Hardy asked this question (in A Mathematician's Apology) and concluded that while he loved the beauty of mathematics, there was no real point in teaching it to children.

Today, we worry about the Koreans and Lithuanians doing better than us on math tests, and every "education president" asserts that we will raise math scores, but no one asks why this matters. Vague utterances about how math teaches reasoning belie the fact that mathematicians do everyday reasoning no better than anyone else. To anyone who reads this and is still skeptical, I ask: what is the Quadratic Formula? You learned it in ninth grade; you couldn't graduate from high school without it. When was the last time you used it? What was the point of learning it?
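For any reader who takes up the challenge, here is the formula in question: it gives the two roots of the general quadratic equation $ax^2 + bx + c = 0$,

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}.$$

When you last needed it is, of course, exactly Schank's point.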

ROGER SCHANK is the Chairman and Chief Technology Officer of Cognitive Arts and has been the Director of the Institute for the Learning Sciences at Northwestern University since its founding in 1989. One of the world's leading Artificial Intelligence researchers, he is the author of Dynamic Memory: A Theory of Learning in Computers and People, Tell Me a Story: A New Look at Real and Artificial Memory, The Connoisseur's Guide to the Mind, and Engines for Education.


John Horgan

"Is enlightenment a myth or a reality?"

Is enlightenment a myth or a reality? I mean the enlightenment of the East, not the West: the state of supreme mystical awareness also known as nirvana, satori, cosmic consciousness, awakening. Enlightenment is the telos of the great Eastern religions, Buddhism and Hinduism, and it crops up occasionally in Western religions, too, although in a more marginal fashion. Enlightenment once preoccupied such prominent Western intellectuals as William James, Aldous Huxley, and Joseph Campbell, and there was a surge of scientific interest in mysticism in the 1960s and 1970s. Then mysticism became tainted by its association with the human potential and New Age movements and the psychedelic counterculture, and for the last few decades it has for the most part been banished from serious scientific and intellectual discourse. Recently a few scholars have written excellent books that examine mysticism in the light of modern psychology and neuroscience — Zen and the Brain by the neurologist James Austin; Mysticism, Mind, Consciousness by the philosopher Robert Forman; The Mystical Mind by the late psychiatrist Eugene d'Aquili and the radiologist Andrew Newberg — but their work has received scant attention in the scientific mainstream. My impression is that many scientists are privately fascinated by mysticism but fear being branded as fuzzy-headed by disclosing their interest. If more scientists revealed their interest in mystical consciousness, perhaps it could become a legitimate subject for investigation once again.

JOHN HORGAN is a freelance writer and author of The End of Science and The Undiscovered Mind. A senior writer at Scientific American from 1986 to 1997, he has also written for the New York Times, Washington Post, New Republic, Slate, London Times, Times Literary Supplement and other publications.


Robert Aunger

"Is the Central Dogma of biology inviolate?"

In 1957, a few years after he co-discovered the double helix, Francis Crick proposed a very famous hypothesis. It states that "once 'information' has passed into protein it cannot get out again. In more detail, the transfer of information from nucleic acid to nucleic acid, or from nucleic acid to protein may be possible, but transfer from protein to protein, or from protein to nucleic acid is impossible." After it had proved to be the foundation of molecular biology, Crick called this hypothesis the "Central Dogma" of biology.

In the last years of the last millennium, Crick's dogma fell. The reason? Direct protein-to-protein information transfer was found to be possible in a class of proteins called "prions." With the aid of a catalyst, prions (short for "proteinaceous infectious particles") cause other molecules of the same class to adopt an infectious shape like their own simply through contact. Thus, prions are an important and only recently discovered mechanism for the inheritance of information through means other than DNA. Such an important discovery merited a recent Nobel Prize for Stanley Prusiner, who doggedly pursued the possibility of a rogue biological entity replicating without the assistance of genes against a backdrop of resistance and disbelief among most of his colleagues. Further testimony to the significance of prions comes from the current BSE crisis in Europe. Now that we know how they work, prions — and the diseases they cause — may begin popping up all over the place.

ROBERT AUNGER is an anthropologist studying cultural evolution, both through the now much-maligned method of fieldwork in nonwestern societies and through the application of theory adapted from evolutionary biology. He is at the Department of Biological Anthropology at the University of Cambridge, and the editor of Darwinizing Culture: The Status of Memetics as a Science.


David G. Post

"... can there really be fossil sea-shells in the mountains of Kentucky, hundreds of miles from the Atlantic coast? "

This question about questions may be a useful way to differentiate "science" from "not-science"; questions really do disappear in the former in a way, or at least at a rate, that they don't in the latter.

A question that has disappeared: can there really be fossil sea-shells in the mountains of Kentucky, hundreds of miles from the Atlantic coast?

I came across this particular question recently when reading Thomas Jefferson's 'Notes on the State of Virginia'; he devotes several pages to speculation about whether the finds in Kentucky really were sea-shells, and, if so, how they could have ended up there. Geologists could, today, tell him.

"...from what source do governments get their legitimate power?"

Perhaps another question dear to Jefferson's heart has also disappeared: from what source do governments get their legitimate power? In 1780, this was a real question, concerning which reasonable people gave different answers: 'God,' or 'the divine right of Kings,' or 'heredity,' or 'the need to protect its citizens.' By declaring as 'self evident' the 'truth' that 'governments derive their just power from the consent of the governed,' Jefferson was trying to declare that this question had, in fact, disappeared. I think he may have been right.

DAVID POST is Professor of Law at Temple University, and Senior Fellow at The Tech Center at George Mason University, with an interest in questions of (and inter-connections between) Internet law, complexity theory, and the ideas of Thomas Jefferson.


David G. Myers

"Does money buy happiness?"

Three in four entering collegians today deem it "very important" or "essential" that they become "very well-off financially." Most adults believe "more money" would boost their quality of life. And today's "luxury fever" suggests that affluent Americans and Europeans are putting their money where their hearts are. "Whoever said money can't buy happiness isn't spending it right," proclaimed a Lexus ad. But the facts of life have revealed otherwise.

Although poverty and powerlessness often bode ill for body and spirit, wealth fails to elevate well-being. Surveys reveal that even lottery winners and the super rich soon adapt to their affluence. Moreover, those who strive most for wealth tend, ironically, to live with lower well-being than those focused on intimacy and communal bonds. And consider post-1960 American history: Average real income has doubled, so we own twice the cars per person, eat out two and a half times as often, and live and work in air conditioned spaces. Yet, paradoxically, we are a bit less likely to say we're "very happy." We are more often seriously depressed. And we are just now, thankfully, beginning to pull out of a serious social recession that was marked by doubled divorce, tripled teen suicide, quadrupled juvenile violence, quintupled prison population, and a sextupled proportion of babies born to unmarried parents. The bottom line: Economic growth has not improved psychological morale or communal health.

DAVID G. MYERS is a social psychologist at Hope College (Michigan) and author, most recently, of The American Paradox: Spiritual Hunger in an Age of Plenty and of A Quiet World: Living with Hearing Loss.


William H. Calvin

"Where did the moon go?"

When, every few years, you see a bite taken out of the sun or moon, you ought to remember just how frightening that question used to be. It became clockwork once science discovered the right viewpoint (imagining yourself high above the north pole, looking at the shadows cast by the earth and the moon). But there was an intermediate stage of empirical knowledge, when the shaman discovered that the sixth full moon after a prior eclipse had a two-thirds chance of being associated with another eclipse. And so when the shaman told people to pray hard the night before, he was soon seen as being on speaking terms with whoever ran the heavens. This helped convert part-time shamans into full-time priests, supported by the community. This can be seen as the entry-level job for philosophers and scientists, who prize the discoveries they can pass on to the next generation, allowing us to see farther, always opening up new questions while retiring old ones. It's like climbing a mountain that keeps providing an even better viewpoint.

WILLIAM H. CALVIN is a neurobiologist at the University of Washington, who writes about brains, evolution, and climate. His recent books are The Cerebral Code, How Brains Think, and (with the linguist Derek Bickerton) Lingua ex Machina.


Carl Zimmer

"When will disease be eradicated?"

By the middle of the twentieth century, scientists and doctors were sure that it was just a matter of time, and not much time at that, before most diseases would be wiped from the face of the Earth. Antibiotics would get rid of bacterial infections; vaccines would get rid of viruses; DDT would get rid of malaria. Now one drug after another is becoming useless against resistant parasites, and new plagues such as AIDS are sweeping through our species. Except for a handful of diseases like smallpox and Guinea worm, eradication now looks like a fantasy. There are three primary reasons this question is no longer asked. First, parasite evolution is far faster and more sophisticated than anyone previously appreciated. Second, scientists don't understand the complexities of the immune system well enough to design effective vaccines for many diseases yet. Third, the cures that have been discovered are often useless because the global public health system is a mess. The arrogant dream of eradication has been replaced by the much more modest goal of trying to keep diseases in check.

CARL ZIMMER is the author of Parasite Rex and writes a column about evolution for Natural History.


David Deutsch

"And why?"

"What Questions Have Disappeared...And Why?" Funny you should ask that. "And why? " could itself be the most important question that has disappeared from many fields.

"And why?": in other words, "what is the explanation for what we see happening?" "What is it in reality that brings about the outcome that we predict?" Whenever we fail to take that question seriously enough, we are blinded to gaps in our favoured explanation. And so, when we use that explanation to interpret regularities that we may observe, instead of understanding that the explanation was an assumption in our analysis, we regard it as the inescapable implication of our observations.

"I just can't feel myself split", complained Bryce DeWitt when he first encountered the many-universes interpretation of quantum theory. Then Hugh Everett convinced him that this was the same circular reasoning that Galileo rejected when he explained how the Earth can be in motion even though we observe it to be at rest. The point is, both theories are consistent with that observation. Thanks to Everett, DeWitt and others, the "and why" question began gradually to return to quantum theory, whence it had largely disappeared during the 1930s. I believe that its absence did great harm both in impeding progress and in encouraging all sorts of mystical fads and pseudo-science. But elsewhere, especially in the human philosophies (generally known as social sciences), it is still largely missing. Although behaviourism — the principled refusal to ask "and why?" — is no longer dominant as an explicit ideology, it is still widespread as a psychological attitude in the human philosophies.

Suppose you identified a gene G and a human behaviour B, and you undertook a study with 1000 randomly chosen people, and the result was that of the 500 people who had G in their genome, 499 did B, while of the 500 who lacked G, 499 failed to do B. You'd conclude, wouldn't you, that G is the predominant cause of B? Obviously there must be other mechanisms involved, but they have little influence on whether a person does B or not. You'd inform the press that all those once-trendy theories that tried to explain B through people's upbringing or culture, or attributed it to the exercise of free will or the logic of the situation or any combination of such factors — were just wrong. You've proved that when people choose to do B, they are at the very least responding to a powerful influence from their genes. And if someone points out that your results are perfectly consistent with B being 100% caused by something other than G (or any other gene), or with G exerting an influence in the direction of not doing B, you will shrug momentarily, and then forget that possibility. Won't you?
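A minimal sketch in Python makes the trap concrete (the counts are Deutsch's hypothetical figures; the choice of statistic is mine, purely for illustration): the measured association is nearly perfect, yet the number says nothing about the mechanism.

import math

# Deutsch's hypothetical study: 1000 people, split by gene G and behaviour B.
# Counts are taken directly from the text.
g_b, g_not_b = 499, 1        # have G: 499 did B, 1 did not
no_g_b, no_g_not_b = 1, 499  # lack G: 1 did B, 499 did not

# Phi coefficient: the Pearson correlation for a 2x2 contingency table.
numerator = g_b * no_g_not_b - g_not_b * no_g_b
denominator = math.sqrt((g_b + g_not_b) * (no_g_b + no_g_not_b)
                        * (g_b + no_g_b) * (g_not_b + no_g_not_b))
phi = numerator / denominator

print(f"phi = {phi:.3f}")  # 0.996: an almost perfect association
# The same table is equally consistent with G causing B, with G pushing
# against B, or with B being caused entirely by factors other than G.
# The statistic answers "what?" but never "and why?".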

DAVID DEUTSCH's research in quantum physics has been influential and highly acclaimed. His papers on quantum computation laid the foundations for that field, breaking new ground in the theory of computation as well as physics, and have triggered an explosion of research efforts worldwide. He is a member of the Centre for Quantum Computation at the Clarendon Laboratory, Oxford University and the author of The Fabric of Reality.


Timothy Taylor

"How can I stop the soul of the deceased reanimating the body?"

At a particular point (yet to be clearly defined) in human cultural evolution, a specific idea took hold that there were two, partially separable, elements present in a living creature: the material body and the force that animated it. On the death of the body the animating force would, naturally, desire the continuation of this-worldly action and struggle to reassert itself (just as one might strive to retrieve a flint axe one had accidentally dropped). If the soul (or spirit) succeeded, it would also seek to repossess its property, including its spouse, and reassert its material appetites.

The desire of the disembodied soul was viewed as dangerous by the living, who had by all means to enchant, cajole, fight off, sedate, or otherwise distract and disable it. This requirement to keep the soul from the body after death did not last forever, only so long as the flesh lay on the bones. For the progress of the body's decomposition was seen as analogous to the slow progress the soul made toward the threshold of the Otherworld. When the bones were white (or were sent up in smoke, or whatever the rite in that community was), then it was deemed that the person had finally left this life and was no longer a danger to the living. Thus it was that, for most of recent human history (roughly the last 35,000 years), funerary rites were twofold: the primary rites zoned off the freshly dead and instantiated the delicate ritual powers designed to keep the unquiet soul at bay; the secondary rites, occurring after weeks or months (or, sometimes — in the case of people who had wielded tremendous worldly power — years), firmly and finally incorporated the deceased into the realm of the ancestors.

Since the rise of science and scepticism, the idea of the danger of the disembodied soul has, for an increasing number of communities, simply evaporated. But there is a law of conservation of questions. "How can I stop the soul of the deceased reanimating the body?" is now being replaced with "How can I live so long that my life becomes indefinite?", a question previously asked only by the most arrogant pharaohs and emperors.

TIMOTHY TAYLOR lectures in the Department of Archaeological Sciences, University of Bradford, UK. He is the author of The Prehistory of Sex.


Dan McNeill

"Where is the Great American Novel?"

This question haunted serious writers in the early 20th century, when critics sought a product that measured up to the European standard. Now it is dead, and the underlying notion is in the ICU. What happened?

Well, the idea itself was never a very good one. It had breathtakingly hazy contours. It ignored the work of authors like Melville, Hawthorne, Wharton, and Twain. And it seemed to assume that a single novel could sum up this vast and complex nation. I'd like to think its disappearance reflects these problems.

But technology also helped shelve the question. As media proliferated, literature grew less central. If the Great American Novel appeared tomorrow, how many people would actually read it? My guess: Most would wait for the movie.

DANIEL McNEILL is the author of The Face, and principal author of the best-selling Fuzzy Logic, which won the Los Angeles Times Book Prize in Science and Technology, and was a New York Times "Notable Book of the Year".



John Brockman, Editor and Publisher

Copyright © 2001 by Edge Foundation, Inc.
All Rights Reserved.
