"What will life be like after the revolution?"
The disappearance of this question isn't only a trace of the deletion
of the left. It is also a measure of our loss of faith in secular
redemption. We don't look forward anymore to radical transformation.
Perhaps it's a result of a century of disappointments: from the
revolution of 1917 to Stalin and the fall of communism; from the
Spanish Civil War to Franco; from Mao's Long March to Deng's proclamation
that to get rich is glorious. Perhaps it's a result of political
history. But there was more that had to do with psychological
transformation. Remember Norman O. Brown's essay, "The place of
apocalypse in the life of the mind"? Remember R. D. Laing's turn
on breakdown as breakthrough? Remember the fascination with words
like 'metamorphosis' and 'metanoia'?
Maybe we're just getting older and all too used to being the people
we are. But I'd like to think we're getting wiser and less naive
about the possibility of shedding our pasts overnight.
It's important to distinguish between political liberalism on
the one hand and a faith in discontinuous transformation on the
other. If we fail to make this distinction, then forgetting about
the revolution turns (metanoically) into the familiar swing to
the right. Old radicals turn reactionary. If we're less dramatic
about our beliefs, if we're more cautious about distinguishing
between revolutionary politics and evolutionary psychology, then
we'll retain our faith in the dream that we can do better.
Just not overnight.
p.s. Part of the passion for paradigms and their shiftings may
derive from displaced revolutionary fervor. If you yearn for transfiguration,
but can't find it in religion or politics, then you'll seek it
elsewhere, like the history of science.
p.p.s. There is one place where talk of transformation
is alive and kicking, if not well: The executive suite. The business
press is full of books about corporate transformation, re-engineering
from a blank sheet of paper, reinvention from scratch. Yes, corporate
America is feeling the influence of the sixties as boomers reach
the board room. And this is not a bad thing. For, just as the
wisdom to distinguish between revolutionary politics and evolutionary
psychology can help us keep the faith in marginal improvements
in the human condition, so the tension between greying warriors
for change and youthful stalwarts of the status quo will keep
us from lurching left or right.
JAMES OGILVY is co-founder and managing director of Global Business
Network; taught philosophy at Yale and Williams; served as director
of research for the Values and Lifestyles Program at SRI International;
author of Many Dimensional Man and Living Without a Goal.
Did Fermat's question, "is it true that there are no integers
x, y, z and n, all greater than 2, such that x^n + y^n = z^n?",
F? for short, raised in the 17th century, disappear when Andrew
Wiles answered it affirmatively by a proof of Fermat's theorem
F in 1995?
The answer is no.
The question F? can be explained to every child, but the proof
of F is extremely sophisticated, requiring techniques and results
far beyond the reach of elementary arithmetic, thus raising the
quest for conceptually simpler proofs. What is going on here?
Why do such elementary theorems require such intricate machinery
for their proof? The truth of F is itself hardly of
vital interest. But, in the wake of Goedel's incompleteness proof
of 1931, F? finds its place in a sequence of elementary number-theoretic
questions for which there provably cannot exist any
algorithmic proof procedure!
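The contrast Huber-Dyson draws is easy to make concrete: checking instances of F? is child's play even though proving F is not. A minimal brute-force sketch (the function name and search bounds are my own, not the essay's) searches a finite range for counterexamples:

```python
# Search for counterexamples to Fermat's question F?: integers
# x, y, z, n, all greater than 2, with x^n + y^n = z^n.
# By Wiles's 1995 proof of F, the search must come back empty.

def fermat_counterexamples(limit, max_n):
    """Return all (x, y, z, n) with 2 < x <= y, z <= limit and
    2 < n <= max_n satisfying x**n + y**n == z**n."""
    hits = []
    for n in range(3, max_n + 1):
        # Precompute n-th powers so each candidate sum needs only a lookup.
        powers = {z ** n: z for z in range(3, limit + 1)}
        for x in range(3, limit + 1):
            for y in range(x, limit + 1):
                z = powers.get(x ** n + y ** n)
                if z is not None:
                    hits.append((x, y, z, n))
    return hits

print(fermat_counterexamples(100, 10))  # prints []
```

Any child can run the check; the gulf lies between verifying finitely many cases and proving there are none at all.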
Or take the question D?, raised by the gut feeling that there are
more points on a straight line segment than there are integers
in the infinite sequence 1, 2, 3, 4, .... Before it can be answered,
the question of what is meant by "more" must be dealt with. Once
this was done, by way of the 19th century's progress in the foundations,
D? became amenable to Cantor's diagonal argument, establishing theorem D.
But this was by no means the end of the question!
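Cantor's diagonal argument itself can be sketched in a few lines (the code and names below are mine, a finite illustration of the infinite argument): given any purported enumeration of 0/1 sequences, flip the k-th digit of the k-th sequence to build a sequence the enumeration missed.

```python
# Cantor's diagonal argument: no enumeration of infinite 0/1 sequences
# can be complete. A sequence is represented as a function index -> {0, 1}.

def diagonal(enumeration):
    """Given enumeration(k) = the k-th listed sequence, return a new
    sequence that differs from sequence k at position k, for every k."""
    return lambda k: 1 - enumeration(k)(k)

# A sample enumeration: sequence k alternates 0s and 1s in blocks of k + 2.
sample = lambda k: (lambda i: (i // (k + 2)) % 2)

missing = diagonal(sample)
# By construction, `missing` disagrees with every listed sequence
# somewhere, so it cannot appear anywhere in the enumeration.
print(all(missing(k) != sample(k)(k) for k in range(10000)))  # prints True
```

The same one-line flip defeats any enumeration whatsoever, which is the whole force of the theorem.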
The proof gave rise to new fields of investigation and new ideas.
In particular, the Continuum Hypothesis C?, a direct descendant
of D?, was shown to be "independent" of the accepted formal system
of set theory. A whole new realm of questions sprang up: questions
X? that are answered by proofs of independence, bluntly by "that
depends": on what you are talking about, on what system you
are using, on your definition of the word "is", and so forth. With
this they give rise to comparative studies of systems without
as well as with the assumption X added. Euclid's parallel axiom
in geometry is the most popular early example.
What about the question as to the nature of infinitesimals, a
question that has plagued us ever since Leibniz? Euler and his
colleagues had used them with remarkable success, boldly following
their intuition. But in the 19th century mathematicians became
self-conscious. By the time we were teaching our calculus classes
by means of epsilons, deltas and Dedekind cuts, some of us might
have thought that Cauchy, Weierstrass and Dedekind had chased
the question away. But then along came logicians like Abraham
Robinson with a new take on it, with so-called nonstandard quantities,
another favorite of the popular science press.
Finally, turning to a controversial issue: the question of the
existence of God can neither be dismissed by a rational "No" nor
by a politically expedient "Yes". Actually, as a plain yes-or-no
question it ought to have disappeared long ago. Nietzsche, in
particular, did his very best over a hundred years ago to make
it go away. But the concept of God persists and keeps a maze of
questions afloat, such as "who means what by Him?", "do we need
a boogie man to keep us in line?", "do we need a crutch to hold
despair at bay?" and so forth, all questions concerning human
nature.
Good questions do not disappear; they mature, mutate and spawn
new ones.
VERENA HUBER-DYSON is a mathematician who taught at UC Berkeley
in the early sixties, then at the University of Illinois at Chicago Circle,
before retiring from the University of Calgary. Her research papers
on the interface between Logic and Algebra concern decision problems
in group theory. Her monograph Goedel's theorem: a workbook
on formalization is an attempt at a self-contained interdisciplinary
introduction to logic and the foundations of mathematics.
"Do we survive death?"
The question was long considered metaphysical, briefly became a scientific
question, and has now disappeared again. Victorian intellectuals
such as Frederic Myers, Henry Sidgwick and Edmund Gurney founded
the Society for Psychical Research in 1882 partly because they
realised that the dramatic claims of spiritualist mediums could
be empirically tested. They hoped to prove "survival"
and thus overturn the growing materialism of the day. Some,
like Faraday, convinced themselves by experiment that the claims
were false, and lost interest. Others, like Myers, devoted their
entire lives to ultimately inconclusive research. The Society
continues to this day, but survival research has all but ceased.
I suggest that no one asks the question any more because the answer
seems too obvious. To most scientists it is obviously "No",
while to most New Agers and religious people it is obviously "Yes".
But perhaps we should. The answer may be obvious (it's "No";
I'm an unreligious scientist) but its implications for
living our lives and dealing compassionately with other people
are not.
SUSAN BLACKMORE is a psychologist and ex-parapsychologist who,
when she found no evidence of psychic phenomena, turned
her attention to why people believe in them. She is author of
several skeptical books on the paranormal and, more recently,
The Meme Machine.
"Why can't girls/women do math?"
I take a couple of days off from reading email over Christmas
and when I next log on already there are over twenty responses
to the Edge question! Maybe the question we should
all be asking is "Doesn't anyone take time off any more?"
Turning to questions that have disappeared, as a mathematician I hope
we've seen the last of the question "Why can't girls/women
do math?" With women now outnumbering men in mathematics
programs in most US colleges and universities, that old wives'
tale (old husbands' tale?) has surely been consigned to the garbage
can. Some recent research at Brown University confirmed what most
of us had long suspected: that past (and any remaining present)
performance differences were based on cultural stereotyping. (The
researchers found that women students performed worse at math
tests when they were given in a mixed-gender class than when no
men were present. No communication was necessary to cause the
difference. The sheer presence of men was enough.)
While I was enjoying my offline Christmas, Roger Schank raised
the other big math question: Why do we make such a big deal of
math performance and of teaching math to everyone in the first
place? But with the educational math wars still raging, I doubt
we've seen the last of that one!
KEITH DEVLIN is a mathematician, writer, and broadcaster living in California.
His latest book is The Math Gene: How Mathematical Thinking
Evolved and Why Numbers Are Like Gossip.
"What does all the information mean?"
The ubiquity of upscale coffee houses has eliminated the need
to ask "Where can I get a cup of coffee?" But I suspect that the
question "What questions have disappeared?" is meant to elicit
even deeper and [perhaps] more meaningful responses.
The coffee house glut has been accompanied (although with
no necessary causal link) by an avalanche of information
[or, at least, of data] and in the rush to obtain [or "to access",
groan] that information we've stopped asking "What
does it all mean?" It is as though raw data, in and of itself,
has real value, indeed all of the value, and thus there is no
need to stop, to assimilate, to ponder. We grasp for faster computers,
greater bandwidth, non-stop connectivity. We put computers in
every classroom, rewire schools. But, with the exception of a
great deal of concern about the business and marketing uses of
the new "information age," we pay precious little attention to
how the information can be used to change, or improve, our lives,
nor do we seem to take the time to slow down and deliberate upon it.
We wire the schools, but never ask what all those computers in
classrooms will be used for, or whether teachers know what to
do with them, or whether we can devise ways to employ the technology
to help people learn in new or better ways. We get cable modems,
or high-speed telephone lines, but don't think about what
we can do with them beyond getting more information faster. [Really,
does being able to watch the trailer for "Chicken Run" in a 2-inch
square window on the computer screen, after a several-minute-long
download, constitute a major advance? And if we could cut
the download time to several seconds, would that qualify?]
Most insidious, I think, is that the rush to get more information
faster almost forces people to avoid the act of thinking. Why
stop and try to make sense of the information we've obtained
when we can click on that icon and get still more data? And more.
KASPER, a physicist, is Associate Vice Provost for Research at
Columbia University and was Associate Director of the Superconducting
Super Collider Laboratory.
Jason McCabe Calacanis
"How long before all nations obey the basic principles of human
rights as outlined in the Universal Declaration of Human Rights,
adopted on December 10th, 1948?"
The distinctive Amnesty International arched sticker, with a burning
candle surrounded by a swoosh of barbed wire, seemed to adorn
every college dorm-room door, beat-up Honda Accord, and office
bulletin board when I started college in the late '80s at Fordham
University. Human rights was the "in" cause. So, we all joined
Amnesty and watched our heroes, including Bruce Springsteen, Sting,
and Peter Gabriel, sing on the "Human Rights Now" tour (brought
to you, of course, by Reebok).
As quickly as it took center stage, however, human rights seemed
to fall off the map. Somewhere in the mid-90s, something stole
our fire and free time; perhaps it was the gold rush years of
the Internet or the end of the Cold War. The wild spread of entrepreneurship
and capitalism may have carried some democracy along with it.
Yet just because people are starting companies and economic markets
are opening up doesn't mean that there are fewer tortures, rapes,
and murders for political beliefs. (These kinds of false perceptions
may stem from giving places like China "Most Favored Nation" status).
Youth inspired by artists created the foundation of Amnesty's
success in the '80s, so maybe a vacuum of activist artists is
to blame for human rights disappearing from the collective consciousness.
Would a homophobic, misogynistic, and violent artist like Eminem
ever take a stand for anyone other than himself? Could anyone
take him seriously if he did? Britney Spears' fans might not have
a problem with her dressing in a thong at the MTV Music Awards
but how comfortable would they be if she addressed the issue of
the rape, kidnapping, and torture of young women in Sierra Leone?
Of course, you don't have to look around the world to find human-rights
abuses. Rodney King and Abner Louima taught us that human rights
is an important and pressing issue right in our backyard. (Because
of these examples, some narrow-minded individuals may see it as
only a race-specific issue.) One bright spot in all of this, however,
is that the technology that was supposed to create a Big Brother
state, like video cameras, is now being used to police Big Brother
himself. (Check out witness.org and send them a check or
a video camera if you have the means.)
Eleanor Roosevelt considered her fight to create the Universal
Declaration of Human Rights her greatest accomplishment. How ashamed
would she be that 50 years have elapsed since her battle and now
no one seems to care.
JASON McCABE CALACANIS is Editor and Publisher of Silicon Alley Daily,
The Digital Coast Weekly, and Silicon Alley Reporter, and Chairman
and CEO of Rising Tide Studios.
It seems to me we've surrendered the notion of the sacred to
those who only mean to halt the evolution of culture. Things we
call "sacred" are simply ideologies and truths so successfully
institutionalized that they seem unquestionable. For example,
the notion that sexual imagery is bad for young people to see
(a claim never established by any psychological or anthropological
study I've come across) is accepted as God-ordained
fact, and used as a fundamental building block to justify censorship.
(Meanwhile, countless sitcoms in which parents lie to one another
are considered wholesome enough to earn "G" television ratings.)
A politician's claim to be "God-fearing" is meant
to signify that he has priorities greater than short-term political
gain. What most people don't realize is that, in the Bible
anyway, God-fearing is a distant second to God-loving. People
who were God-fearing only behaved ethically because they were
afraid of the Hebrew God's wrath. This wasn't a sacred
relationship at all, but the self-interested avoidance of retaliation.
It seems that no place, and, more importantly, no
time is truly sacred. Our mediating technologies render us available
to our business associates at any hour, day or night. Any moment
spent thinking instead of spending, or laughing instead of working
is an opportunity missed. And the more time we sacrifice to production
and consumption, the less any alternative seems available to us.
A radical proposal to combat the contraction of sacred time was
suggested in the book of Exodus, and it's called the Sabbath.
What if we all decided that for one day each week, we would refrain
from buying or selling anything? Would it throw America into a
recession? Maybe the ancients didn't pick the number seven out
of a hat. Perhaps they understood that human beings can only immerse
themselves in commerce for six days at a stretch before losing
touch with anything approaching the civil, social, or sacred.
DOUGLAS RUSHKOFF is the author of Coercion, Media Virus, Playing the
Future, and Ecstasy Club. He is Professor of Virtual Culture at New York
University.
If Barbour's theory of Platonia is even roughly correct, then
everything exists in a timeless universe, and therefore doesn't
actually "disappear". Therefore, all questions are always
asked, as everything is actually happening at once. I know that
doesn't help much, and it dodges the main thrust of the question,
but it's one support for my answer, if oblique.
Other than forgotten questions that disappear of their
own accord, or are in some dead language, or are too personal/particular/atomised
(e.g., What did you think of the latest excretion from Hollywood?
Is it snowing now? Why is that weirdo across the library reading
room looking at me?!?! When will I lose these 35 "friends"
who are perched on my belt buckle? etc.) questions don't really
disappear. They are asked again and again and are answered again
and again, and this is a very good thing. Three-year-olds will
always ask "Daddy, where do the stars come fwum?" And
daddies will always answer as best they can. Eventually, some little
three-year-old will grow into an adult astronomer and might find
even better answers than their daddy supplied them on a cold Christmas
night. And they will answer the same simple question with a long
involved answer, or possibly, a better and simpler answer. In
this way, questions come up again and again, but over time they
spin out in new directions with new answers.
It's important to not let questions disappear. By doubting the
obvious, examining the same ground with fresh ideas, and questioning
received ideas, great strides in the collected knowledge of this
human project can be (and historically, have been) gained. When
we consign a question to the scrap heap of history we run many
risks: risks of blind arrogance, deaf self-righteousness,
and finally choking on the bland pablum of unquestioned dogma.
It's important to question the questions. It keeps the question
alive, as it refines the question. Question the questions, and
then reverse the process - question the questioning of questions.
Permit the mind everything, even if it seems repetitive. If you
spin your wheels long enough you'll blow a bearing or snap a spring,
and the question is re-invented, re-asked, and re-known, but in
a way not previously understood. In this way, questions don't
disappear, they evolve into other questions. For a while they
might bloat up in the sun and smell really weird, but it's all
part of the process...
HENRY WARWICK sometimes works as a scientist in the computer industry.
He always works as an artist, composer, and writer. He lives in
San Francisco, California.
"Why Is There Something Instead of Nothing?"
is a question that the ancients asked, and one that crops up a
few times in 20th century philosophical discussions. When it is
mentioned, it is usually as an example of a problem that looks
to be both deep and in principle insoluble. Unsurprisingly, then,
it seems to have fallen by the scientific, cosmological and philosophical
waysides. But sometimes I wonder whether it really is insoluble
(or senseless), or whether science may one day surprise us by
finding an answer.
ANDY CLARK is Professor of Philosophy and Cognitive Science at the
University of Sussex, UK. He was previously Director of the Philosophy/Neuroscience/Psychology
Program at Washington University in St. Louis. He is the author
of Microcognition: Philosophy, Cognitive Science and Parallel
Distributed Processing, Associative Engines, and Being
There: Putting Brain, Body and World Together Again.
It has become unfashionable to ask about the structure of reality
without already having chosen a framework in which to ponder the
answer, be it scientific, religious or sceptical. A sense of wonder
at the sheer appearance of the world, moment by moment, has been lost.
To look at the world in wonder, and to stay with that sense of
wonder without jumping straight past it, has become almost impossible
for someone taking science seriously. The three dominant reactions
are: to see science as the only way to get at the truth, at what
is really real; to accept science but to postulate a more encompassing
reality around or next to it, based on an existing religion; or
to accept science as one useful approach in a plurality of many
approaches, none of which has anything to say about reality
in any ultimate way.
The first reaction leads to a sense of wonder scaled down to the
question of wonder about the underlying mathematical equations
of physics, their interpretation, and the complexity of the phenomena
found on the level of chemistry and biology. The second reaction
tends to allow wonder to occur only within the particular religious
framework that is accepted on faith. The third reaction allows
no room for wonder about reality, since there is no ultimate reality
to wonder about.
Having lost our ability to ask what reality is like means having
lost our innocence. The challenge is to regain a new form of innocence,
by accepting all that we can learn from science, while simultaneously
daring to ask 'what else is true?' In each period of history,
the greatest philosophers struggled with the question of how to
confront skepticism and cynicism, from Socrates and Descartes
to Kant and Husserl in Europe, and Nagarjuna and many others in
Asia and elsewhere. I hope that the question "What is Reality?"
will reappear soon, as a viable intellectual question and at the
same time as an invitation to try to put all our beliefs and frameworks
on hold. Looking at reality without any filter may or may not
be possible, but without at least trying to do so we will have
given up too soon.
PIET HUT is professor of astrophysics at the Institute for Advanced
Study, in Princeton. He is involved in the project of building
GRAPEs, the world's fastest special-purpose computers, at Tokyo
University, and he is also a founding member of the Kira Institute.
''How does [fill in the blank] in human affairs relate to the
great central theory?''
I do not, of course, mean any particular Great Central Theory.
I am referring to the once-pervasive habit of relating everything
that had human scale (Chinese history, the Odyssey, your
mother's fear of heights) to an all-explaining principle.
This principle was set forth in a short shelf of classic works
and then worked to a fine filigree by close-minded people masquerading
as open-minded people. The precise Great Central Theory might
be, as it was in my childhood milieu, the theories of Freud. It
might be Marx. It might be Levi-Strauss or, more recently, Foucault.
At the turn of the last century, there was a Darwinist version
going, promulgated by Francis Galton, Herbert Spencer and their followers.
These monolithic growths had begun, I suppose, as the answers
to specific questions, but then they metastasized; their adherents
would expect the great central theory to answer any question.
Commitment to a Great Central Theory thus became more a religious
act than an intellectual one. And, as with all religions, the
worldview of the devout crept into popular culture. (When I was
in high school we'd say So-and-So was really anal about his locker
or that What's-his-name's parents were really bourgeois.) For
decades, this was what intellectual life appeared to be: Commit
to an overarching explanation, relate it to everything you experienced,
defend it against infidels. Die disillusioned or, worse, die still believing.
So why has this sort of question vanished? My guess is that, broadly
speaking, it was a product of the Manichean worldview of the last
century. Depression, dictators, war, genocide, nuclear terror:
all of these lend themselves to a Yes-or-No, With-Us-or-With-Them,
Federation vs. Klingons mindset. We were, to put it simply, all
a little paranoid. And paranoids love a Great Key: Use this and
see the single underlying cause for what seems to be unrelated
phenomena.
Nowadays the world, though no less dangerous, seems to demand
attention to the separateness of things, the distinctiveness of
questions. ''Theory of everything'' is a term physicists use
to describe their near-theological concerns, but at the human scale
most people care about, where we ask questions like ''why can't
we dump the Electoral College?'' or ''How come Mom likes my sister
better?'', the Great Central Theory question has vanished with
the black-or-white arrangement of the human world.
One danger is that some new Great Central Theory slouches in; some of the Darwinians
think they've got the candidate, and they certainly evince signs
of quasi-religious commitment. (For example, as a Freudian would
say you doubted Freud because of your neuroses, I have heard Darwinians
say I doubted their theories because of an evolved predisposition
not to believe the truth. I call this quasi-religious because
this move makes the theory impregnable to evidence or new ideas.)
Another danger is that the notion that overarching theory is impossible becomes, itself,
a new dogma. I lean toward this prejudice myself but I recognize
its dangers. An intellectual life that was all boutiques could
be, in its way, as stultifying as a giant one-product factory.
Better that we learn from the mistakes of the last two centuries and insist
that our answers always match our questions, and that the distinction
between theory and religious belief be maintained.
DAVID BERREBY'S writing about science and culture has appeared
in The New York Times Magazine, The New Republic, Slate, The
Sciences and many other publications.
"What do women want?"
People in the Western world assume women have it all: education,
job opportunities, birth control, love control, and financial
freedom. But women still lack the essential freedom and equality
they lacked a century ago. Women are minorities in every
sector of our government and economy, and women are still expected
to raise families while at the same time earning incomes that
are comparatively lower than what males earn. And in our culture,
women are still depicted as whores, bimbos, or bloodsuckers by
advertisers to sell everything from computers to cars.
Will it take another century, or another millennium, before the biological
differences between men and women are no longer taken as carte blanche
justification for the unequal treatment of women?
SYLVIA PAULL is Founder of Gracenet (www.gracenet.net), serving women
in high-tech and business media.
"The Great Idea That's Disappeared"
The greatest idea that's disappeared from mainstream science these
past 400 years is surely that of God. The greats who laid the
foundations of modern science in the 17th century (Galileo, Newton,
Leibniz, Descartes) and the significant-but-not-quite-so-greats
(Robert Boyle, John Ray, etc.) were theologians as much as they
were scientists and philosophers. They wanted to know how things
are, of course, but also what God had in mind when he made
them this way. They took it for granted, or contrived to prove
to their own satisfaction, that unless there is a God, omniscient
and mindful, then there could be no Universe at all.
Although David Hume did much to erode such argument, it persisted
well into the 19th century. Recently I have been intrigued to
find James Hutton (who, as one of the founders of modern
geology, is one of the boldest and most imaginative of all scientists)
earnestly wondering in a late 18th century essay what God
could possibly have intended when he made volcanoes. The notion
that there could be no complex and adapted beings at all without
a God to create them, was effectively the default position in
orthodox biology until (as Dan Dennett has so succinctly explained)
Charles Darwin showed how natural selection could produce complexity
out of simplicity, and adaptation out of mere juxtaposition. Today,
very obviously, no Hutton-style musing would find its way into
a refereed journal. In Nature, God features only as the
subject of (generally rather feeble) sociological and, sometimes, philosophical commentary.
Religion obviously flourishes still, but are religion and science
now condemned to mortal conflict? Fundamentalist-atheists would
have it so, but I think not. The greatest ideas in philosophy
and science never really go away, even if they do change their
form or go out of fashion, but they do take a very long time to
unfold. For at least 300 years, from the 16th to the 19th
centuries, emergent science and post-medieval theology
were deliberately intertwined, in many ingenious ways. Through
the past 150, they have been just as assiduously disentangled.
But the game is far from over. Cosmologists and metaphysicians
continue to eye and circle each other. Epistemology (how
we know what's true) is of equal interest to scientists
and theologians, and each would be foolish to suppose that the
other has nothing to offer. How distant is the religious notion
of revelation from Dirac's (or Keats's?) perception
of truth as beauty? Most intriguingly of all, serious theologians
are now discussing the role of religion in shaping emotional response
while modern aficionados of artificial intelligence acknowledge
(as Hume did) that emotion is an essential component of thought
itself. Lastly, the ethics of science and technology (how
we should use our new-found power) are the key discussions
of our age and it is destructive to write religion out of the
act, even if the priests, rabbis and mullahs who so far have been
invited to take part have often proved disappointing.
I don't share the modern enthusiasm for over-extended life, but
I would like to see how the dialogue unfolds in the centuries
to come.
COLIN TUDGE is a Research Fellow at the Centre for Philosophy,
London School of Economics. His two latest books are The Variety
of Life and In Mendel's Footnotes.
"What Was Lost Atlantis?"
Two journalists once ranked the discovery of lost Atlantis as
potentially the most spectacular sensation of all time. Now,
the question of what or where Atlantis might have been has disappeared.
The Greek philosopher Plato, the only source for Atlantis, incorporated
an extensive description of this legendary city into a mundane
summary of contemporary (4th century BC) scientific achievements
and knowledge of prehistory. Nobody attributed much attention
to the account during subsequent centuries. In Medieval times,
scholarly interest focussed on Aristotle, while Plato was neglected.
When archaeology and history finally assumed the shape of scientific
disciplines, after the middle of the 18th century AD,
science was still under the influence of Christian theology, its
Medieval mother discipline. The first art historians, who were
brought up in a creationist world, consequently interpreted western
culture as an almost divine concept which first materialized in
ancient Greece, without having had any noticeable predecessors.
Accordingly, any ancient texts referring to high civilizations,
much older than Classical Greece, had to be fictitious by definition.
During the 20th century, dozens of palaces dating to a golden
age a thousand years older than Plato's Athens have been excavated
around the eastern Mediterranean. Atlantis can now be placed in
a historical context. It is an Egyptian recollection of Bronze
Age Troy and its awe-inspiring war against the Greek kingdoms.
Plato's account and the end of the Bronze Age around 1200 BC can
now be seen in a new light. Why was this connection not made earlier?
Four Egyptian words, describing location and size, were mistranslated,
because at the time Egypt and Greece used different calendars
and scales. And, in contrast to biology, where, after Darwin,
the idea of creationism was dropped in favor of evolutionism,
Aegean prehistory has never questioned its basic premises.
Geoarchaeologist EBERHARD ZANGGER is Director of Corporate Communications
at KPNQwest (Switzerland) and the author of The Flood from
Heaven : Deciphering the Atlantis Legend and Geoarchaeology
of the Argolid. Zangger has written a monograph, published
by the German Archaeological Institute, as well as more than seventy
scholarly articles, which have appeared in the American Journal
of Archaeology, Hesperia, the Oxford Journal of Archaeology,
and the Journal of Field Archaeology.
"Obsolete and Inappropriate Metaphors"
The detection of "questions that are no longer asked"
is difficult. Old questions, like MacArthur's old soldiers, just
fade away. Scientists and scholars in hot pursuit of new questions
neither note nor mourn their passing. I regularly face a modest
form of the disappearing question challenge when a textbook used
in one of my classes is revised. Deletions are hard to find; they
leave no voids and are more stealthy than black holes, not even
affecting their surrounds. New text content stands out, while
missing material must be established through careful line-by-line
reading. Whether in textbooks or in life, we don't think much
about what is no longer relevant.
My response to the inquiry about questions that are no longer asked
is to reframe it and suggest instead a common class of missing
questions: those associated with obsolete and inappropriate metaphors.
Metaphor is a powerful cognitive tool, which, like all models,
clarifies thinking when appropriate, but constrains it when inappropriate.
Science is full of them. My professional specialties of neuroscience
and biopsychology have mind/brain metaphors ranging from Locke's
ancient blank slate (tabula rasa), to the more technologically
advanced switchboard, and the metaphor du jour, the computer.
None do justice to the brain as a soggy lump of wetware, but linger
as cognitive/linguistic models. Natural selection in the realm
of metaphors is slow and imperfect. Witness the reference to DNA
as a "blueprint" for an organism, when Dawkins' "recipe"
metaphor more accurately reflects DNA's encoding of instructions
for organismic assembly.
ROBERT R. PROVINE is Professor of Psychology and Neuroscience at the
University of Maryland, Baltimore County, and author of Laughter:
A Scientific Investigation.