"WHAT HAVE YOU CHANGED YOUR MIND ABOUT?" |
|
CHARLES SEIFE
Professor
of Journalism, New York University; formerly journalist,
Science magazine; Author, Zero: The Biography of a Dangerous Idea

I used to think that a modern, democratic society
had to be a scientific society. After all, the scientific
revolution and the American Revolution were forged
in the same flames of the enlightenment. Naturally,
I thought, a society that embraces the freedom of thought
and expression of a democracy would also embrace science.
However, when I first started reporting on science,
I quickly realized that science didn't spring up naturally
in the fertile soil of the young American democracy.
Americans were extraordinary innovators — wonderful
tinkerers and engineers — but you can count the great
19th century American physicists on one hand and have
two fingers left over. The United States owes its scientific
tradition to aristocratic Europe's universities (and
to its refugees), not to any native drive.
In fact, science clashes with the democratic ideal.
Though it is meritocratic, it is practiced in the elite
and effete world of academe, leaving the vast majority
of citizens unable to contribute to it in any meaningful
way. Science is about freedom of thought, yet at the
same time it imposes a tyranny of ideas.
In a democracy, ideas are protected. It's the sacred
right of a citizen to hold — and to disseminate —
beliefs that the majority disagrees with, ideas that
are abhorrent, ideas that are wrong. However, scientists
are not free to be completely open-minded; a scientist
ceases to be a scientist if he clings to discredited
notions. The basic scientific urge to falsify, to disprove,
to discredit ideas clashes with the democratic drive
to tolerate and protect them.
This
is why even those politicians who accept evolution
will never attack those politicians who don't; at least
publicly, they cast evolutionary theory as a mere personal
belief. Attempting to squelch creationism smacks of
elitism and intolerance — it would be political suicide.
Yet this is exactly what biologists are compelled to
do; they exorcise falsehoods and drive them from the
realm of public discourse.
We've been lucky that the transplant of science has
flourished so beautifully on American soil. But I no
longer take it for granted that this will continue;
our democratic tendencies might get the best of us
in the end.
DAVID
BODANIS
Writer;
Consultant; Author, Passionate
Minds

The
Bible Is Inane
When I was very little the question was easy. I simply
assumed the whole Bible was true, albeit in a mysterious,
grown-up sort of way. But once I learned something of science,
at school and then at university, that unquestioning belief
slid away.
Mathematics
was especially important here, and I remember how entranced
I was when I first saw the power of axiomatic systems.
Those were logical structures that were as beautiful
as complex crystals — but far, far clearer.
If there was one inaccuracy at any point in the system,
you could trace it, like a scarcely visible stretching
crack through the whole crystal; you could see exactly
how it had to undermine the validity of far distant parts
as well. Since there are obvious factual inaccuracies in
the Bible, as well as repugnant moral commands, then — just
as with any tight axiomatic system — huge other
parts of it had to be wrong, as well. In my mind that discredited
it all.
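One way to make the crystal image concrete is the logicians' principle of explosion: if an axiomatic system harbours a contradiction, then every statement whatsoever becomes derivable, so none of its theorems can be trusted. A minimal sketch in Lean, offered only as one formal rendering of that intuition:

```lean
-- Principle of explosion: from a proposition and its negation,
-- any proposition Q at all can be derived.
theorem explosion (P Q : Prop) (h : P) (hn : ¬P) : Q :=
  absurd h hn
```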
What I've come to see more recently is that the Bible
isn't monolithic in that way. It's built up in many, often
quite distinct layers. For example, the book of Joshua
describes a merciless killing of Jericho's inhabitants,
after that city's walls were destroyed. But archaeology
shows that when this was supposed to be happening, there
was no large city with walls there to be destroyed. On
the contrary, careful dating of artifacts, as well as translations
from documents of the great empires in surrounding regions,
shows that the bloodthirsty Joshua story was quite likely
written by one particular group, centuries later, trying
to give some validity to a particular royal line in 7th
century BC Jerusalem, which wanted to show its rights to
the entire country around it. Yet when that Joshua layer
is stripped away, other layers in the Bible remain. They
can stand, or be judged, on their own.
A
few of those remaining layers have survived only because
they were taken up by narrow power structures, concerned
with aggrandizing themselves, in the style of Philip Pullman's
excellent books. But others have survived across the millennia
for different reasons. Some speak to the human condition
with poetry of aching beauty. And others — well,
there's a further reason I began to doubt the inanity of
everything I couldn't understand.
A child aged three, however intelligent, and however tightly
he or she squinches his or her face in concentration, still
won't be able to grasp notions that are easy for us, such
as 'century', or 'henceforth', let alone greater subtleties
which 20th century science has clarified, such as 'simultaneity'
or 'causality'. True and important things exist, which
young children can't comprehend. It seems odd to be sure
that we, adult humans, existing at this one particular
moment in evolution, have no such limits.
I
realized that the world isn't divided into science
on the one hand, and nonsense or arbitrary biases on
the other. And I wonder now what might be worth looking
for, hidden there, fleetingly in-between.
HAIM
HARARI
Physicist,
former President, Weizmann Institute
of Science

Clear and simple is not the same as provable
and well defined
I used to think that if something is clear and simple,
it must also be provable or at least well defined, and
if something is well defined, it might be relatively
simple. It isn't so.
If you hear about sightings of a weird glow approaching
us in the night sky, it might be explained as a meteorite
or as little green men arriving in a spaceship from another
galaxy. In most specific cases, both hypotheses can be
neither proved nor disproved, rigorously. Nothing is
well defined here. Yet, it is clear that the meteorite
hypothesis is scientifically much more likely.
When you hear about a new perpetual motion machine or
about yet another claim of cold fusion, you raise an
eyebrow, you are willing to bet against it and, in your
guts, you know it is wrong, but it is not always easy
to disprove it rigorously.
The reliability of forecasts regarding weather, stock
markets and astrology declines in that order. All
of them are based on guesses, with or without historical
data. Most of them are rarely revisited by the media,
after the fact, thus avoiding being exposed as unreliable.
In most cases, predicting that the immediate future will
be the same as the immediate past has a higher probability
of being correct than the predictions of the gurus.
Yet, we, as scientists, have considerable faith in weather
predictions; much less faith in predicting peaks and
dips of the stock market and no faith at all in astrology.
We can explain why, and we are certainly right, but we
cannot prove why. Proving it by historical success data
is as convincing (for the future) as the predictions
themselves.
Richard Feynman in his famous Lectures on Physics provided
the ultimate physics definition of Energy: It is that
quantity which is conserved. Any Lawyer, Mathematician
or Accountant would have laughed at this statement. Energy
is perhaps the most useful, clear and common concept
in all of science, and Feynman is telling us, correctly
and shamelessly, that it has no proper rigorous and logical
definition.
How
much is five thousand plus two? Not so simple. Sometimes
it is five thousand and two (as in your bank statement)
and sometimes it is actually five thousand (as in the
case of the Cairo tour guide who said "this pyramid
is 5002 years old; when I started working here two years
ago, I was told it was 5000 years old").
The public thinks, incorrectly, that science is a very
accurate discipline where everything is well defined.
Not so. But the beauty of it is that all of the above
statements are scientific, obvious and useful, without
being precisely defined. That is as much part of the
scientific method as verifying a theory by an experiment
(which is always accurate only to a point).
To
speak and to understand the language of science is,
among other things, to understand this "clear vagueness".
It exists, of course, in other areas of life. Every normal
language possesses numerous such examples, and so do
all fields of social science.
Judaism is a religion and I am an atheist. Nevertheless,
it is clear that I am Jewish. It would take a volume
to explain why, and the explanation will remain rather
obscure and ill defined. But the fact is simple, clear,
well understood and undeniable.
Somehow, it is acceptable to face such situations in
nonscientific matters, but most people think, incorrectly,
that the quantitative natural sciences must be different.
They are different, in many ways, but not in this way.
Common sense has as much place as logic in scientific
research. Intuition often leads to more insight than
algorithmic thinking. Familiarity with previous failed
attempts to solve a problem may be detrimental, rather
than helpful. This may explain why almost all important
physics breakthroughs are made by people under forty.
This also explains why, in science, asking the right
question is at least as important as being able to solve
a well posed problem.
You
might say that the above kind of thinking is prejudiced
and inaccurate, and that it might hinder new discoveries
and new scientific ideas. Not so. Good scientists know
very well how to treat and use all of these "fuzzy" statements.
They also know how to reconsider them, when there is
a good reason to do so, based on new solid facts or on
a new original line of thinking. This is one of the beautiful
features of science.
TIMOTHY
TAYLOR
Archaeologist,
University of Bradford; Author, The
Buried Soul

Relativism
Where
once I would have striven to see Incan child sacrifice
'in their terms', I am increasingly committed to seeing
it in ours. Where once I would have directed attention
to understanding a past cosmology of equal validity to
my own, I now feel the urgency to go beyond a culturally-attuned
explanation and reveal cold sadism, deployed as a means
of social control by a burgeoning imperial power.
In
Cambridge at the end of the 70s, I began to be inculcated
with the idea that understanding the internal logic and
value system of a past culture was the best way to do
archaeology and anthropology. The challenge was to achieve
this through sensitivity to context, classification and
symbolism. A pot was no longer just a pot, but a polyvalent
signifier, with a range of case-sensitive meanings. A
rubbish pit was no longer an unproblematic heap of trash,
but a semiotic entity embodying concepts of contagion
and purity, sacred and profane. A ritual killing was
not to be judged bad, but as having validity within a
different worldview.
Using
such 'contextual' thinking, a lump of slag found in a
5000 BC female grave in Serbia was no longer seen as
chance contaminant — by-product garbage from making
copper jewelry. Rather it was a kind of poetic statement
bearing on the relationship between biological and cultural
reproduction. Just as births in the Vinča culture were
attended by midwives who also delivered the warm but
useless slab of afterbirth, so Vinča culture ore was
heated in a clay furnace that gave birth to metal. From
the furnace — known from many ethnographies to
have projecting clay breasts and a graphically vulvic
stoking opening — the smelters delivered technology's
baby. With it came a warm but useless lump of slag. Thus
the slag in a Vinča woman's grave, far from being accidental
trash, hinted at a complex symbolism of gender, death
and rebirth.
So
far, so good: relativism worked as a way towards understanding
that our industrial waste was not theirs, and their idea
of how a woman should be appropriately buried not ours.
But what happens when relativism says that our concepts
of right and wrong, good and evil, kindness and cruelty,
are inherently inapplicable? Relativism self-consciously
divests itself of a series of anthropocentric and anachronistic
skins — modern, white, western, male-focused,
individualist, scientific (or 'scientistic') — to
say that the recognition of such value-concepts is radically
unstable, the 'objective' outsider opinion a worthless
myth.
My
colleague Andy Wilson and our team have recently examined
the hair of sacrificed children found on some of the
high peaks of the Andes. Contrary to historic chronicles
that claim that being ritually killed to join the mountain
gods was an honour that the Incan rulers accorded only
to their own privileged offspring, diachronic isotopic
analyses along the scalp hairs of victims indicate that
it was peasant children who, twelve months before death,
were given the outward trappings of high status and a
much improved diet to make them acceptable offerings.
Thus we see past the self-serving accounts of those of
the indigenous elite who survived on into Spanish rule.
We now understand that the central command in Cuzco engineered
the high-visibility sacrifice of children drawn from
newly subject populations. And we can guess that this
was a means to social control during the massive, 'shock & awe'
style imperial expansion southwards into what became
Argentina.
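The logic of that measurement can be sketched in a few lines of code. Scalp hair grows at a roughly constant rate of about a centimetre a month, so each segment along a strand is a dated sample of diet, and a sustained jump in nitrogen isotope ratios toward the scalp marks the switch to richer food. The sketch below is purely illustrative: the growth rate, segment length, threshold and δ15N values are assumptions made for the example, not the team's actual data or method.

```python
# Illustrative sketch of diachronic isotope analysis along a hair strand.
# Assumptions (not the study's actual parameters): scalp hair grows ~1 cm/month,
# the strand is cut into 1-cm segments, and a sustained rise in d15N of 2 per mil
# over the earliest-grown hair marks the shift to a richer, higher-protein diet.

HAIR_GROWTH_CM_PER_MONTH = 1.0

def months_before_death(segment_index: int, segment_cm: float = 1.0) -> float:
    """Segment 0 lies at the scalp (time of death); higher indices grew earlier."""
    return segment_index * segment_cm / HAIR_GROWTH_CM_PER_MONTH

def detect_diet_shift(d15n_by_segment: list[float], jump: float = 2.0) -> float | None:
    """Scan from the oldest segment toward the scalp and return the months before
    death at which d15N first rises by `jump` per mil over the earliest baseline."""
    baseline = d15n_by_segment[-1]  # oldest hair records the diet furthest back in time
    for i in range(len(d15n_by_segment) - 1, -1, -1):
        if d15n_by_segment[i] - baseline >= jump:
            return months_before_death(i)
    return None

# Invented example: 18 one-cm segments, index 0 at the scalp (newest growth).
d15n = [12.1, 12.3, 12.0, 12.2, 12.4, 12.1, 12.2, 12.0, 12.3, 12.1, 12.2, 12.0,  # richer diet
        9.1, 9.3, 9.0, 9.2, 9.1, 9.0]                                            # earlier peasant diet

shift = detect_diet_shift(d15n)
print(f"Dietary shift roughly {shift:.0f} months before death" if shift is not None
      else "No dietary shift detected")
```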
But
the relativists demur from this understanding, and have
painted us as culturally insensitive, ignorant scientists
(the last label a clear pejorative). For them, our isotope
work is informative only as it reveals 'the inner fantasy
life of, mostly, Euro-American archaeologists, who can't
possibly access the inner cognitive/cultural life of
those Others.' The capital 'O' is significant. Here we
have what the journalist Julie Burchill mordantly unpacked
as 'the ever-estimable Other' — the albatross that
post-Enlightenment and, more importantly, post-colonial
scholarship must wear round its neck as a sign of penance.
We
need relativism as an aid to understanding past cultural
logic, but it does not free us from a duty to discriminate
morally and to understand that there are regularities
in the negatives of human behaviour as well as in its
positives. In this case, it seeks to ignore what Victor
Nell has described as 'the historical and cross-cultural
stability of the uses of cruelty for punishment, amusement,
and social control.' By denying the basis for a consistent
underlying algebra of positive and negative, yet consistently
claiming the necessary rightness of the internal cultural
conduct of 'the Other', relativism steps away from logic
into incoherence.
LEON
LEDERMAN
Physicist
and Nobel Laureate; Director Emeritus, Fermilab; Coauthor, The
God Particle

The
Obligations and Responsibilities of the Scientist
My
academic experience, mainly at Columbia University from
1946-1978, instilled the following firm beliefs:
The role of the Professor, reflecting the mission of the
University, is research and dissemination of the knowledge
gained. However, the Professor has many citizenship obligations:
to his community, State and Nation, to his University,
to his field of research, e.g. physics, to his students.
In the latter case, one must add to the content knowledge
transferred, the moral and ethical concerns that science
brings to society. So scientists have an obligation to
communicate their knowledge, popularize, and whenever relevant,
bring their knowledge to bear on the issues of the time.
Additionally, scientists play a large role in advisory boards and systems
from the President's Advisory system all the way to local
school boards and PTAs. I have always believed that the
above menu more or less covered all the obligations and
responsibilities of the scientist. His most sacred obligation
is to continue to do science. Now I know that I was dead
wrong.
Taking even a cursory stock of current events, I am driven
to the ultimately wise advice of my Columbia mentor, I.I.
Rabi, who, in our many corridor bull sessions, urged his
students to run for public office and get elected. He insisted
that to be an advisor (he was an advisor to Oppenheimer
at Los Alamos, later to Eisenhower and to the AEC) was
ultimately an exercise in futility and that the power belonged
to those who are elected. Then, we thought the old man
was bonkers. But today...
Just
look at our national and international dilemmas: global
climate change (U.S. booed in Bali); nuclear weapons
(seventeen years after the end of the Cold War, the U.S.
has over 7,000 nuclear weapons, many poised for instant
flight. Who decided?); stem cell research (still hobbled
by White House obstacles). Basic research and science education
are rated several nations below "Lower Slobovenia",
our national deficit will burden the nation for generations,
a wave of religious fundamentalism, an endless war in Iraq
and the growing security restrictions on our privacy and
freedom (excused by an even more endless and mindless war
on terrorism) seem to be paralyzing the Congress. We need
to elect people who can think critically.
A Congress which is overwhelmingly dominated by lawyers
and MBAs makes no sense in this 21st century in which almost
all issues have a science and technology aspect. We need
a national movement to seek out scientists and engineers
who have demonstrated the required management and communication
skills. And we need a strong consensus of mentors that
the need for wisdom and knowledge in the Congress must
have a huge priority.
DAN
SPERBER
Social
and cognitive scientist; Directeur de Recherche,
CNRS, Paris; Author, Rethinking Symbolism

How
I Became An Evolutionary Psychologist
As
a student, I was influenced by Claude Lévi-Strauss
and even more by Noam Chomsky. Both of them dared talk
about "human nature" when the received view
was that there was no such thing. In my own work, I argued
for a naturalistic approach in the social sciences. I
took for granted that human cognitive dispositions were
shaped by biological evolution and more specifically
by Darwinian selection. While I did occasionally toy
with evolutionary speculations, I failed to see at the
time how they could play more than a quite marginal role
in the study of human psychology and culture.
Luckily,
in 1987, I was asked by Jacques Mehler, the founder and
editor of Cognition, to review a very long article
intriguingly entitled "The logic of social exchange:
Has natural selection shaped how humans reason?" In
most experimental psychology articles the theoretical
sections are short and relatively shallow. Here, on the
other hand, the young author, Leda Cosmides, was arguing
in an altogether novel way for an ambitious theoretical
claim. The forms of cooperation unique to and characteristic
of humans could only have evolved, she maintained, if
there had also been, at a psychological level, the evolution
of a mental mechanism tailored to understand and manage
social exchanges and in particular to detect cheaters.
Moreover, this mechanism could be investigated by means
of standard reasoning experiments.
This
is not the place to go into the details of the theoretical
argument — which I found and still find remarkably
insightful
— or of the experimental evidence — which I
have criticized in detail with experiments of my own as
inadequate. Whatever its shortcomings, this was an extraordinarily
stimulating paper, and I strongly recommended acceptance
of a revised version. The article was published in 1989
and the controversies it stirred have not yet abated.
Reading
the work of Leda Cosmides and of John Tooby, her collaborator
(and husband), meeting them shortly after, and initiating
a conversation with them that has never ceased made me
change my mind. I had known that we could reflect on
the mental capacities of our ancestors on the basis of
what we know of our minds; I now understood that we can
also draw fundamental insights about our present minds
through reflecting on the environmental problems and
opportunities that have exerted selective pressure on
our Paleolithic ancestors.
Ever
since, I have tried to contribute to the development
of evolutionary psychology, to the surprise and dismay
of some of my more standard-social-science friends and
also of some evolutionary psychologists who see me more
as a heretic than a genuine convert. True, I have no
taste or talent for orthodoxy. Moreover, I find much
of the work done so far under the label "evolutionary
psychology" rather disappointing. Evolutionary psychology
will succeed to the extent that it causes cognitive psychologists
to rethink central aspects of human cognition in an evolutionary
perspective, to the extent, that is, that psychology
in general becomes evolutionary.
The
human species is exceptional in its massive investment
in cognition, and in forms of cognitive activity — language,
higher-order thinking, abstraction — that are as
unique to humans as echolocation is to bats. Yet more
than half of all work done in evolutionary psychology
today is about mate choice, a mental activity found in
a great many species. There is nothing intrinsically
wrong in studying mate choice, of course, and some of
the work done in this area is outstanding.
However,
the promise of evolutionary psychology is first and foremost
to help explain aspects of human psychology that are
genuinely exceptional among earthly species and that
in turn help explain the exceptional character of human
culture and ecology. This is what has to be achieved
to a much greater extent than has been the case so far
if we want more skeptical cognitive and social scientists
to change their minds too.
THOMAS
METZINGER
Johannes Gutenberg-Universität Mainz;
Author, Being No One

There
are No Moral Facts
I have become convinced that it would be of fundamental importance to know what a good state of consciousness is. Are there forms of subjective experience which — in a strictly normative sense — are better than others? Or worse? What states of consciousness should be illegal? What states of consciousness do we want to foster and cultivate and integrate into our societies? What states of consciousness can we force upon animals — for instance, in consciousness research itself? What states of consciousness do we want to show our children? And what state of consciousness do we eventually die in ourselves?
2007 has seen the rise of an important new discipline: "neuroethics". This is not simply a new branch of applied ethics for neuroscience — it raises deeper issues about selfhood, society and the image of man. Neuroscience is now quickly being transformed into neurotechnology. I predict that parts of neurotechnology will turn into consciousness technology. In 2002, out-of-body experiences were, for the first time, induced with an electrode in the brain of an epileptic patient. In 2007 we saw the first two studies, published in Science, demonstrating how the conscious self can be transposed to a location outside of the physical body as experienced, non-invasively and in healthy subjects. Cognitive enhancers are on the rise. The conscious experience of will has been experimentally constructed and manipulated in a number of ways. Acute episodes of depression can be caused by direct interventions in the brain, and they have also been successfully blocked in previously treatment-resistant patients. And so on.
Whenever we understand the specific neural dynamics underlying a specific form of conscious content, we can in principle delete, amplify or modulate this content in our minds. So shouldn’t we have a new ethics of consciousness — one that does not ask what a good action is, but that goes directly to the heart of the matter, asks what we want to do with all this new knowledge and what the moral value of states of subjective experience is?
Here is where I have changed my mind. There are no moral facts. Moral sentences have no truth-values. The world itself is silent; it just doesn’t speak to us in normative affairs — nothing in the physical universe tells us what makes an action a good action or a specific brain-state a desirable one. Sure, we all would like to know what a good neurophenomenological configuration really is, and how we should optimize our conscious minds in the future. But it looks like, in a more rigorous and serious sense, there is just no ethical knowledge to be had. We are alone. And if that is true, all we have to go by are the contingent moral intuitions evolution has hard-wired into our emotional self-model. If we choose to simply go by what feels good, then our future is easy to predict: It will be primitive hedonism and organized religion.
MARC
D. HAUSER
Psychologist
and Biologist, Harvard University; Author, Moral
Minds

The
Limits Of Darwinian Reasoning
Darwin
is the man, and like so many biologists, I have benefited
from his prescient insights, handed to us 150 years
ago. The logic of adaptation has been a guiding engine
of my research and my view of life. In fact, it has
been difficult to view the world through any other
filter. I can still recall with great vividness the
day I arrived in Cambridge, in June 1992, a few months
before starting my job as an assistant professor at
Harvard. I was standing on a street corner, waiting
for a bus to arrive, and noticed a group of pigeons
on the sidewalk. There were several males displaying,
head bobbing and cooing, attempting to seduce the females.
The females, however, were not paying attention. They
were all turned, in Prussian soldier formation, out
toward the street, looking at the middle of the intersection
where traffic was whizzing by. There, in the intersection,
was one male pigeon, displaying his heart out. Was
this guy insane? Hadn’t he read the handbook
of natural selection? Dude, it’s about survival.
Get out of the street!!!
Further
reflection provided the solution to this apparently
mutant male pigeon. The logic of adaptation requires
us to ask about the costs and benefits of behavior,
trying to understand what the fitness payoffs might
be. Even for behaviors that appear absurdly deleterious,
there is often a benefit lurking. In the case of our
apparently suicidal male pigeon, there was a benefit,
and it was lurking in the females’ voyeurism,
their rubber necking. The females were oriented toward
this male, as opposed to the conservative guys on the
sidewalk, because he was playing with danger, showing
off, proving that even in the face of heavy traffic,
he could float like a butterfly and sting like a bee,
jabbing and jiving like the great Muhammad Ali.
The
theory comes from the evolutionary biologist Amotz
Zahavi who proposed that even costly behaviors that
challenge survival can evolve if they have payoffs
to genetic fitness; these payoffs arrive in the currency
of more matings, and ultimately, more babies. Our male
pigeon was showing off his handicap. He was advertising
to the females that even in the face of potential costs
from Hummers and Beamers and Buses, he was still walking
the walk and talking the talk. The females were hooked,
mesmerized by this extraordinarily macho male. Handicaps
evolve because they are honest indicators of fitness.
And Zahavi’s theory represents the intellectual
descendant of Darwin’s original proposal.
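The arithmetic behind the handicap principle can be put in miniature as a payoff comparison. In the toy sketch below (the numbers are invented for illustration, not drawn from any study), a risky display lowers survival but multiplies matings, and because the survival cost is assumed to bite harder on a weaker male, only the strong male profits from showing off; that asymmetry is what keeps the advertisement honest.

```python
# Toy illustration of Zahavi-style costly signalling; all numbers are invented.
# Expected fitness here = survival probability * matings obtained if the male survives.
# The risky display raises matings but costs survival, and the cost is assumed to
# fall harder on a low-quality male, which is what keeps the signal honest.

def expected_matings(survival: float, matings_if_alive: float) -> float:
    return survival * matings_if_alive

males = [
    # (label, baseline survival, survival cost of displaying in traffic)
    ("high-quality male", 0.9, 0.3),
    ("low-quality male", 0.6, 0.5),
]

for label, base_survival, display_cost in males:
    safe = expected_matings(base_survival, matings_if_alive=1.0)                  # court on the sidewalk
    risky = expected_matings(base_survival - display_cost, matings_if_alive=3.0)  # court in the intersection
    choice = "the risky display" if risky > safe else "safe courtship"
    print(f"{label}: safe={safe:.2f}, risky={risky:.2f} -> {choice} pays")
```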
I
must admit, however, that in recent years, I have made
less use of Darwin’s adaptive logic. It is not
because I think that the adaptive program has failed,
or that it can’t continue to account for a wide
variety of human and animal behavior. But with respect
to questions of human and animal mind, and especially
some of the unique products of the human mind — language,
morality, music, mathematics — I have, well,
changed my mind about the power of Darwinian reasoning.
Let
me be clear about the claim here. I am not rejecting
Darwin’s emphasis on comparative approaches,
that is, the use of phylogenetic or historical data.
I still practice this approach, contrasting the abilities
of humans and animals in the service of understanding
what is uniquely human and what is shared. And I still
think our cognitive prowess evolved, and that the human
brain and mind can be studied in some of the same ways
that we study other bits of anatomy and behavior. But
where I have lost the faith, so to speak, is in the
power of the adaptive program to explain or predict
particular design features of human thought.
Although
it is certainly reasonable to say that language, morality
and music have design features that are adaptive, that
would enhance reproduction and survival, evidence for
such claims is sorely missing. Further, for those who
wish to argue that the evidence comes from the complexity
of the behavior itself, and the absurdly low odds of
constructing such complexity by chance, these arguments
just don’t cut it with respect to explaining
or predicting the intricacies of language, morality,
music or many other domains of knowledge.
In
fact, I would say that although Darwin’s theory
has been around, and readily available for the taking
for 150 years, it has not advanced the fields of linguistics,
ethics, or mathematics. This is not to say that it
can’t advance these fields. But unlike the areas
of economic decision making, mate choice, and social
relationships, where the adaptive program has fundamentally
transformed our understanding, the same cannot be
said for linguistics, ethics, and mathematics. What
has transformed these disciplines is our growing understanding
of mechanism, that is, how the mind represents the
world, how physiological processes generate these representations,
and how the child grows these systems of knowledge.
Bidding
Darwin adieu is not easy. My old friend has served
me well. And perhaps one day he will again. Until then,
farewell.
ROBERT
PROVINE
Psychologist
and Neuroscientist, University of Maryland; Author, Laughter

In
Praise of Fishing Expeditions
Mentors,
paper referees and grant reviewers have warned me
on occasion about scientific "fishing expeditions," the
conduct of empirical research that does not test
a specific hypothesis or is not guided by theory.
Such "blind empiricism" was said to be
unscientific, to waste time and produce useless data.
Although I have never been completely convinced of
the hazards of fishing, I now reject them outright,
with a few reservations.
I'm
not advocating the collection of random facts, but
the use of broad-based descriptive studies to learn
what to study and how to study it. Those who fish
learn where the fish are, their species, number and
habits. Without the guidance of preliminary descriptive
studies, hypothesis testing can be inefficient and
misguided. Hypothesis testing is a powerful means
of rejecting error — of trimming the dead limbs
from the scientific tree — but it does not
generate hypotheses or signify which are worthy of
test. I'll provide two examples from my experience.
In graduate school, I became intrigued with neuroembryology
and wanted to introduce it to developmental psychology,
a discipline that essentially starts at birth. My dissertation
was a fishing expedition that described embryonic behavior
and its neurophysiological mechanism. I was exploring
uncharted waters and sought advice by observing the
ultimate expert, the embryo. In this and related work,
I discovered that prenatal movement is the product
of seizure-like discharges in the spinal cord (not
the brain), that the spinal discharges occurred spontaneously
(not a response to sensory stimuli), that the function
of movement was to sculpt joints (not to shape
postnatal behavior such as walking), and to regulate the
number of motor neurons. Remarkable!
But
decades later, this and similar work is largely unknown
to developmental psychologists who have no category
for it. The traditional psychological specialties
of perception, learning, memory, motivation and the
like, are not relevant during most of the prenatal
period. The finding that embryos are profoundly unpsychological
beings guided by unique developmental priorities
and processes is not appreciated by theory-driven
developmental psychologists. When the fishing expedition
indicates that there is no appropriate spot in the
scientific filing cabinet, it may be time to add
another drawer.
Years
later and unrepentant, I embarked on a new fishing
expedition, this time in pursuit of the human universal
of laughter — what it is, when we do it, and
what it means. In the spirit of my embryonic research,
I wanted the expert to define my agenda—a laughing
person. Explorations about research funding with
administrators at a federal agency were unpromising.
One linguist patiently explained that my project "had
no obvious implications for any of the major theoretical
issues in linguistics." Another, a speech
scientist, noted that "laughter isn't speech,
and therefore had no relevance to my agency's mission."
Ultimately,
this atheoretical and largely descriptive work provided
many surprises and counterintuitive findings. For
example, laughter, like crying, is not consciously
controlled, contrary to literature suggesting that
we speak ha-ha as we would choose a word in speech.
Most laughter is not a response to humor. Laughter
and speech are controlled by different brain mechanisms,
with speech dominating laughter. Contagious laughter
is the product of neurologically programmed social
behavior. Contrasts between chimpanzee and human
laughter reveal why chimpanzees can't talk (inadequate
breath control), and the evolutionary event necessary
for the selection for human speech (bipedality).
Whether embryonic behavior or laughter, fishing expeditions
guided me down the appropriate empirical path, provided
unanticipated insights, and prevented flights of theoretical
fancy. Contrary to lifelong advice, when planning a
new research project, I always start by going fishing.
TODD E. FEINBERG, M.D.
Professor
of Psychiatry and Neurology, Albert Einstein College
of Medicine; Author, Altered
Egos 
Soul
Searching
For most of my life I viewed any notion of the "soul" a
fanciful religious invention. I agreed with the
view of the late Nobel Laureate Francis Crick
who in his book The Astonishing Hypothesis claimed "A
modern neurobiologist sees no need for the religious
concept of a soul to explain the behavior of
humans and other animals." But is the idea
of a soul really so crazy and beyond the limits
of scientific reason?
From
the standpoint of neuroscience, it is easy to make
the claim that Descartes is simply wrong about the
separateness of brain and mind. The plain fact is
that there is no scientific evidence that a self,
an individual mind, or a soul could exist without
a physical brain. However, there are persisting reasons
why the self and the mind do not appear to be identical
with, or entirely reducible to, the brain.
For
example, in spite of the claims of Massachusetts
physician Dr. Duncan MacDougall, who estimated through
his experiments on dying humans that approximately
21 grams of matter — the presumed weight of
the human soul — was lost upon death (The
New York Times "Soul Has Weight, Physician
Thinks" March 11, 1907), unlike the brain, the
mind cannot be objectively observed, but only subjectively
experienced. The subject that represents the "I" in
the statement "I think therefore I am" cannot
be directly observed, weighed, or measured. And the
experiences of that self, its pains and pleasures,
sights and sounds possess an objective reality only
to the one who experiences them. In other words,
as the philosopher John Searle puts it, the mind
is "irreducibly first-person."
On
the other hand, although there are many perplexing
properties about the brain, mind, and the self that
remain to be scientifically explained — subjectivity
among them — this does not mean that there
must be an immaterial entity at work that explains
these mysterious features. Nonetheless, I have come
to believe that an individual consciousness represents
an entity that is so personal and ontologically unique
that it qualifies as something that we might as well
call "a soul."
I
am not suggesting that anything like a soul survives
the death of the brain. Indeed, the link between
the life of the brain and the life of the mind is
irreducible, the one completely dependent upon the
other. The danger of capturing the beauty
and mystery of a personal consciousness and identity
with the somewhat metaphorical designation "soul" is
the tendency for the grandiose metaphor to obscure
the actual accomplishments of the brain. The soul
is not a "thing" independent of the living
brain; it is part and parcel of it, its most remarkable
feature, but nonetheless inextricably bound to its
life and death.