


2008

"WHAT HAVE YOU CHANGED YOUR MIND ABOUT?"

GEORGE B. DYSON
Science Historian; Author, Project Orion

Russian America

Russians arrived on the western shores of North America after crossing their Eastern Ocean in 1741. After an initial period of exploration, they settled down for a full century until relinquishing their colonies to the United States. From 1799 to 1867, the colonies were governed by the Russian-American Company, a for-profit monopoly chartered under the deathbed instructions of Catherine the Great.

The Russian-American period has been treated unkindly by historians from both sides. Soviet-era accounts, though acknowledging the skill and courage of Russian adventurers, saw this Tsarist experiment at building a capitalist, American society as fundamentally flawed, casting the native Aleuts as exploited serfs. American accounts, glossing over our own subsequent exploitation of Alaska's indigenous population and natural resources, sought to emphasize that we liberated Alaska from Russian overseers who were worse, and would never be coming back.

Careful study of primary sources has convinced me that these interpretations are not supported by the facts. The Aleutian archipelago was a spectacularly rich environment with an unusually dense, thriving population whose physical and cultural well-being was devastated by contact with European invaders. But, as permanent colonists, the Russians were not so bad. The results were closer to the settlement of Greenland by Denmark than to our own settlement of the American West.

Although there was sporadic conflict with the native population during the initial decades leading up to the consolidation of the Russian-American Company (frequently disastrous to the poorly armed and vastly outnumbered Russians), the colonies soon entered a relatively stable state based on cooperation, intermarriage, and official policies that provided social status, education, and professional training to children of mixed Aleut-Russian birth. Within a generation or two the day-to-day administration of the Russian-American colonies was largely in the hands of native-born Alaskans. As exemplified by the Russian adoption and adaptation of the Aleut kayak, or baidarka, many indigenous traditions and technologies (including sea otter hunting techniques and the working of native copper deposits) were taken up by the new arrivals, reversing the usual trend in colonization, in which indigenous technologies are replaced.

The Russians instituted public education, preservation of the Aleut language through transliteration of religious and other texts into Aleut via an adaptation of the Cyrillic alphabet, vaccination of the native population against smallpox, and science-based sea mammal conservation policies that were far ahead of their time. There were no such things as "reservations" for the native population in Russian America, and we owe as much to the Russians as to the Alaska Native Claims Settlement Act of 1971 that this remains true today.

The lack of support for the colonies by the home government (St. Petersburg was half a world away, and Empress Catherine's instructions a fading memory) eventually forced the sale to the United States, but also necessitated the resourcefulness and local autonomy that made the venture a success.

Russian America was a social and technological experiment that worked, until political compromises brought the experiment to a halt.


JUAN ENRIQUEZ
CEO, Biotechonomy; Founding Director, Harvard Business School's Life Sciences Project; Author, The Untied States of America

The source of long term power

Having grown up in Mexico, I took a long time to understand what true power is and where it comes from. You see, I had great role models; perhaps that was part of the problem. Throughout the developing world, as in Lansing and Iowa, there are many smart, tough, hardworking folk who still think power resides in today’s politics, unions, punditry, faith, painting, poetry, literature, agriculture, manufacturing, or architecture. Each sphere can immortalize or destroy individuals or regions. But, long term, to paraphrase Darwin, the only way for a country to survive and thrive is to adapt and adopt.

The problem is that there is a winner’s bias. To paraphrase a winemaker, tradition is an experiment that worked. A religion originally thrives because adherents find it improves their lives. Sometimes faith is a way to cope with the horror of the present and to improve health and survival. (Might there be a regional basis for the overlap between Kosher and Halal restrictions?) But most religions, and most countries, forget that they became powerful by continuously experimenting, learning, tweaking, improving. They begin to ossify myths and traditions. As others grow and thrive, they begin to fear change. They celebrate the past, becoming more nativist and xenophobic.

The rest of the world does not wait. It keeps gathering data. It keeps changing minds and methods. Successful religions and countries evolve new testaments and beliefs from the foundations of the old. The alternative is to merge, fragment, become irrelevant, go extinct. Museum basements are full of statues of once all powerful emperors and Gods that demanded, and got, blood sacrifices.

Who is truly powerful over the long term? Those running most of the successful experiments.  And nowhere does this happen faster, more effectively, and more often today than in science-related endeavors. True power flows primarily from science and knowledge.

As we double the amount of data generated by our species over the course of the next five years, universities and science-driven businesses are the key drivers of new experiments. You can find many examples of fast-growing countries that are rich and poor, North and South, communist, capitalist, and socialist. But all have a core of great science schools and tech entrepreneurs. Meanwhile much of Latin America lacks even Silicon Valley wannabes. Start-ups are rare. Serial entrepreneurs sell cheap Chinese imports on sidewalks instead of dreaming up IPOs. Scientists often earn less than accountants. Real growth has been absent for decades.

Few governments understand how quickly they must change, adopt, teach, and celebrate the new. (Never mind religions.) It is no coincidence that some of the fastest-growing regions today were either isolated or largely irrelevant a few decades ago. Singapore, Ireland, China, India, and Korea were considered basket cases. But those with little to lose sometimes risk a new strategy.

Who eventually survives will be largely driven by understanding and applying digital and life code, by creating robots and nanomaterials, by working with inconceivably large data sets, and by upgrading our brains. Meanwhile many U.S. leaders proudly proclaim their disbelief in evolution and their scant knowledge of science. They reflect a core of scared voters experiencing massive disruption and declining wages; that core fears elite education, science, immigrants, open borders, and above all rapid change. As income and knowledge gaps widen, many fall further and further behind; many grow to hate an open, knowledge-driven economy. Change is rejected, blocked, vilified.

It took me a long time to shift focus from the politics, art, literature, and concerns of today towards the applied science of tomorrow. But had I not done that, I would have found it much harder to understand which countries are likely to succeed and which could disappear. And the disappearance and fragmentation of whole nations is an ever more common phenomenon. Without the ability to adapt to and adopt science-driven change, no matter what type of government, geography, ethnicity, or historic tradition you have, you will find that power devolves… even in the most powerful of empires.


REBECCA GOLDSTEIN
Philosopher, Harvard University; Author, Betraying Spinoza

Falsifiability

Edge’s question this year wittily refers to a way of demarcating science from philosophy and religion. “When thinking changes your mind, that’s philosophy . . . . When facts change your mind, that’s science.” Behind the witticism lies the important claim that science—or more precisely, scientific theories—can be clearly distinguished from all other theories, that scientific theories bear a special mark, and that this mark is falsifiability. Said Popper: The criterion of the scientific status of a theory is its falsifiability.

For most scientists, this is all they need to know about the philosophy of science. It was bracing to come upon such a clear and precise criterion for identifying  scientific theories. And it was gratifying to see how Popper used it to discredit the claims that  psychoanalysis and Marxism are scientific theories. It had long seemed to me that the falsifiability test was basically right and enormously useful.

But then I started to read Popper’s work carefully, to teach him in my philosophy of science classes, and to look to scientific practice to see whether his theory survives the test of falsifiability (at least as a description of how successful science gets done). And I’ve changed my mind.

For one thing, Popper’s characterization of how science is practiced—as a cycle of conjecture and refutation—bears little relation to what goes on in the labs and journals. He describes science as if it were skeet-shooting, as if the only goal of science were to prove that one theory after another is false. But just open a copy of Science. To pick a random example: “In a probabilistic learning task, A1-allele carriers with reduced dopamine D2 receptor densities learned to avoid actions with negative consequences less efficiently.” Not, “We tried to falsify the hypothesis that A1 carriers are less efficient learners, and failed.” Scientists rarely write the way that Popper says they should, and a good Popperian should recognize that the Master may have over-simplified the logic of theory testing.

Also, scientists don’t, and shouldn’t, jettison a theory as soon as a disconfirming datum comes in. As Francis Crick once said, “Any theory that can account for all of the facts is wrong, because some of the facts are always wrong.” Scientists rightly question a datum that appears to falsify an elegant and well-supported theory, and they rightly add assumptions and qualifications and complications to a theory as they learn more about the world. As Imre Lakatos, a less-cited (but more subtle) philosopher of science, points out, all scientific theories are unfalsifiable. The ones we take seriously are those that lead to “progressive” research programs, where a small change accommodates a large swath of past and future data. And the ones we abandon are those that lead to “degenerate” ones, where the theory gets patched and re-patched at the same rate as new facts come in.

Another problem with the falsifiability criterion is that I have seen it  become a blunt instrument, unthinkingly applied. Popper tried to use it to discredit not only Marxism and Freudianism as scientific theories but also Darwin’s theory of natural selection—a position that only a creationist could hold today. I have seen scientists claim that major theories in contemporary cosmology and physics are not “science” because they can’t think of a simple test that would falsify them. You’d think that when they are faced with a conflict between what scientists really do and their memorized Popperian sound-bite about how science ought to be done, they might question the sound bite, and go back and learn more than a single sentence from the philosophy of science. But such is the godlike authority of Popper that his is the one theory that can never be falsified!

Finally, I’ve come to think that identifying scientificality with falsifiability lets certain non-scientific theories off the hook, by saying that we should try to find good reasons to believe whether a theory is true or false only when that theory is called “science.” It allows believers to protect their pet theories by saying that they can’t be, and shouldn’t be, subject to falsification, just because they’re clearly not scientific theories. Take the theory that there’s an omnipotent, omniscient, beneficent God. It may not be a scientific hypothesis, but it seems to me to be eminently falsifiable; in fact, it seems to have been amply falsified.   But because falsifiability is seen as demarcating the scientific, and since theism is so clearly not scientific, believers in religious ideologies get a free pass. The same is true for many political ideologies. The parity between scientific and nonscientific ideas is concealed by thinking that there’s a simple test that distinguishes science from nonscience, and that that test is falsifiability.


EDUARDO PUNSET
Scientist; Spanish Television Presenter; Author, The Happiness Trip

The soul is in the brain

All right, we knew it. But now we have the whole picture of the molecular process through which past and future link; how the germinal soul, rooted in brain matter and memory, allows for new perceptions, for the future, to emerge. It is both simple and terrifying at the same time.

When the mind is challenged from the outside universe, it searches its accumulated archives in order to make sense of the new stimulus. This screening of our memory, of our past, produces an immediate answer: the new stimulus either leaves us indifferent, or else it blooms into an emotion of love, of pleasure, or of sheer curiosity. These are the three touchstones of creativity. So basically, science has discovered that at the very beginning, at least, only the past matters. And that holds true also of our future creativity.

Then a process more akin to alchemy than science sparks off and develops into social intelligence. The imitation process, based on mirror neurons, interacts with the corpus of accumulated knowledge (of one's own species, and of others) which, combined with a good stock of well-preserved individual memory, explodes into new thinking.

Until very recently, we were missing a fundamental step in the process of knowledge, namely, how to transform short-term memory into long-term knowledge. At last we are taking into account the detailed content of durability: the specific proteins without which there is no learning and affection in childhood, no schooling at a later stage, no socialization in adult life. The roots are in the past; but there is no knowledge if we hide in a cave alone, with no windows to peer from and no shadows dancing outside.

The past has to be worked upon from the outside in order to transform into the future, and this has brought about the second main discovery in the molecular process of creativity. The so-called “technology transfer” from old to new generations is a two-way process: matter, mind, soul, past, memory, future, and also startling new ways of looking at old things, are all marvellously intertwined in the evolutionary process.


JOHN ALLEN PAULOS
Professor of Mathematics, Temple University, Philadelphia; Author, Irreligion: A Mathematician Explains Why the Arguments for God Just Don't Add Up

The Convergence of Belief Change

I've changed my mind about countless matters, but most if not all such changes have been vaguely akin to switching from brand A to brand B. In some deep sense, however, I feel that I've never really changed how I think about or evaluate things. This may sound like a severe case of cerebral stenosis, but I think the condition is universal. Although people change their minds, they do so in an invariant, convergent sort of way, and I find this to be of more interest than the brand switches, important as they sometimes are.

I take heart that this stance can be viewed as something other than stubborn rigidity from the so-called Agreement Theorem of Nobel Prize-winning game-theorist Robert Aumann. His theorem can be roughly paraphrased as follows: Two individuals cannot forever agree to disagree.

An important definition allows for a slightly fuller statement. Information is termed "common knowledge" among a group of people if all parties know it, know that the others know it, know that the others know they know it, and so on. It is much more than "mutual knowledge," which requires only that the parties know the particular bit of information, not that they be aware of the others' knowledge.

Aumann showed that as agents' beliefs, formed in rational response to different bits of private information, gradually become common knowledge, their beliefs change and eventually coincide.
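
The essay states the theorem but not the mechanics of how beliefs actually converge. Below is a minimal sketch in the spirit of Geanakoplos and Polemarchakis's "We Can't Disagree Forever" dialogue, which accompanies Aumann's result; every concrete detail (the nine states, the uniform prior, the two partitions, the event, the true state) is an invented toy example, not anything from Aumann or Paulos.

```python
# Two agents share a common prior over a small state space but hold different
# private information (a partition each). In turn, each announces their
# posterior probability of an event; every announcement lets everyone rule out
# states in which that announcement would have been different, shrinking the
# commonly known set until the announced posteriors coincide.
# All specific numbers below are illustrative assumptions.

from fractions import Fraction

STATES = range(1, 10)                              # states 1..9, uniform prior
EVENT = {2, 3, 5, 7}                               # the event under discussion
PARTITION_A = [{1, 2, 3}, {4, 5, 6}, {7, 8, 9}]    # agent A's private information
PARTITION_B = [{1, 2, 3, 4}, {5, 6, 7, 8}, {9}]    # agent B's private information
TRUE_STATE = 5

def cell(partition, state):
    """The block of the partition containing the given state."""
    return next(block for block in partition if state in block)

def posterior(info_set):
    """P(EVENT | info_set) under the uniform common prior."""
    return Fraction(len(EVENT & info_set), len(info_set))

def dialogue(true_state, max_rounds=20):
    common = set(STATES)                           # what is commonly known so far
    partitions = {"A": PARTITION_A, "B": PARTITION_B}
    for t in range(max_rounds):
        changed = False
        for name, part in partitions.items():
            q = posterior(cell(part, true_state) & common)
            print(f"round {t}: agent {name} announces P(E) = {q}")
            # Everyone discards states in which this agent would have said otherwise.
            new_common = {w for w in common
                          if posterior(cell(part, w) & common) == q}
            changed |= (new_common != common)
            common = new_common
        if not changed:                            # announcements are now common knowledge
            break
    return {name: posterior(cell(part, true_state) & common)
            for name, part in partitions.items()}

print("final posteriors:", dialogue(TRUE_STATE))   # equal, per Aumann's theorem
```

Run as written, agent A opens at 1/3, agent B replies with 1/2, and both settle on a common 1/2: each announcement shrinks what is commonly known until disagreement is no longer possible.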

Thus in whatever rational ways each of us comes to change his or her mind, in the long run the rest of us will follow suit. Of course, as Keynes observed, in the long run, we're all dead.

Another problem is that Aumann's result doesn't say anything about the convergence of the beliefs of irrational agents.


LEO CHALUPA
Ophthalmologist and Neurobiologist, University of California, Davis

Brain Plasticity

The hottest topic in neuroscience today is brain plasticity. This catchphrase refers to the fact that various types of experience can significantly modify key attributes of the brain. This field began decades ago by focusing on how different aspects of the developing brain could be impacted by early rearing conditions. 

More recently, the field of brain plasticity has shifted to studies demonstrating a remarkable degree of change in the connections and functional properties of mature and even aged brains. Thousands of published papers have now appeared on this topic, many by reputable scientists, and this has led to a host of books, programs, and even commercial enterprises touting the malleability of the brain with “proper” training. One is practically made to feel guilty for not taking advantage of this thriving store of information to improve one’s own brain or those of one’s children and grandchildren.

My field of research is developmental neurobiology and I used to be a proponent of the potential benefits documented by brain plasticity studies. I am still of the opinion that brain plasticity is a real phenomenon, one that deserves further study and one that could be utilized to better human welfare. But my careful reading of this literature has tempered my initial enthusiasm.

For one thing, those selling a commercial product are making many of the major claims for the benefits of brain exercise regimes. It is also the case that my experiences outside the laboratory have caused me to question the limitless potential of brain plasticity advocated by some devotees.

Case in point: Recently I had the chance to meet someone I had not seen since childhood. The person had changed physically beyond all recognition, as might be expected. Yet after spending some time with this individual, his personality traits of long ago became apparent, including a rather peculiar laugh I remember from grade school.

Another case in point: a close colleague had a near-fatal car accident, one that left him in a coma for many days and in intensive care for weeks thereafter. Shortly after returning from his ordeal, this Type A personality changed into a seemingly mellow and serene person. But in less than two months, even before the physical scars of his accident had healed, he was back to his old driven self.

For a working scientist to invoke anecdotal experience to question a scientific field of endeavor is akin to heresy. But it seems to me that it is foolish to simply ignore what one has learned from a lifetime of experiences. The older I get the more my personal interactions convince me that a person’s core remains remarkably stable in spite of huge experiential variations. With all the recent emphasis on brain plasticity, there has been virtually no attempt to explain the stability of the individual’s core attributes, values and beliefs. 

Here is a real puzzle to ponder: every cell in your body, including each of the 100 billion neurons in your brain, is in a constant process of breakdown and renewal. Your brain is different from the one you had a year or even a month ago, even without special brain exercises. So how is the constancy of one’s persona maintained? The answer to that question offers a far greater challenge to our understanding of the brain than the currently fashionable field of brain plasticity.


SCOTT ATRAN
Anthropologist, University of Michigan; Author, In Gods We Trust

The Religious Politics of Fictive Kinship

I am an anthropologist who has traveled to many places and met many different kinds of people. I try to know what it is like to be someone very different from me in order to better understand what it means to be human. But it is only in the last few years that my thinking has deeply changed on what drives major differences between animal and human behavior, such as willingness to kill and die for a cause.

I once thought that individual cognition and personality, influences from broad socio-economic factors, and degree of devotion to religious or political ideology were determinant. Now I see friendship and other aspects of small-group dynamics, especially acting together, trumping most everything else.

Here's an anecdote that kick-started my thinking about this.

While preparing a psychological experiment on the limits of rational choice with Muslim mujahedin on the Indonesian island of Sulawesi, I noticed tears welling in the eyes of my traveling companion and bodyguard, Farhin (who had earlier hosted 9/11 mastermind Khalid Sheikh Muhammed in Jakarta and helped to blow up the Philippine ambassador's residence). Farhin had just heard that a young man had recently been killed in a skirmish with Christian fighters.

"Farhin," I asked, "you knew the boy?"

"No," he said, "but he was only in the jihad a few weeks. I've been fighting since Afghanistan [late 1980s] and still not a martyr."

I tried consoling with my own disbelief, "But you love your wife and children."

"Yes," he nodded sadly, "God has given this, and I must have faith in His way."

I had come to the limits of my understanding of the other. There was something in Farhin that was incalculably different from me yet almost everything else was not.

"Farhin, in all those years, after you and the others came back from Afghanistan, how did you stay a part of the Jihad?" I asked.

I expected him to tell me about his religious fervor and devotion to a Great Cause.

"The (Indonesian) Afghan Alumni never stopped playing soccer together," he replied matter-of-factly, "that's when we were closest together in the camp." He smiled, "except when we went on vacation to fight the communists, we played soccer and remained brothers."

Maybe people don't kill and die simply for a cause. They do it for friends — campmates, schoolmates, workmates, soccer buddies, body-building buddies, pin-ball buddies — who share a cause. Some die for dreams of jihad — of justice and glory — but nearly all in devotion to a family-like group of friends and mentors, of "fictive kin."

Then it became embarrassingly obvious: it is no accident that nearly all religious and political movements express allegiance through the idiom of the family — Brothers and Sisters, Children of God, Fatherland, Motherland, Homeland, and the like. Nearly all such movements require subordination, or at least assimilation, of any real family (genetic kinship) to the larger imagined community of "Brothers and Sisters." Indeed, the complete subordination of biological loyalty to ideological loyalty for the Ikhwan, the "Brotherhood" of the Prophet, is Islam's original meaning, "Submission."

My research team has analyzed every attack by Farhin and his friends, who belong to Southeast Asia's Jemmah Islamiyah (JI). I have interviewed key JI operatives (including co-founder, Abu Bakr Ba'asyir) and counterterrorism officials who track JI. Our data show that support for suicide actions is triggered by moral outrage at perceived attacks against Islam and sacred values, but this is converted to action as a result of small world factors. Out of millions who express sympathy with global jihad, only a few thousand show willingness to commit violence. They tend to go to violence in small groups consisting mostly of friends, and some kin. These groups arise within specific "scenes": neighborhoods, schools (classes, dorms), workplaces and common leisure activities (soccer, mosque, barbershop, café, online chat-rooms).

Three other examples:

1. In Al Qaeda, about 70 percent join with friends, 20 percent with kin. Our interviews with friends of the 9/11 suicide pilots reveal they weren't "recruited" into Qaeda. They were Middle Eastern Arabs isolated in a Moroccan Islamic community in a Hamburg suburb. Seeking friendship, they started hanging out after mosque services, in local restaurants and barbershops, eventually living together when they self-radicalized. They wanted to go to Chechnya, then Kosovo, only landing in a Qaeda camp in Afghanistan as a distant third choice.

2. Five of the seven plotters in the 2004 Madrid train bombing who blew themselves up when cornered by police grew up in the tumble-down neighborhood of Jemaa Mezuaq in Tetuan, Morocco. In 2006, at least five more young Mezuaq men went to Iraq on "martyrdom missions." One in the Madrid group was related to one in the Iraq group by marriage; each group included a pair of brothers. All went to the same elementary school, all but one to the same high school. They played soccer as friends, went to the same mosque, mingled in the same cafes.

3. Hamas's most sustained suicide bombing campaign in 2003 (Hamas suspended bombings in 2004) involved seven soccer buddies from Hebron's Abu Katila neighborhood, including four kin (Kawasmeh clan).

Social psychology tends to support the finding that "groupthink" often trumps individual volition and knowledge, whether in our society or any other. But for Americans bred on a constant diet of individualism the group is not where one generally looks for explanation. This was particularly true for me, but the data caused me to change my mind.


MARCO IACOBONI
Neuroscientist, UCLA Brain Mapping Center; Author, Mirroring People

The eradication of irrational thinking is (not) inevitable (it will require some serious work)

Some time ago I thought that rational, enlightened thinking would eventually eradicate irrational thinking and supernatural beliefs. How could it be otherwise? Scientists and enlightened people have facts and logical arguments on their side, whereas people 'on the other side' have only unprovable beliefs and bad reasoning. I guess I was wrong, way wrong. Thirty years later, irrational thinking and supernatural beliefs are much stronger than they used to be; they permeate our society and others, and it does not seem they will go away any time soon. How is it possible? Shouldn't 'history' always move forward? What went wrong? What can we do to fix this backward movement toward the irrational?

The problem is that science still has a marginal role in our public discourse. Indeed, there are no science books on the New York Times 100 Notable Books of the Year list, no science category in the Economist Books of the Year 2007, and only Oliver Sacks in the New Yorker's list of Books From Our Pages.

Why does science have such a marginal role? I think there is more than one reason. First, scientists tend to confine themselves within well-defined, narrow boundaries. They tend not to claim any wisdom outside the confines of their specialties. By doing so, they marginalize themselves and make it difficult for science to have an impact on society. It is high time for scientists to step up and claim wisdom outside their specialty.

There are also other ways, however, to have an impact on our society: for instance, by making some changes in scientific practices. These days, scientific practice is dominated by the 'hypothesis testing' paradigm. While there is nothing wrong with hypothesis testing, it is definitely wrong to confine all of science to hypothesis testing alone. This approach precludes the study of complex, real-world phenomena, the phenomena that are important to people outside academia. It is time to perform more broad-based descriptive studies on issues that are highly relevant to our society.

Another dominant practice in science (definitely in neuroscience, my own field) is to study phenomena from an atemporal perspective. Only the timeless seems to matter to most neuroscientists. Even time itself tends to be studied from this 'platonic ideal' perspective. I guess this approach stems from the general tendency of science to adopt the detached 'view from nowhere,' as Thomas Nagel puts it. If there is one major thing we have learned from modern science, however, it is that there is no such thing, no 'view from nowhere.' It is time for scientists (especially neuroscientists) to commit to the study of the finite and temporal. The issues that matter 'here and now' are the issues that people relate to.

How should we do all this? One way of disseminating the scientific method in our public discourse is to use the tools and approaches of science to investigate issues that are salient to the general public. In neuroscience, we have now powerful tools that let us do this. We can study how people make decisions and form affiliations not from a timeless perspective, but from the perspective of what is salient 'here and now.' These are the kind of studies that naturally engage people. While they read about these studies, people are more likely to learn scientific facts (even the 'atemporal' ones) and to absorb the scientific method and reasoning. My hope is that by being exposed to and engaged by scientific facts, methods, and reasoning, people will eventually find it difficult to believe unprovable things.


RICHARD WRANGHAM
Professor of Biology and Anthropology, Harvard University; Coauthor (with Dale Peterson), Demonic Males: Apes and the Origins of Human Violence

The Human Recipe

Like people since even before Darwin, I used to think that human origins were explained by meat-eating. But three epiphanies have changed my mind.  I now think that cooking was the major advance that made us human.

First, an improved fossil record has shown that meat-eating arose too early to explain human origins. Significant meat-eating by our ancestors is first attested in the pre-human world of 2.6 million years ago, when hominids began to flake stones into simple knives. Around the same time there appears a fossil species variously called Australopithecus habilis or Homo habilis. These habilis presumably made the stone knives, but they were not human. They were Calibans, missing links with an intricate mixture of advanced and primitive traits. Their brains, being twice the size of ape brains, tell of incipient humanity; but as Bernard Wood has stressed, their chimpanzee-sized bodies, long arms, big guts, and jutting faces made them ape-like. Meat-eating likely explains the origin of habilis.

Humans emerged almost a million years later, when habilis evolved into Homo erectus. By 1.6 million years ago Homo erectus were the size and shape of people today. Their brains were bigger than those of habilis, and they walked and ran as fluently as we do. Their mouths were small and their teeth relatively dwarfed — a pygmy-faced hominoid, just like all later humans. To judge from the reduced flaring of their rib-cage, they had lost the capacious guts that allow great apes and habilis to eat large volumes of plant food. Equally strange for a “helpless and defenceless” species, they had also lost their climbing ability, forcing them to sleep on the ground — a surprising commitment in a continent full of big cats, sabretooths, hyenas, rhinos and elephants.

So the question of what made us human is the question of why a population of habilis became Homo erectus. My second epiphany was a double insight: humans are biologically adapted to eating cooked diets, and the signs of this adaptation start with Homo erectus. Cooked food is the signature feature of human diet. It not only makes our food safe and easy to eat, but it also grants us large amounts of energy compared to a raw diet, obviating the need to ingest big meals. Cooking softens food too, thereby making eating so speedy that as eaters of cooked food, we are granted many extra hours of free time every day.

So cooked food allows our guts, teeth and mouths to be small, while giving us abundant food energy and freeing our time. Cooked food, of course, requires the control of fire; and a fire at night explains how Homo erectus dared sleep on the ground.

 Cooked food has so many important biological effects that its adoption should be clearly marked in the fossil record by signals of a reduced digestive system and increased energy use. While such signs are clear at the origin of Homo erectus, they are not found later in human evolution. The match between the biological merits of cooked food and the evolutionary changes in Homo erectus is thus so obvious that except for a scientific obstacle, I believe it would have been noticed long ago. The obstacle is the insistence of archaeologists that the control of fire is not firmly evidenced before about a quarter of a million years ago. As a result of this archaeological caution, the idea that humans could have used fire before about 250,000 to 500,000 years ago has long been sidelined.

But I finally realized that the archaeological record decays so steadily that it gives us no information about when fire was first controlled. The fire record is better at 10,000 years than at 20,000 years; at 50,000 years than 100,000 years; at 250,000 years than 500,000 years; and so on. Evidence for the control of fire is always better when it is closer to the present, but during the course of human evolution it never completely goes away. There is only one date beyond which no evidence for the control of fire has been found: 1.6 million years ago, around the time when Homo erectus evolved. Between now and then, the erratic record tells us only one thing: the archaeological evidence is incapable of telling us when fire was first controlled. The biological evidence is more helpful. That was my third epiphany.
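
Wrangham's decay argument can be made vivid with a toy preservation model; nothing below is from the essay, and the half-life and trace-formation rate are invented numbers chosen only to show the shape of the reasoning: even under perfectly continuous fire use, the expected number of surviving traces collapses with age, so a thin record at a quarter or half a million years tells us little about when fire was first controlled.

```python
# Toy taphonomic sketch (assumed numbers, for illustration only):
# hearth traces are laid down at a steady rate and each surviving trace
# decays away with a fixed half-life.

import math

HALF_LIFE_YEARS = 20_000       # assumed preservation half-life of a fire trace
TRACES_PER_MILLENNIUM = 1_000  # assumed rate at which datable traces were created

def expected_survivors(age_years):
    """Expected surviving traces from one millennium of use, age_years ago."""
    decay = math.exp(-math.log(2) * age_years / HALF_LIFE_YEARS)
    return TRACES_PER_MILLENNIUM * decay

for age in (10_000, 20_000, 50_000, 100_000, 250_000, 500_000, 1_600_000):
    print(f"{age:>9,} years ago: ~{expected_survivors(age):.2e} surviving traces")
```

With these made-up parameters the expected count falls below a single trace by about 200,000 years ago, even though fire is assumed to have been in constant use; the sparseness of the early record is exactly what such decay predicts.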

The origin of Homo erectus is too late for meat-eating; the adoption of cooking solves the problem; and archaeology does not gainsay it. In a roast potato and a hunk of beef we have a new theory of what made us human.


SEAN CARROLL
Theoretical Physicist, Caltech

Being a Heretic is Hard Work

Growing up as a young proto-scientist, I was always strongly anti-establishmentarian, looking forward to overthrowing the System as our generation's new Galileo.  Now I spend a substantial fraction of my time explaining and defending the status quo to outsiders.  It's very depressing.

As an undergraduate astronomy student I was involved in a novel and exciting test of Einstein's general relativity — measuring the precession of orbits, just like Mercury's in the Solar System, but using massive eclipsing binary stars.  What made it truly exciting was that the data disagreed with the theory!  (Which they still do, by the way.)  How thrilling is it to have the chance to overthrow Einstein himself?  Of course there are more mundane explanations — the stars are tilted, or there is an invisible companion star perturbing their orbits — and these hypotheses were duly considered.  But I wasn't very patient with such boring possibilities — it was obvious to me that we had dealt a crushing blow to a cornerstone of modern physics, and the Establishment was just too hidebound to admit it.
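
For readers who want the quantity behind "the precession of orbits," here is a minimal sketch of the standard leading-order general-relativistic periastron advance per orbit; the notation is the textbook one and is not taken from the essay.

```latex
% Leading-order GR advance of the periastron per orbit, for a binary of
% total mass M, semi-major axis a, and orbital eccentricity e:
\Delta\omega_{\mathrm{GR}} \;=\; \frac{6\pi G M}{c^{2}\, a \,\left(1 - e^{2}\right)}
```

In eclipsing binaries the observed apsidal motion also contains classical terms from tidal and rotational distortion of the stars, which is part of why the mundane explanations mentioned above must be ruled out before claiming a genuine conflict with Einstein.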

Now I know better.  Physicists who are experts in the field tend to be skeptical of experimental claims that contradict general relativity, not because they are hopelessly encumbered by tradition, but because Einstein's theory has passed a startlingly diverse array of experimental tests.  Indeed, it turns out to be almost impossible to change general relativity in a way that would be important for those binary stars, but which would not have already shown up in the Solar System.  Experiments and theories don't exist in isolation — they form a tightly connected web, in which changes to any one piece tend to reverberate through various others.

So now I find myself cast as a defender of scientific orthodoxy — from classics like relativity and natural selection, to modern wrinkles like dark matter and dark energy.  In science, no orthodoxy is sacred, or above question — there should always be a healthy exploration of alternatives, and I have always enjoyed inventing new theories of gravity or cosmology, keeping in mind the variety of evidence in favor of the standard picture.  But there is also an unhealthy brand of skepticism, proceeding from ignorance rather than expertise, which insists that any consensus must flow from a reluctance to face up to the truth, rather than an appreciation of the evidence.  It's that kind of skepticism that keeps showing up in my email.  Unsolicited.

Heresy is more romantic than orthodoxy.  Nobody roots for Goliath, as Wilt Chamberlain was fond of saying.  But in science, ideas tend to grow into orthodoxy for good reasons.  They fit the data better than the alternatives.  Many casual heretics can't be bothered with all the detailed theoretical arguments and experimental tests that support the models they hope to overthrow — they have a feeling about how the universe should work, and are convinced that history will eventually vindicate them, just as it did Galileo.

What they fail to appreciate is that, scientifically speaking, Galileo overthrew the system from within.  He understood the reigning orthodoxy of his time better than anyone, so he was better able to see beyond it.  Our present theories are not complete, and nobody believes they are the final word on how Nature works.  But finding the precise way to make progress, to pinpoint the subtle shift of perspective that will illuminate a new way of looking at the world, will require an intimate familiarity with our current ideas, and a respectful appreciation of the evidence supporting them. 

Being a heretic can be fun; but being a successful heretic is mostly hard work.

