WHY DO SOME SOCIETIES MAKE DISASTROUS DECISIONS?

Jared Diamond [4.26.03]

What I'm going to suggest is a road map of factors in failures of group decision making. I'll divide the answers into a sequence of four somewhat fuzzily delineated categories. First of all, a group may fail to anticipate a problem before the problem actually arrives. Secondly, when the problem arrives, the group may fail to perceive the problem. Then, after they perceive the problem, they may fail even to try to solve the problem. Finally, they may try to solve it but may fail in their attempts to do so. While all this talking about reasons for failure and collapses of society may seem pessimistic, the flip side is optimistic: namely, successful decision-making. Perhaps if we understand the reasons why groups make bad decisions, we can use that knowledge as a check list to help groups make good decisions.

Honoring The Scientist As Poet
Lewis Thomas Prize Lecture
The Rockefeller University, New York City
Thursday March 27, 2003


Introduction

At the end of March, Jared Diamond was in New York to receive THE LEWIS THOMAS PRIZE Honoring the Scientist as Poet. The prize was presented to Jared by Thomas P. Sakmar, Acting President, The Rockefeller University.

"Throughout history," states the LEWIS THOMAS PRIZE literature, "scientists and poets have sought to unveil the secrets of the natural world. Their methods vary: scientists use tools of rational analysis to slake their compelling thirst for knowledge; poets delve below the surface of language, and deliver urgent communiqués from its depths. The Lewis Thomas Prize honors the rare individual who is fluent in the dialects of both realms — and who succeeds in spinning lush literary and philosophical tapestries from the silken threads of scientific and natural phenomena — providing not merely new information but cause for reflection, even revelation."

"The Lewis Thomas Prize was established in 1993 by the trustees of The Rockefeller University and presented to Lewis Thomas, its first recipient, that year. Other recipients have been François Jacob (1994), Abraham Pais (1995), Freeman Dyson (1996), Max Perutz (1997), Ernst Mayr (1998), Steven Weinberg (1999), Edward O. Wilson (2000), and Oliver Sacks (2001)."

~~~

Jared is an early and frequent contributor to Edge. In his first feature in 1997 ("Why Did Human History Unfold Differently On Different Continents For The Last 13,000 Years?") he stated:

"I've set myself the modest task of trying to explain the broad pattern of human history, on all the continents, for the last 13,000 years. Why did history take such different evolutionary courses for peoples of different continents? This problem has fascinated me for a long time, but it's now ripe for a new synthesis because of recent advances in many fields seemingly remote from history, including molecular biology, plant and animal genetics and biogeography, archaeology, and linguistics."

Underlying his task is the question of how to turn the study of history into a science. He notes the distinction between the "hard sciences" such as physics, biology, and astronomy, and what we sometimes call the "social sciences," which include history, economics, and government. The label "social science" is often used pejoratively. In particular, many of the so-called hard scientists, such as physicists or biologists, don't consider history to be a science. The situation is even more extreme because, he points out, even historians themselves don't consider history to be a science. Historians don't get training in the scientific method; they don't get training in statistics; they don't get training in the experimental method or in the problems of doing experiments on historical subjects; and they'll often say that history is not a science, that history is closer to an art.

He comes to this question as one who is accomplished in two scientific areas: physiology and evolutionary biology. The first is a laboratory science; the second is never far from history. "Biology is the science," he says. "Evolution is the concept that makes biology unique." He continues to bring together history and biology in new and interesting ways to present global accounts of the rise and fall of civilizations.

More than one million copies of the U.S. edition of Jared Diamond's Pulitzer Prize-winning Guns, Germs, and Steel: The Fates of Human Societies have now been sold. Jared hopes to deliver his much-anticipated new book, Ecocide, at the end of this year for publication in 2004.

Following the Prize Presentation, Jared delivered the Lewis Thomas Prize Lecture "Why Do Some Societies Make Disastrous Decisions?" The next morning, he stopped by to videotape a reprise of the opening of his talk which Edge is pleased to present as a streaming video along with the text of his lecture.

JB

JARED DIAMOND is Professor of Geography at the University of California, Los Angeles. Until recently he was Professor of Physiology at the UCLA School of Medicine. He is the Pulitzer Prize-winning author of the widely acclaimed Guns, Germs, and Steel: The Fates of Human Societies, which also won Britain's 1998 Rhone-Poulenc Science Book Prize.

He is also the author of two other trade books: The Third Chimpanzee, which won The Los Angeles Times Book award for the best science book of 1992 and Britain's 1992 Rhone-Poulenc Science Book Prize; and Why is Sex Fun? (ScienceMasters Series).

Dr. Diamond is the recipient of a MacArthur Foundation Fellowship ("Genius Award"); research prizes of the American Physiological Society, National Geographic Society, and Zoological Society of San Diego; and many teaching awards and endowed public lectureships. In addition, he has been elected a member of all three of the leading national scientific/academic honorary societies (National Academy of Sciences, American Academy of Arts and Sciences, American Philosophical Society).

His field experience includes 17 expeditions to New Guinea and neighboring islands to study the ecology and evolution of birds, the rediscovery of New Guinea's long-lost golden-fronted bowerbird, and other field projects in North America, South America, Africa, Asia, and Australia. As a conservationist he devised a comprehensive plan, almost all of which was subsequently implemented, for Indonesian New Guinea's national park system; he has carried out numerous field projects for the Indonesian government and the World Wildlife Fund; and he is a founding member of the board of the Society for Conservation Biology and a member of the Board of Directors of World Wildlife Fund/USA.

Jared Diamond's Edge Bio Page


 

WHY DO SOME SOCIETIES MAKE DISASTROUS DECISIONS?

[JARED DIAMOND:] Education is supposed to be about teachers imparting knowledge to students. As every teacher knows, though, if you have a good group of students, education is also about students imparting knowledge to their supposed teachers and challenging their assumptions. That's an experience that I've been through in the last couple of months, when for the first time in my academic career I gave a course to undergraduates, highly motivated UCLA undergraduates, on collapses of societies. Why is it that some societies in the past have collapsed while others have not? I was discussing famous collapses such as those of the Anasazi in the U.S. Southwest, Classic Maya civilization in the Yucatan, Easter Island society in the Pacific, Angkor Wat in Southeast Asia, Great Zimbabwe in Africa, Fertile Crescent societies, and Harappan Indus Valley societies. These are all societies that, as we've learned from archaeological discoveries over the last 20 years, hammered away at their own environments and destroyed themselves in part by undermining the environmental resources on which they depended.

For example, the Easter Islanders, a Polynesian people, settled an island that was originally forested, and whose forests included the world's largest palm tree. The Easter Islanders gradually chopped down that forest to use the wood for canoes, firewood, carving, and for transporting and raising statues; the trees also protected the topsoil against erosion. Eventually they chopped down all the forests to the point where all the tree species were extinct, which meant that they ran out of canoes, they could no longer erect statues, there were no longer trees to protect the topsoil against erosion, and their society collapsed in an epidemic of cannibalism that left 90 percent of the islanders dead. The question that most intrigued my UCLA students was one that hadn't registered on me: how on Earth could a society make such an obviously disastrous decision as to cut down all the trees on which they depended? For example, my students wondered, what did the Easter Islanders say as they were cutting down the last palm tree? Were they saying, "Think of our jobs as loggers, not of these trees"? Were they saying, "Respect my private property rights"? Surely the Easter Islanders, of all people, must have realized the consequences to them of destroying their own forest. It wasn't a subtle mistake. One wonders whether, if there are still people left alive a hundred years from now, people in the next century will be as astonished by our blindness today as we are by the blindness of the Easter Islanders.

This question, why societies make disastrous decisions and destroy themselves, is one that not only surprised my UCLA undergraduates, but also astonishes professional historians studying collapses of past societies. The most cited book on the subject of the collapse of societies is by the historian Joseph Tainter. It's entitled The Collapse of Complex Societies. Joseph Tainter, in discussing ancient collapses, rejected the possibility that those collapses might be due to environmental mismanagement, because that seemed so unlikely to him. Here's what Joseph Tainter said: "As it becomes apparent to the members or administrators of a complex society that a resource base is deteriorating, it seems most reasonable to assume that some rational steps are taken towards a resolution. With their administrative structure and their capacity to allocate labor and resources, dealing with adverse environmental conditions may be one of the things that complex societies do best. It is curious that they would collapse when faced with precisely those conditions that they are equipped to circumvent." Joseph Tainter concluded that the collapses of all these ancient societies couldn't possibly be due to environmental mismanagement, because they would never make these bad mistakes. Yet it's now clear that they did make these bad mistakes.

My UCLA undergraduates, and Joseph Tainter as well, have identified a very surprising phenomenon: failures of group decision-making on the part of whole societies, or governments, or smaller groups, or businesses, or university academic departments. The question of why group decision-making fails is related to the question of why individual decision-making fails. Individuals make bad decisions: they enter bad marriages, they make bad investments, their businesses fail. But in failures of group decision-making there are some additional factors, notably conflicts of interest among the members of the group, that don't arise with failures of individual decision-making. This is obviously a complex question; there's no single answer to it, and there are no agreed-on answers.

What I'm going to suggest is a road map of factors in failures of group decision making. I'll divide the answers into a sequence of four somewhat fuzzily delineated categories. First of all, a group may fail to anticipate a problem before the problem actually arrives. Secondly, when the problem arrives, the group may fail to perceive the problem. Then, after they perceive the problem, they may fail even to try to solve the problem. Finally, they may try to solve it but may fail in their attempts to do so. While all this talking about reasons for failure and collapses of society may seem pessimistic, the flip side is optimistic: namely, successful decision-making. Perhaps if we understand the reasons why groups make bad decisions, we can use that knowledge as a check list to help groups make good decisions.

The first item on my road map is that groups may do disastrous things because they didn't anticipate a problem before it arrived. There may be several reasons for failure to anticipate a problem. One is that they may have had no prior experience of such problems, and so may not have been sensitized to the possibility. For example, consider forest fires in the U.S. West. My wife, my children, and I spend parts of our summers in Montana, and each year as our plane is coming in to Montana I look out the window to see how many forest fires are burning out there today. Forest fires are a major problem not only in Montana but throughout the U.S. Intermontane West in general. Forest fires on that giant scale are unknown in the eastern United States and in Europe. When settlers from the eastern United States and Europe arrived in Montana and a forest fire arose, their reaction was, of course, that you should try to put out the fire. The motto of the U.S. Forest Service for nearly a century was that every forest fire should be put out by 10:00 AM of the morning after the day on which it was reported. That attitude of easterners and Europeans toward forest fires arose because they had had no previous experience of fires in a dry environment with a big buildup of fuel, where trees that fall into the understory don't rot away as they do in wet Europe and the wet eastern United States, but accumulate. It turns out that frequent small fires burn off the fuel load, and if you suppress those frequent small fires, then when eventually a fire is lit it may burn out of control far beyond one's ability to suppress it, resulting in the big disastrous fires of the U.S. Intermontane West. It turns out that the best way to deal with forest fires in the West is to let them burn, and burn out, so that there won't be a buildup of a fuel load resulting in a disaster. But these huge forest fires were something with which eastern Americans and Europeans had no prior experience. The idea that you should let a fire burn, and destroy valuable forest, was so counterintuitive that it took the U.S. Forest Service a hundred years to recognize the problem, change its strategy, and let fires burn. So here's an example of how a society with no prior experience of a problem may not even recognize the problem: the problem of fuel loads in the understory of a dry forest.

That's not the only reason, though, why a society may fail to anticipate a problem before it actually arises. Another reason is that they may have had prior experience, but that prior experience has been forgotten. For example, a nonliterate society is not going to preserve oral memories of something that happened long in the past. The Classic Lowland Maya eventually succumbed to a drought around A.D. 800. There had been previous droughts in the Maya realm, but they could not draw on that prior experience, because although the Maya had some writing, it preserved only the conquests of kings and didn't record droughts. Maya droughts recur at intervals of about 208 years, so the Maya of A.D. 800, when the big drought struck, did not and could not remember the drought of A.D. 592.

In modern literate societies, even though we do have writing, that does not necessarily mean that we can draw on our prior experience. We, too, tend to forget things, and so, for example, Americans have recently behaved as if they had forgotten about the 1973 Gulf oil crisis. For a year or two after the crisis they avoided gas-guzzling vehicles, then quickly forgot that lesson, despite having writing. Again, in the 1960s the city of Tucson, Arizona, went through a severe drought, and the citizens swore that they would manage their water better after that, but within a decade or two Tucson had gone back to its water-guzzling ways of building golf courses and watering gardens. So there we have a couple of reasons why a society may fail to anticipate a problem before it has arrived.

The remaining reason why a society may fail to anticipate a problem before it develops involves reasoning by false analogy. When we are in an unfamiliar situation, we fall back on reasoning by analogy with old familiar situations. That's a good way to proceed if the old and new situations are truly analogous, but reasoning by analogy can be dangerous if the old and new situations are only superficially similar.

An example of a society that suffered disastrous consequences of reasoning by false analogy was the society of Norwegian Vikings who immigrated to Iceland beginning in the year A.D. 871. Their familiar homeland of Norway has heavy clay soils ground up by glaciers. Those soils are sufficiently heavy that, if the vegetation covering them is cut down, they are too heavy to be blown away. Unfortunately for the Viking colonists of Iceland, Icelandic soils are as light as talcum powder. They arose not through glacial grinding, but from light ash blown out in volcanic eruptions and carried by the winds. The Vikings cleared the forests over those soils in order to create pasture for their animals. Unfortunately, ash that was light enough for the wind to blow in was light enough for the wind to blow out again once the covering vegetation had been removed. Within a few generations of the Vikings' arrival in Iceland, half of Iceland's topsoil had eroded into the ocean. Other examples of reasoning by false analogy abound.

The second step in my road map, after a society has anticipated or failed to anticipate a problem before it arises, involves a society's failing to perceive a problem that has actually arrived. There are at least three reasons for such failures, all of them common in the business world and in academia. First, the origins of some problems are literally imperceptible. For example, the nutrients responsible for soil fertility are invisible to the eye, and only in modern times have they become measurable by chemical analysis. In Australia, Mangareva, parts of the U.S. Southwest, and many other locations, most of the nutrients had already been leached out of the soil by rainfall. When people arrived and began growing crops, those crops quickly exhausted the remaining nutrients, so that agriculture rapidly failed. Yet such nutrient-poor soils often bear lush-appearing vegetation; it's just that most of the nutrients in the ecosystem are contained in the vegetation rather than in the soil, and so are removed when one cuts down the vegetation. There was no way that the first colonists of Australia and Mangareva could perceive that problem of soil nutrient exhaustion.

An even commoner reason for a society's failing to perceive a problem is that the problem may take the form of a slow trend concealed by wide up-and-down fluctuations. The prime example in modern times is global warming. We now realize that temperatures around the world have been slowly rising in recent decades, due in large part to changes in the atmosphere caused by humans. However, it is not the case that the climate each year is inexorably a steady fraction of a degree (about 0.17 degrees per decade) warmer than in the previous year. Instead, as we all know, climate fluctuates up and down erratically from year to year: three degrees warmer in one summer than the previous summer, then two degrees warmer the next summer, down four degrees the following summer, down another degree the next summer, then up five degrees, and so on. With such wide and unpredictable fluctuations, it takes a long time to discern the upward trend within that noisy signal. That's why it was only a few years ago that the last professional climatologist previously skeptical of the reality of global warming became convinced. Our president is still not convinced of the reality of global warming, and he thinks that we need more research. The medieval Greenlanders had similar difficulties in recognizing that their climate was gradually becoming colder, and the Maya of the Yucatan had difficulties discerning that theirs was gradually becoming drier.
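To make the point concrete, here is a minimal simulation sketch (an editorial illustration, not part of the lecture). It assumes a warming trend of 0.017 degrees C per year buried in year-to-year swings with a standard deviation of 0.5 degrees C, both numbers chosen purely for illustration, and asks how long a record you need before a fitted trend stands clearly above its own uncertainty.

```python
# Illustrative sketch, not Diamond's analysis: a slow warming trend hidden in
# large year-to-year swings. Both numbers below are assumptions chosen only
# for illustration.
import numpy as np

trend_per_year = 0.017   # assumed warming rate (about 0.17 degrees per decade)
noise_sd = 0.5           # assumed size of random year-to-year fluctuations

rng = np.random.default_rng(0)
years = np.arange(1900, 2001)
temps = trend_per_year * (years - years[0]) + rng.normal(0.0, noise_sd, years.size)

# Fit a straight line to progressively longer records and ask when the estimated
# slope stands clearly above its own uncertainty (a rough 2-sigma criterion).
for span in (10, 20, 30, 50, 100):
    x, y = years[:span], temps[:span]
    coeffs = np.polyfit(x, y, 1)
    resid = y - np.polyval(coeffs, x)
    slope_se = np.sqrt(resid.var(ddof=2) / ((x - x.mean()) ** 2).sum())
    verdict = "trend discernible" if abs(coeffs[0]) > 2 * slope_se else "lost in the noise"
    print(f"{span:3d}-year record: slope {coeffs[0]:+.3f} +/- {slope_se:.3f} C/yr -> {verdict}")
```

With these made-up numbers the fitted slope typically rises clearly above its standard error only after several decades of data, which is exactly the difficulty with noisy signals that Diamond describes.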

Politicians use the term "creeping normalcy" to refer to such slow trends concealed within noisy fluctuations. If a situation is getting worse only slowly, it is difficult to recognize that this year is worse than last year, and each successive year is only slightly worse than the year before, so that one's baseline standard for what constitutes "normalcy" shifts only gradually and almost imperceptibly. It may take a few decades of a long sequence of such slight year-to-year changes before people suddenly realize that conditions were much better several decades ago, and that what is accepted as normalcy has crept downwards.

The remaining frequent reason for failure to perceive a problem after it has arrived is distant managers, a potential problem in any large society. For example, today the largest private landowner and the largest timber company in the state of Montana is based not within the state but in Seattle, Washington. Not being on the scene, company executives may not realize that they have a big weed problem on their forest property.

All of us who belong to groups of any kind can think of examples of imperceptibly arising problems, creeping normalcy, and distant managers.

The third step in my road map of failure is perhaps the commonest and most surprising one: a society's failure even to try to solve a problem that it has perceived.

Such failures frequently arise because of what economists term "rational behavior" resulting from clashes of interest between people. Some people may reason correctly that they can advance their own interests by behavior that is harmful to other people. Economists term such behavior "rational," even while acknowledging that morally it may be naughty. The perpetrators are often motivated and likely to get away with their rational bad behavior, because the winners from the bad status quo are typically concentrated (few in number) and highly motivated, since they receive big, certain, immediate profits, while the losers are diffuse (the losses are spread over large numbers of individuals) and unmotivated, since each of them stands to gain only small, uncertain, distant benefits from undoing the rational bad behavior of the minority.
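A toy calculation (the figures below are invented for illustration, not taken from the lecture) shows why the asymmetry matters: even when the losers' total loss exceeds the winners' total gain, each individual winner has an enormous stake in defending the status quo while each individual loser has almost none.

```python
# Toy numbers, invented for illustration: a harmful arrangement that pays a few
# concentrated winners handsomely while spreading a larger total cost thinly
# over millions of losers.
winners = 5                  # assumed number of firms that benefit
gain_per_winner = 1_000_000  # assumed annual benefit per firm, in dollars
losers = 2_000_000           # assumed number of citizens who bear the cost
total_cost = 8_000_000       # assumed total annual cost spread over them

print(f"each winner's stake:  ${gain_per_winner:,}")
print(f"each loser's stake:   ${total_cost / losers:,.2f}")
print(f"society's net loss:   ${total_cost - winners * gain_per_winner:,}")
```

A million-dollar stake buys lobbyists and lawsuits; a four-dollar stake does not even buy a letter to a legislator, so the bad status quo persists.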

A typical example of rational bad behavior is "good for me, bad for you and for the rest of society" — to put it bluntly, "selfishness." A few individuals may correctly perceive their self-interests to be opposed to the majority's self-interest. For example, until 1971, mining companies in Montana typically just dumped their toxic wastes of copper and arsenic directly into rivers and ponds because the state of Montana had no law requiring mining companies to clean up after abandoning a mine. After 1971, the state of Montana did pass such a law, but mining companies discovered that they could extract the valuable ore and then just declare bankruptcy before going to the expense of cleaning up. The result has been billions of dollars of clean-up costs borne by the citizens of the United States or Montana. The miners had correctly perceived that they could advance their interests and save money by making messes and leaving the burden to society.

One particular form of such clashes of interest has received the name "the tragedy of the commons." That refers to a situation in which many consumers are harvesting a communally owned resource (such as fish in the ocean, or grass in common pastures), and in which there is no effective regulation of how much of the resource each consumer can draw off. Under those circumstances, each consumer can correctly reason, "If I don't catch that fish or graze that grass, some other fisherman or herder will anyway, so it makes no sense for me to be careful about overfishing or overharvesting." The correct rational behavior is to harvest before the next consumer can, even though the end result is depletion or extinction of the resource, and hence harm for society as a whole.
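The logic lends itself to a very small simulation. The sketch below is an editorial illustration with made-up parameters (ten fishers, a stock that regrows logistically, two levels of per-fisher harvest); it is not a model from the lecture, but it shows how unrestricted harvest collapses the resource while a modest shared quota preserves it.

```python
# Minimal open-access fishery sketch; all parameters are illustrative guesses.
def remaining_stock(seasons, per_fisher_catch, fishers=10, capacity=1000.0, growth=0.3):
    stock = capacity
    for _ in range(seasons):
        stock += growth * stock * (1 - stock / capacity)  # logistic regrowth
        stock -= min(stock, fishers * per_fisher_catch)   # everyone harvests
    return stock

print("open access, 50 fish per fisher per season:", round(remaining_stock(30, 50)))
print("shared quota,  5 fish per fisher per season:", round(remaining_stock(30, 5)))
```

In the first case the stock collapses to zero within a few seasons even though every fisher is behaving "rationally"; in the second it settles at a level that can be harvested indefinitely.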

Rational behavior involving clashes of interest also arises when the consumer has no long-term stake in preserving the resource. For example, much commercial harvesting of tropical rainforests today is carried out by international logging companies, which lease land in one country, cut down all the rainforest in that country, and then move on to the next country. The international loggers have correctly perceived that, once they have paid for the lease, their interests are best served by clear-cutting the rainforest on their leased land. In that way, loggers have destroyed most of the forests of the Malay Peninsula, then of Borneo, then of the Solomon Islands and Sumatra, now of the Philippines, and soon of New Guinea, the Amazon, and the Congo Basin. In that case, the bad consequences are borne by the next generation, but that next generation cannot vote or complain.

A further situation involving rational behavior and conflicts of interest arises when the interests of the decision-making elite in power conflict with the interests of the rest of society. The elite are particularly likely to do things that profit them but hurt everybody else, if the elite are able to insulate themselves from the consequences of their actions. Such clashes are increasingly frequent in the modern U.S., where rich people tend to live within their gated compounds and to drink bottled water. For example, executives of Enron correctly calculated that they could gain huge sums of money for themselves by looting the company coffers and harming the rest of society, and that they were likely to get away with their gamble.

Failures to solve perceived problems because of conflicts of interest between the elite and the rest of society are much less likely in societies where the elite cannot insulate themselves from the consequences of their actions. For example, the modern country with the highest proportion of its citizens belonging to environmental organizations is the Netherlands. I never understood why until I was visiting the Netherlands a few years ago and raised the question with my Dutch colleagues as we were driving through the countryside. My Dutch friends answered, "Just look around you and you will see the reason. The land where we are now is 22 feet below sea level. Like much of the area of Holland, it was once a shallow bay of the sea that we Dutch people surrounded by dikes and then drained with pumps to create low-lying land that we call a polder. We have pumps to pump out the water that is continually leaking into our polders through the dikes. If the dikes burst, of course the people in the polder drown. But it is not the case that the rich Dutch live on top of the dikes while the poor Dutch live down in the polders. If the dikes burst, everybody drowns, regardless of whether they are rich or poor. That was what happened in the terrible floods of February 1, 1953, when high tides and storms drove water inland over the polders of Zeeland Province and nearly 2,000 Dutch people drowned. After that disaster, we all swore, 'Never again!' and spent billions of dollars building reinforced barriers against the water. In the Netherlands the decision-makers know that they cannot insulate themselves from their mistakes, and that they have to make compromise decisions that will be good for as many people as possible."

Those examples illustrate situations in which a society fails to solve perceived problems because the maintenance of the problem is good for some people. In contrast to that so-called rational behavior, there are also failures to attempt to solve perceived problems that economists consider "irrational behavior": that is, behavior that is harmful for everybody. Such irrational behavior often arises when each of us is torn by clashes of values within ourselves. We may be strongly attached to a bad status quo because it is favored by some deeply held value that we admire. Religious values are especially deeply held and hence are frequent causes of disastrous behavior. For example, much of the deforestation of Easter Island had a religious motivation: to obtain logs to transport and erect the giant stone statues that were the basis of Easter Island's religious cults. In modern times, a reason why Montanans have been so reluctant to solve the obvious problems now accumulating from mining, logging, and ranching in Montana is that these three industries were formerly the pillars of the Montana economy, and they became bound up with the pioneer spirit and with Montanans' self-identity.

Irrational failures to try to solve perceived problems also frequently arise from clashes between the short-term and long-term motives of the same individual. Billions of people in the world today are desperately poor and able to think only of food for the next day. Poor fishermen in tropical reef areas use dynamite and cyanide to kill and catch reef fish, in full knowledge that they are destroying their future livelihood, but they feel that they have no choice because of their desperate short-term need to obtain food for their children today. Governments, too, regularly operate with a short-term focus: they feel overwhelmed by imminent disasters, pay attention only to those problems on the verge of explosion, and feel that they lack the time or resources to devote to long-term problems. For example, a friend of mine who is closely connected to the current federal administration in Washington, D.C., told me that, when he visited Washington for the first time after the year-2000 national elections, the leaders of our government had what he termed a "90-day focus": they talked only about those problems with the potential to cause a disaster within the next 90 days. Economists rationally justify these focuses on short-term profit by "discounting" future profits. That is, they argue that it may be better to harvest a resource today than to leave some of the resource for harvesting tomorrow, because the profits from today's harvest could be invested, and the accumulated interest would make today's harvest worth more than a harvest of exactly the same quantity of resource at some time in the future.
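The discounting argument is easy to see with a back-of-the-envelope calculation. The numbers below are illustrative assumptions (a $1,000,000 harvest, a 5 percent annual return, a 30-year horizon), not figures from the lecture.

```python
# Back-of-the-envelope discounting sketch; all figures are illustrative assumptions.
harvest_value = 1_000_000   # assumed value of harvesting the resource today
rate = 0.05                 # assumed annual return on invested proceeds
years = 30

cash_in_now_and_invest = harvest_value * (1 + rate) ** years
present_value_of_waiting = harvest_value / (1 + rate) ** years

print(f"harvest now and invest for {years} years:          ${cash_in_now_and_invest:,.0f}")
print(f"present value of the same harvest in {years} years: ${present_value_of_waiting:,.0f}")
```

Unless the resource itself grows faster than the interest rate, "cash it in now" looks better on paper, which is why a purely financial calculation can favor liquidating a slow-growing forest or fishery, exactly the logic Diamond describes.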

The last reason that I shall mention for irrational failure to try to solve a perceived problem is psychological denial. This is a technical term with a precisely defined meaning in individual psychology, and it has been taken over into pop culture. If something that you perceive arouses an unbearably painful emotion, you may subconsciously suppress or deny your perception in order to avoid the unbearable pain, even though the practical results of ignoring your perception may prove ultimately disastrous. The emotions most often responsible are terror, anxiety, and sadness. Typical examples include refusing to think about the likelihood that your husband, wife, child, or best friend may be dying, because the thought is so painfully sad, or blocking out a terrifying experience. For example, consider a narrow, deep river valley below a high dam, such that if the dam burst, the resulting flood would drown people for a long distance downstream. When attitude pollsters ask people downstream of the dam how concerned they are about the dam's bursting, it's not surprising that fear of a dam burst is lowest far downstream and increases among residents living closer and closer to the dam. Surprisingly, though, after fear peaks a few miles below the dam, concern falls off to zero as you get still closer to the dam! That is, the people living immediately under the dam, who are certain to be drowned in a dam burst, profess unconcern. That is because of psychological denial: the only way of preserving one's sanity while living immediately under the high dam is to deny the finite possibility that it could burst.

Psychological denial is a phenomenon well established in individual psychology. It seems likely to apply to group psychology as well. For example, there is much evidence that, during World War Two, Jews and other groups at risk from the developing Holocaust denied the accumulating evidence that it was happening and that they were at risk, because the thought was unbearably horrible. Psychological denial may also explain why some collapsing societies fail to face up to the obvious causes of their collapse.

Finally, the last of the four items in my road map is the failure to succeed in solving a problem that one does try to solve. There are obvious possible explanations for this outcome. The problem may just be too difficult, and beyond our present capacities to solve. For example, the state of Montana loses hundreds of millions of dollars per year in attempting to combat introduced weed species, such as spotted knapweed and leafy spurge. That is not because Montanans don't perceive these weeds or don't try to eliminate them, but simply because the weeds are too difficult to eliminate at present. Leafy spurge has roots 20 feet deep, too long to pull up by hand, and specific weed-control chemicals cost up to $800 per gallon.

Often, too, we fail to solve a problem because our efforts are too little and begun too late. For example, Australia has suffered tens of billions of dollars of agricultural losses, as well as the extinction or endangerment of most of its native small mammal species, because of the introduction of European rabbits and foxes, for which there was no close native counterpart in the Australian environment. Foxes as predators prey on lambs and chickens and kill native small marsupials and rodents. Foxes have been widespread over the Australian mainland for over a century, but until recently they were absent from the Australian island state of Tasmania, because foxes could not swim across the wide, rough seas between the Australian mainland and Tasmania. Unfortunately, two or three years ago some individuals surreptitiously and illegally released 32 foxes in Tasmania, either for their fox-hunting pleasure or to spite environmentalists. Those foxes represent a big threat to Tasmanian lamb and chicken farmers, as well as to Tasmanian wildlife. When Tasmanian environmentalists became aware of this fox problem around March of 2002, they begged the government to exterminate the foxes quickly while it was still possible. The fox breeding season was expected to begin around July. Once those 32 foxes had produced litters, and once those litters had dispersed, it would be far more difficult to eradicate 128 foxes than 32 foxes. Unfortunately, the Tasmanian government debated and delayed, and it was not until around June of 2002 that it finally decided to commit a million dollars to eliminating the foxes. By that time, there was considerable risk that the commitment of money was too little and too late, and that the Tasmanian government would find itself faced with a far more expensive and less soluble problem. I have not yet heard what happened to that fox eradication effort.
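Diamond's 32-versus-128 comparison is really a point about compounding. As a rough sketch, each season of delay multiplies the size, and hence the cost, of the eradication job; in the snippet below the fourfold per-season growth is extrapolated from his figures, and the per-fox cost is an invented placeholder.

```python
# Rough compounding sketch; the growth factor is extrapolated from the lecture's
# 32 -> 128 figure, and the per-fox cost is a hypothetical placeholder.
foxes = 32
growth_per_season = 4        # assumed multiplication per breeding season
cost_per_fox = 30_000        # hypothetical eradication cost per fox, in dollars

for season in range(5):
    print(f"after {season} breeding seasons: {foxes:>5} foxes, "
          f"roughly ${foxes * cost_per_fox:,} to eradicate")
    foxes *= growth_per_season
```

Whatever the true parameters, the shape of the curve is the point: a problem that is merely expensive this season becomes insoluble a few seasons later.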


~~~

Thus, human societies and smaller groups may make disastrous decisions for a whole sequence of reasons: failure to anticipate a problem, failure to perceive it once it has arisen, failure to attempt to solve it after it has been perceived, and failure to succeed in attempts to solve it. All this may sound pessimistic, as if failure is the rule in human decision-making. In fact, of course that is not the case, in the environmental area as in business, academia, and other groups. Many human societies have anticipated, perceived, tried to solve, or succeeded in solving their environmental problems. For example, the Inca Empire, New Guinea Highlanders, 18th-century Japan, 19th-century Germany, and the paramount chiefdom of Tonga all recognized the risks that they faced from deforestation, and all adopted successful reforestation or forest management policies.

Thus, my reason for discussing failures of human decision-making is not a desire to depress you. Instead, I hope that, by recognizing the signposts of failed decision-making, we may become more consciously aware of how others have failed, and of what we need to do in order to get it right.