WHY DO SOME SOCIETIES MAKE DISASTROUS DECISIONS?

[JARED DIAMOND:] Education is supposed to be about teachers imparting knowledge to students. As every teacher knows, though, if you have a good group of students, education is also about students imparting knowledge to their supposed teachers and challenging their assumptions. That's an experience that I've been through in the last couple of months, when for the first time in my academic career I gave a course to undergraduates, highly motivated UCLA undergraduates, on collapses of societies. Why is it that some societies in the past have collapsed while others have not? I was discussing famous collapses such as those of the Anasazi in the U.S. Southwest, Classic Maya civilization in the Yucatan, Easter Island society in the Pacific, Angkor Wat in Southeast Asia, Great Zimbabwe in Africa, Fertile Crescent societies, and Harappan Indus Valley societies. These are all societies that, as archaeological discoveries of the last 20 years have revealed, hammered away at their own environments and destroyed themselves in part by undermining the environmental resources on which they depended.

For example, the Easter Islanders, a Polynesian people, settled an island that was originally forested, and whose forests included the world's largest palm tree. The Easter Islanders gradually chopped down that forest to use the wood for canoes, firewood, transporting statues, raising statues, and carving. Eventually they chopped down all the forests to the point where all the tree species were extinct, which meant that they ran out of canoes, they could no longer erect statues, there were no longer trees to protect the topsoil against erosion, and their society collapsed in an epidemic of cannibalism that left 90 percent of the islanders dead. The question that most intrigued my UCLA students was one that hadn't registered on me: how on Earth could a society make such an obviously disastrous decision as to cut down all the trees on which they depended? For example, my students wondered, what did the Easter Islanders say as they were cutting down the last palm tree? Were they saying, think of our jobs as loggers, not these trees? Were they saying, respect my private property rights? Surely the Easter Islanders, of all people, must have realized the consequences to them of destroying their own forest. It wasn't a subtle mistake. One wonders whether, if there are still people left alive a hundred years from now, people in the next century will be as astonished about our blindness today as we are about the blindness of the Easter Islanders.

This question, why societies make disastrous decisions and destroy themselves, is one that not only surprised my UCLA undergraduates but has also astonished professional historians studying collapses of past societies. The most cited book on the subject of the collapse of societies is by the historian Joseph Tainter, entitled The Collapse of Complex Societies. Joseph Tainter, in discussing ancient collapses, rejected the possibility that those collapses might be due to environmental mismanagement, because such mismanagement seemed so unlikely to him. Here's what Joseph Tainter said: "As it becomes apparent to the members or administrators of a complex society that a resource base is deteriorating, it seems most reasonable to assume that some rational steps are taken towards a resolution. With their administrative structure and their capacity to allocate labor and resources, dealing with adverse environmental conditions may be one of the things that complex societies do best. It is curious that they would collapse when faced with precisely those conditions that they are equipped to circumvent." Joseph Tainter concluded that the collapses of all these ancient societies couldn't possibly be due to environmental mismanagement, because those societies would never have made such bad mistakes. Yet it's now clear that they did make these bad mistakes.

My UCLA undergraduates, and Joseph Tainter as well, have identified a surprising phenomenon: namely, failures of group decision-making on the part of whole societies, or governments, or smaller groups, or businesses, or university academic departments. The question of failure of group decision-making is similar to the question of failures of individual decision-making. Individuals make bad decisions: they enter bad marriages, they make bad investments, their businesses fail. But in failures of group decision-making there are some additional factors, notably conflicts of interest among the members of the group, that don't arise with failures of individual decision-making. This is obviously a complex question, and there is no single, agreed-on answer to it.

What I'm going to suggest is a road map of factors in failures of group decision-making. I'll divide the answers into a sequence of four somewhat fuzzily delineated categories. First of all, a group may fail to anticipate a problem before the problem actually arrives. Secondly, when the problem arrives, the group may fail to perceive the problem. Then, after they perceive the problem, they may fail even to try to solve the problem. Finally, they may try to solve it but may fail in their attempts to do so. While all this talk about reasons for failure and collapses of societies may seem pessimistic, the flip side is optimistic: namely, successful decision-making. Perhaps if we understand the reasons why groups make bad decisions, we can use that knowledge as a checklist to help groups make good decisions.

The first item on my road map is that groups may do disastrous things because they didn't anticipate a problem before it arrived. There may be several reasons for failure to anticipate a problem. One is that they may have had no prior experience of such problems, and so may not have been sensitized to the possibility. For example, consider forest fires in the U.S. West. My wife, my children, and I spend parts of our summers in Montana, and each year as our plane comes in to land I look out the window to see how many forest fires are burning out there today. Forest fires are a major problem not only in Montana but throughout the U.S. Intermontane West in general. Forest fires on that giant scale are unknown in the eastern United States and in Europe. When settlers from the eastern United States and Europe arrived in Montana and a forest fire arose, their reaction, of course, was that you should try to put out the fire. The motto of the U.S. Forest Service for nearly a century was that every forest fire should be put out by 10:00 AM on the morning after the day on which it was reported. That attitude of easterners and Europeans arose because they had had no previous experience of forest fires in a dry environment, where there is a big buildup of fuel because trees that fall into the understory don't rot away, as they do in wet Europe and the wet eastern United States, but accumulate. It turns out that frequent small fires burn off the fuel load, and if you suppress those frequent small fires, then when a fire is eventually lit it may burn out of control, far beyond one's ability to suppress it, resulting in the big disastrous fires of the U.S. Intermontane West. It turns out that the best way to deal with forest fires in the West is to let them burn and burn out; then there won't be a buildup of a fuel load resulting in a disaster. But these huge forest fires were something with which eastern Americans and Europeans had no prior experience. The idea that you should let a fire burn, and destroy valuable forest, was so counterintuitive that it took the U.S. Forest Service a hundred years to recognize the problem and change its strategy to letting fires burn. So here's an example of how a society with no prior experience of a problem may not even recognize the problem: the buildup of fuel loads in the understory of a dry forest.

That's not the only reason, though, why a society may fail to anticipate a problem before it actually arises. Another reason is that they may have had prior experience, but that prior experience has been forgotten. For example, a nonliterate society is not going to preserve oral memories of something that happened long in the past. The Classic Lowland Maya eventually succumbed to a drought around A.D. 800. There had been previous droughts in the Maya realm, but they could not draw on that prior experience, because although the Maya had some writing, it preserved only the conquests of kings and didn't record droughts. Maya droughts recur at intervals of about 208 years, so the Maya of A.D. 800, when the big drought struck, did not and could not remember the drought of A.D. 592.

In modern literate societies, even though we do have writing, that does not necessarily mean that we draw on our prior experience. We, too, tend to forget things, and so, for example, Americans have recently behaved as if they had forgotten the 1973 Gulf oil crisis. For a year or two after the crisis they avoided gas-guzzling vehicles, but then quickly forgot that lesson, despite having writing. Again, in the 1960s the city of Tucson, Arizona, went through a severe drought, and its citizens swore that they would manage their water better afterwards, but within a decade or two Tucson had gone back to its water-guzzling ways of building golf courses and watering gardens. So there are a couple of reasons why a society may fail to anticipate a problem before it has arrived.

The remaining reason why a society may fail to anticipate a problem before it develops involves reasoning by false analogy. When we are in an unfamiliar situation, we fall back on reasoning by analogy with old familiar situations. That's a good way to proceed if the old and new situations are truly analogous, but reasoning by analogy can be dangerous if the old and new situations are only superficially similar.

An example of a society that suffered from the disastrous consequences of reasoning by false analogy is that of the Norwegian Vikings who began immigrating to Iceland in A.D. 871. Their familiar homeland of Norway has heavy clay soils ground up by glaciers. Those soils are sufficiently heavy that, if the vegetation covering them is cut down, they are too heavy to be blown away. Unfortunately for the Viking colonists of Iceland, Icelandic soils are as light as talcum powder. They arose not through glacial grinding but through winds carrying light ash blown out in volcanic eruptions. The Vikings cleared the forests over those soils in order to create pasture for their animals. Unfortunately, ash that was light enough for the wind to blow in was light enough for the wind to blow out again once the covering vegetation had been removed. Within a few generations of the Vikings' arrival in Iceland, half of Iceland's topsoil had eroded into the ocean. Other examples of reasoning by false analogy abound.

The second step in my road map, after a society has either anticipated or failed to anticipate a problem before it arises, involves a society's failing to perceive a problem that has actually arrived. There are at least three reasons for such failures, all of them common in the business world and in academia. First, the origins of some problems are literally imperceptible. For example, the nutrients responsible for soil fertility are invisible to the eye, and became measurable only in modern times by means of chemical analysis. In Australia, Mangareva, parts of the U.S. Southwest, and many other locations, most of the nutrients had already been leached out of the soil by rainfall. When people arrived and began growing crops, those crops quickly exhausted the remaining nutrients, so that agriculture rapidly failed. Yet such nutrient-poor soils often bear lush-appearing vegetation; it's just that most of the nutrients in the ecosystem are contained in the vegetation rather than in the soil, and are therefore removed when one cuts down the vegetation. There was no way that the first colonists of Australia and Mangareva could have perceived that problem of soil nutrient exhaustion.

An even commoner reason for a society's failing to perceive a problem is that the problem may take the form of a slow trend concealed by wide up-and-down fluctuations. The prime example in modern times is global warming. We now realize that temperatures around the world have been slowly rising in recent decades, due in large part to changes in the atmosphere caused by humans. However, it is not the case that the climate each year is inexorably 0.17 degrees warmer than in the previous year. Instead, as we all know, climate fluctuates up and down erratically from year to year: three degrees warmer in one summer than the previous summer, then two degrees warmer the next summer, down four degrees the following summer, down another degree the next summer, then up five degrees, etc. With such wide and unpredictable fluctuations, it takes a long time to discern the upwards trend within that noisy signal. That's why it was only a few years ago that the last professional climatologist previously skeptical of the reality of global warming became convinced. Our president is still not convinced of the reality of global warming, and he thinks that we need more research. The medieval Greenlanders had similar difficulties in recognizing that the climate was gradually becoming colder, and the Maya of the Yucatan had difficulties discerning that the climate was gradually becoming drier.
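
To make the difficulty concrete, here is a minimal sketch in Python of a slow trend buried in noise. The numbers are purely illustrative (a hypothetical 0.02-degree-per-year trend and year-to-year fluctuations of a couple of degrees), not actual climate data; the point is only that comparing adjacent years tells you almost nothing, while averages taken over decades reveal the creeping trend.

    import random

    random.seed(42)

    TREND_PER_YEAR = 0.02   # hypothetical slow warming trend, degrees per year
    NOISE_SD = 2.0          # hypothetical year-to-year fluctuation, degrees

    def simulated_temperatures(n_years):
        """A baseline temperature plus a slow trend plus large random fluctuations."""
        return [15.0 + TREND_PER_YEAR * year + random.gauss(0, NOISE_SD)
                for year in range(n_years)]

    def decade_averages(temps):
        """Average each successive 10-year block to smooth out the noise."""
        return [sum(temps[i:i + 10]) / 10 for i in range(0, len(temps) - 9, 10)]

    temps = simulated_temperatures(100)

    # Comparing two adjacent years is dominated by the noise...
    print("year 1 minus year 0: %+.1f degrees" % (temps[1] - temps[0]))

    # ...but the decade averages drift upward over the century, exposing the trend.
    for i, avg in enumerate(decade_averages(temps)):
        print("decade %d average: %.2f degrees" % (i, avg))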

Politicians use the term "creeping normalcy" to refer to such slow trends concealed within noisy fluctuations. If a situation is getting worse only slowly, it is difficult to recognize that this year is worse than last year, and each successive year is only slightly worse than the year before, so that one's baseline standard for what constitutes "normalcy" shifts only gradually and almost imperceptibly. It may take a few decades of a long sequence of such slight year-to-year changes before someone suddenly realizes that conditions were much better several decades ago, and that what is accepted as normalcy has crept downwards.

The remaining frequent reason for failure to perceive a problem after it has arrived is distant managers, a potential problem in any large society. For example, today the largest private landowner and the largest timber company in the state of Montana is based not within the state but in Seattle, Washington. Not being on the scene, company executives may not realize that they have a big weed problem on their forest property.

All of us who belong to other groups can think of examples of imperceptibly arising problems, creeping normalcy, and distant managers.

The third step in my road map of failure is perhaps the commonest and most surprising one: a society's failure even to try to solve a problem that it has perceived.

Such failures frequently arise because of what economists term "rational behavior" arising from clashes of interest between people. Some people may reason correctly that they can advance their own interests by behavior that is harmful for other people. Economists term such behavior "rational," even while acknowledging that morally it may be naughty. The perpetrators are often motivated and likely to get away with their rational bad behavior, because the winners from the bad status quo are typically concentrated (few in number) and highly motivated because they receive big, certain, immediate profits, while the losers are diffuse (the losses are spread over large numbers of individuals) and are unmotivated because they receive only small, uncertain, distant profits from undoing the rational bad behavior of the minority.
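
A back-of-the-envelope calculation, sketched below in Python with entirely hypothetical figures, shows how this asymmetry works: even when society's total loss far exceeds the winners' total gain, each individual winner has millions of reasons to defend the status quo, while each individual loser has only a few dollars' reason to fight it.

    # Hypothetical figures, chosen only to illustrate the asymmetry of incentives.
    winners = 10                          # a handful of firms profiting from the status quo
    losers = 1_000_000                    # the citizens who collectively bear the cost

    total_gain_to_winners = 50_000_000    # dollars: big, certain, immediate
    total_loss_to_losers = 200_000_000    # dollars: larger in total, but spread thin and delayed

    gain_per_winner = total_gain_to_winners / winners   # $5,000,000 each
    loss_per_loser = total_loss_to_losers / losers      # $200 each

    print(f"each winner gains about ${gain_per_winner:,.0f}")
    print(f"each loser loses about  ${loss_per_loser:,.0f}")

    # Society as a whole loses four times what the winners gain, yet no single loser
    # has a $200 incentive to fight as hard as a winner with $5,000,000 at stake.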

A typical example of rational bad behavior is "good for me, bad for you and for the rest of society" — to put it bluntly, "selfishness." A few individuals may correctly perceive their self-interests to be opposed to the majority's self-interest. For example, until 1971, mining companies in Montana typically just dumped their toxic wastes of copper and arsenic directly into rivers and ponds because the state of Montana had no law requiring mining companies to clean up after abandoning a mine. After 1971, the state of Montana did pass such a law, but mining companies discovered that they could extract the valuable ore and then just declare bankruptcy before going to the expense of cleaning up. The result has been billions of dollars of clean-up costs borne by the citizens of the United States or Montana. The miners had correctly perceived that they could advance their interests and save money by making messes and leaving the burden to society.

One particular form of such clashes of interest has received the name "the tragedy of the commons." That refers to a situation in which many consumers are harvesting a communally owned resource (such as fish in the ocean, or grass in common pastures), and in which there is no effective regulation of how much of the resource each consumer can draw off. Under those circumstances, each consumer can correctly reason, "If I don't catch that fish or graze that grass, some other fisherman or herder will anyway, so it makes no sense for me to be careful about overfishing or overharvesting." The correct rational behavior is to harvest before the next consumer can, even though the end result is depletion or extinction of the resource, and hence harm for society as a whole.
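
The logic of the commons can be sketched in a few lines of code. In this toy model (all numbers invented), a shared fish stock regrows by 50 percent of whatever is left after each season's harvest; ten fishers who restrain themselves sustain the stock indefinitely, while ten fishers who each grab as much as they can collapse it within a few seasons.

    def simulate(initial_stock, n_fishers, catch_per_fisher, growth_rate, seasons):
        """Toy model of a shared fishery: harvest each season, then the remainder regrows."""
        stock = initial_stock
        for season in range(1, seasons + 1):
            harvest = min(stock, n_fishers * catch_per_fisher)
            stock = (stock - harvest) * (1 + growth_rate)
            print(f"season {season:2d}: harvested {harvest:6.0f}, stock left {stock:8.0f}")
            if stock < 1:
                print("  -> the commons has collapsed")
                break
        return stock

    # Restrained harvesting: each of 10 fishers takes 30 fish per season; the stock is sustained.
    print("Restrained harvest:")
    simulate(initial_stock=1000, n_fishers=10, catch_per_fisher=30, growth_rate=0.5, seasons=10)

    # "Rational" harvesting: each fisher takes 50 before the others can; the stock collapses.
    print()
    print("Unrestrained harvest:")
    simulate(initial_stock=1000, n_fishers=10, catch_per_fisher=50, growth_rate=0.5, seasons=10)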

Rational behavior involving clashes of interest also arises when the consumer has no long-term stake in preserving the resource. For example, much commercial harvesting of tropical rainforests today is carried out by international logging companies, which lease land in one country, cut down all the rainforest in that country, and then move on to the next country. The international loggers have correctly perceived that, once they have paid for the lease, their interests are best served by clear-cutting the rainforest on their leased land. In that way, loggers have destroyed most of the forest of the Malay Peninsula, then of Borneo, then of the Solomon Islands and Sumatra, now of the Philippines, and coming up soon of New Guinea, the Amazon, and the Congo Basin. In that case, the bad consequences are borne by the next generation, but that next generation cannot vote or complain.

A further situation involving rational behavior and conflicts of interest arises when the interests of the decision-making elite in power conflict with the interests of the rest of society. The elite are particularly likely to do things that profit them but hurt everybody else, if the elite are able to insulate themselves from the consequences of their actions. Such clashes are increasingly frequent in the modern U.S., where rich people tend to live within their gated compounds and to drink bottled water. For example, executives of Enron correctly calculated that they could gain huge sums of money for themselves by looting the company coffers and harming the rest of society, and that they were likely to get away with their gamble.

Failure to solve perceived problems because of conflicts of interest between the elite and the rest of society is much less likely in societies where the elite cannot insulate themselves from the consequences of their actions. For example, the modern country with the highest proportion of its citizens belonging to environmental organizations is the Netherlands. I never understood why until I was visiting the Netherlands a few years ago and raised the question with my Dutch colleagues as we were driving through the countryside. My Dutch friends answered, "Just look around you and you will see the reason. The land where we are now is 22 feet below sea level. Like much of the area of Holland, it was once a shallow bay of the sea that we Dutch surrounded by dikes and then drained with pumps to create low-lying land that we call a polder. We have pumps to pump out the water that continually leaks into our polders through the dikes. If the dikes burst, of course, the people in the polder drown. But it is not the case that the rich Dutch live on top of the dikes while the poor Dutch live down in the polders. If the dikes burst, everybody drowns, regardless of whether they are rich or poor. That was what happened in the terrible floods of February 1, 1953, when high tides and storms drove water inland over the polders of Zeeland Province and nearly 2,000 Dutch people drowned. After that disaster, we all swore, 'Never again!' and spent billions of dollars building reinforced barriers against the water. In the Netherlands the decision-makers know that they cannot insulate themselves from their mistakes, and that they have to make compromise decisions that will be good for as many people as possible."

Those examples illustrate situations in which a society fails to solve perceived problems because the maintenance of the problem is good for some people. In contrast to that so-called rational behavior, there are also failures to attempt to solve perceived problems that economists consider "irrational behavior": that is, the behavior is harmful for everybody. Such irrational behavior often arises when all of us are torn by clashes of values within each person. We may be strongly attached to a bad status quo because it is favored by some deeply held value that we admire. Religious values are especially deeply held and hence frequent causes of disastrous behavior. For example, much of the deforestation of Easter Island had a religious motivation, to obtain logs to transport and erect the giant stone statues that were the basis of Easter Island religious cults. In modern times a reason why Montanans have been so reluctant to solve the obvious problems now accumulating from mining, logging, and ranching in Montana is that these three industries were formerly the pillars of the Montana economy, and that they became bound up with the pioneer spirit and with Montanan self-identity.

Irrational failures to try to solve perceived problems also frequently arise from clashes between the short-term and long-term motives of the same individual. Billions of people in the world today are desperately poor and able to think only of food for the next day. Poor fishermen in tropical reef areas use dynamite and cyanide to kill and catch reef fish, in full knowledge that they are destroying their future livelihood, but they feel they have no choice because of their desperate short-term need to obtain food for their children today. Governments, too, regularly operate on a short-term focus: overwhelmed by imminent disasters, they pay attention only to problems on the verge of explosion and feel that they lack the time or resources to devote to long-term problems. For example, a friend of mine who is closely connected to the current federal administration in Washington, D.C. told me that, when he visited Washington for the first time after the year-2000 national elections, the leaders of our government had what he termed a "90-day focus": they talked only about those problems with the potential to cause a disaster within the next 90 days. Economists rationally justify these irrational focuses on short-term profit by "discounting" future profits. That is, they argue that it may be better to harvest a resource today than to leave some of the resource for harvesting tomorrow, because the profits from today's harvest could be invested, and the accumulated interest by the time of an equivalent future harvest would make today's harvest worth more than the future harvest.
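
The discounting argument at the end of that paragraph can be made explicit. Under an assumed interest rate r, a profit received t years from now is worth only its amount divided by (1 + r)^t today. The sketch below uses invented numbers (a $1,000,000 harvest and a 5 percent annual rate) to show why a purely profit-maximizing harvester can prefer exhausting the resource now over an identical harvest a decade later.

    def present_value(future_profit, annual_rate, years):
        """Standard discounting: what a profit received in the future is worth today."""
        return future_profit / (1 + annual_rate) ** years

    profit = 1_000_000   # hypothetical profit from harvesting the whole resource
    rate = 0.05          # assumed annual interest (discount) rate
    years = 10           # how long the harvester would have to wait

    print(f"harvest today:                       ${profit:,.0f}")
    print(f"same harvest in {years} years, valued today: ${present_value(profit, rate, years):,.0f}")

    # Equivalently: $1,000,000 harvested and invested today at 5% grows to about
    # $1,630,000 after a decade, more than the $1,000,000 the delayed harvest would bring.
    print(f"today's harvest invested for {years} years:  ${profit * (1 + rate) ** years:,.0f}")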

The last reason that I shall mention for irrational failure to try to solve a perceived problem is psychological denial. This is a technical term with a precisely defined meaning in individual psychology, and it has been taken over into pop culture. If something that you perceive arouses an unbearably painful emotion, you may subconsciously suppress or deny your perception in order to avoid the unbearable pain, even though the practical results of ignoring your perception may prove ultimately disastrous. The emotions most often responsible are terror, anxiety, and sadness. Typical examples include refusing to think about the likelihood that your husband, wife, child, or best friend may be dying, because the thought is so painfully sad, or else blocking out a terrifying experience. For example, consider a narrow, deep river valley below a high dam, such that if the dam burst, the resulting flood would drown people for a long distance downstream. When attitude pollsters ask people downstream of the dam how concerned they are about the dam's bursting, it's not surprising that fear of a dam burst is lowest far downstream and increases among residents living closer and closer to the dam. Surprisingly, though, concern peaks a few miles from the dam, where fear of the dam's breaking is highest, and then falls off to zero as one gets closer still! That is, the people living immediately under the dam, who are certain to be drowned in a dam burst, profess unconcern. That is because of psychological denial: the only way of preserving one's sanity while living immediately under the high dam is to deny the finite possibility that it could burst.

Psychological denial is a phenomenon well established in individual psychology. It seems likely to apply to group psychology as well. For example, there is much evidence that, during World War Two, Jews and other groups at risk of the developing Holocaust denied the accumulating evidence that it was happening and that they were at risk, because the thought was unbearably horrible. Psychological denial may also explain why some collapsing societies fail to face up to the obvious causes of their collapse.

Finally, the last of the four items in my road map is the failure to succeed in solving a problem that one does try to solve. There are obvious possible explanations for this outcome. The problem may just be too difficult, and beyond our present capacities to solve. For example, the state of Montana loses hundreds of millions of dollars per year in attempting to combat introduced weed species, such as spotted knapweed and leafy spurge. That is not because Montanans don't perceive these weeds or don't try to eliminate them, but simply because the weeds are too difficult to eliminate at present. Leafy spurge has roots 20 feet deep, too long to pull up by hand, and specific weed-control chemicals cost up to $800 per gallon.

Often, too, we fail to solve a problem because our efforts are too little and begun too late. For example, Australia has suffered tens of billions of dollars of agricultural losses, as well as the extinction or endangerment of most of its native small mammal species, because of the introduction of European rabbits and foxes, for which there was no close native counterpart in the Australian environment. Foxes are predators that prey on lambs and chickens and kill native small marsupials and rodents. Foxes have been widespread over the Australian mainland for over a century, but until recently they were absent from the Australian island state of Tasmania, because foxes could not swim across the wide, rough seas between the Australian mainland and Tasmania. Unfortunately, two or three years ago some individuals surreptitiously and illegally released 32 foxes in Tasmania, either for their fox-hunting pleasure or to spite environmentalists. Those foxes represent a big threat to Tasmanian lamb and chicken farmers, as well as to Tasmanian wildlife. When Tasmanian environmentalists became aware of this fox problem around March of 2002, they begged the government to exterminate the foxes quickly while it was still possible. The fox breeding season was expected to begin around July. Once those 32 foxes had produced litters and once those litters had dispersed, it would be far more difficult to eradicate 128 foxes than 32 foxes. Unfortunately, the Tasmanian government debated and delayed, and it was not until around June of 2002 that the government finally decided to commit a million dollars to eliminating the foxes. By that time, there was considerable risk that the commitment of money was too little and too late, and that the Tasmanian government would find itself faced with a far more expensive and less soluble problem. I have not yet heard what happened to that fox eradication effort.


~~~

Thus, human societies and smaller groups may make disastrous decisions for a whole sequence of reasons: failure to anticipate a problem, failure to perceive it once it has arisen, failure to attempt to solve it after it has been perceived, and failure to succeed in attempts to solve it. All this may sound pessimistic, as if failure were the rule in human decision-making. In fact, of course, that is not the case, in the environmental arena as in business, academia, and other fields. Many human societies have anticipated, perceived, tried to solve, and succeeded in solving their environmental problems. For example, the Inca Empire, New Guinea Highlanders, 18th-century Japan, 19th-century Germany, and the paramount chiefdom of Tonga all recognized the risks that they faced from deforestation, and all adopted successful reforestation or forest-management policies.

Thus, my reason for discussing failures of human decision-making is not a desire to depress you. Instead, I hope that, by recognizing the signposts of failed decision-making, we may become more consciously aware of how others have failed, and of what we need to do in order to get it right.