Edge 114, April 28, 2003
WHY DO SOME SOCIETIES MAKE DISASTROUS DECISIONS?: JARED DIAMOND

Introduction

"Throughout history," states the Lewis Thomas Prize literature, "scientists and poets have sought to unveil the secrets of the natural world. Their methods vary: scientists use tools of rational analysis to slake their compelling thirst for knowledge; poets delve below the surface of language, and deliver urgent communiqués from its depths. The Lewis Thomas Prize honors the rare individual who is fluent in the dialects of both realms — and who succeeds in spinning lush literary and philosophical tapestries from the silken threads of scientific and natural phenomena — providing not merely new information but cause for reflection, even revelation."

Jared is an early and frequent contributor to Edge. In his first feature in 1997 ("Why Did Human History Unfold Differently On Different Continents For The Last 13,000 Years?") we noted:
Underlying his task is the question of how to turn the study of history into a science. He notes the distinction between the "hard sciences," such as physics, biology, and astronomy, and what we sometimes call the "social sciences," which include history, economics, and government. The label "social science" is often used almost as a pejorative. In particular, many so-called hard scientists, such as physicists or biologists, don't consider history to be a science. The situation is even more extreme because, he points out, even historians themselves don't consider history to be a science. Historians don't get training in scientific methods; they don't get training in statistics; they don't get training in the experimental method or the problems of doing experiments on historical subjects; and they'll often say that history is not a science, that history is closer to an art.

He comes to this question as one who is accomplished in two scientific areas: physiology and evolutionary biology. The first is a laboratory science; the second is never far from history. "Biology is the science," he says. "Evolution is the concept that makes biology unique." He continues to bring together history and biology in new and interesting ways to present global accounts of the rise and fall of civilizations.

More than one million copies of the U.S. edition of Jared Diamond's Pulitzer Prize-winning Guns, Germs, and Steel: The Fates of Human Societies have now been sold. Jared hopes to deliver his much-anticipated new book, Ecocide, at the end of this year for publication in 2004. JB
His field experience includes 17 expeditions to New Guinea and neighboring islands to study the ecology and evolution of birds; the rediscovery of New Guinea's long-lost golden-fronted bowerbird; and other field projects in North America, South America, Africa, Asia, and Australia. As a conservationist, he devised a comprehensive plan, almost all of which was subsequently implemented, for Indonesian New Guinea's national park system; has carried out numerous field projects for the Indonesian government and World Wildlife Fund; is a founding member of the board of the Society for Conservation Biology; and is a member of the Board of Directors of World Wildlife Fund/USA.

Further reading on Edge:

"Jared Diamond Awarded Pulitzer Prize for General Nonfiction" [4.15.98]

"Laying A Foundation For Human History": Bill Gates on Jared Diamond [4.15.98]

WHY DO SOME SOCIETIES MAKE DISASTROUS DECISIONS?: JARED DIAMOND

Education
is supposed to be about teachers imparting knowledge
to students. As every teacher knows, though, if you have
a good group of students, education is also about students
imparting knowledge to their supposed teachers and challenging
their assumptions. That's an experience that I've been
through in the last couple of months, when for the first
time in my academic career I gave a course to undergraduates,
highly motivated UCLA undergraduates, on collapses of
societies. Why is it that some societies in the past
have collapsed while others have not? I was discussing
famous collapses such as those of the Anasazi in the
U.S. Southwest, Classic Maya civilization in the Yucatan,
Easter Island society in the Pacific, Angkor Wat in southeast
Asia, Great Zimbabwe in Africa, Fertile Crescent societies,
and Harappan Indus Valley societies. These are all societies that, we've realized from archaeological discoveries in the last 20 years, hammered away at their own environments and destroyed themselves in part by undermining the environmental resources on which they depended.

This question, why societies make disastrous decisions and destroy themselves, is one that not only surprised my UCLA undergraduates, but also astonishes professional historians who study the collapses of past societies. The most cited book on the subject of the collapse of societies is by the historian Joseph Tainter. It's entitled The Collapse of Complex Societies. Joseph Tainter, in discussing ancient collapses, rejected the possibility that those collapses might be due to environmental mismanagement, because it seemed so unlikely to him. Here's what Joseph Tainter said: "As it becomes apparent to the members or administrators of a complex society that a resource base is deteriorating, it seems most reasonable to assume that some rational steps are taken towards a resolution. With their administrative structure and their capacity to allocate labor and resources, dealing with adverse environmental conditions may be one of the things that complex societies do best. It is curious that they would collapse when faced with precisely those conditions that they are equipped to circumvent." Joseph Tainter concluded that the collapses of all these ancient societies couldn't possibly have been due to environmental mismanagement, because those societies would never have made such bad mistakes. Yet it's now clear that they did make these bad mistakes.

My UCLA undergraduates, and Joseph Tainter as well, have identified a very surprising question: namely, failures of group decision-making on the part of whole societies, or governments, or smaller groups, or businesses, or university academic departments. The question of failures of group decision-making is similar to the question of failures of individual decision-making. Individuals make bad decisions: they enter bad marriages, they make bad investments, their businesses fail. But in failures of group decision-making there are some additional factors, notably conflicts of interest among the members of the group, that don't arise with failures of individual decision-making.

This is obviously a complex question; there's no single answer to it, and there are no agreed-on answers. What I'm going to suggest is a road map of factors in failures of group decision-making. I'll divide the answers into a sequence of four somewhat fuzzily delineated categories. First of all, a group may fail to anticipate a problem before the problem actually arrives. Secondly, when the problem arrives, the group may fail to perceive the problem. Then, after they perceive the problem, they may fail even to try to solve the problem. Finally, they may try to solve it but fail in their attempts to do so. While all this talk about reasons for failure and the collapses of societies may seem pessimistic, the flip side is optimistic: namely, successful decision-making. Perhaps if we understand the reasons why groups make bad decisions, we can use that knowledge as a checklist to help groups make good decisions.

The first item on my road map is that groups may do disastrous things because they didn't anticipate a problem before it arrived. There may be several reasons for failure to anticipate a problem. One is that they may have had no prior experience of such problems, and so may not have been sensitized to the possibility.

For example, consider forest fires in the U.S. West. My wife, my children, and I spend parts of our summers in Montana, and each year as our plane comes in to Montana I look out the window to see how many forest fires are burning out there. Forest fires are a major problem not only in Montana but throughout the U.S. Intermontane West in general. Forest fires on that giant scale are unknown in the eastern United States and in Europe. When settlers from the eastern United States and Europe arrived in Montana and a forest fire arose, their reaction was, of course, that you should try to put out the fire. The motto of the U.S. Forest Service for nearly a century was that every forest fire would be put out by 10:00 AM of the morning after the day on which it was first reported.

That attitude of easterners and Europeans toward forest fires arose because they had had no previous experience of forest fires in a dry environment where there's a big buildup of fuel: trees that fall into the understory don't rot away as they do in wet Europe and the wet eastern United States, but accumulate there. It turns out that frequent small fires burn off the fuel load, and if you suppress those frequent small fires, then when a fire is eventually lit it may burn out of control far beyond one's ability to suppress it, resulting in the big disastrous fires of the U.S. Intermontane West. It turns out that the best way to deal with forest fires in the West is to let them burn, and burn out, so that there is no buildup of a fuel load leading to disaster. But these huge forest fires were something with which eastern Americans and Europeans had no prior experience. The idea that you should let a fire burn, and destroy valuable forest, was so counter-intuitive that it took the U.S. Forest Service a hundred years to recognize the problem and change its strategy to letting fires burn.

So here's an example of how a society with no prior experience of a problem may not even recognize the problem: the problem of fuel loads in the understory of a dry forest.

That's
not the only reason, though, why a society may fail to
anticipate a problem before it actually arises. Another
reason is that they may have had prior experience but that
prior experience has been forgotten. For example, a non-literate society is not going to preserve oral memories
of something that happened long in the past. The Classic
Lowland Maya eventually succumbed to a drought around 800
A.D. There had been previous droughts in the Maya realm,
but they could not draw on that prior experience, because
although the Maya had some writing, it just preserved the
conquests of kings and didn't record droughts. Maya droughts
recur at intervals of 208 years, so the Maya in 800 A.D.,
when the big drought struck, did not and could not remember
the drought of A.D. 592.

The remaining reason why a society may fail to anticipate a problem before it develops involves reasoning by false analogy. When we are in an unfamiliar situation, we fall back on reasoning by analogy with old familiar situations. That's a good way to proceed if the old and new situations are truly analogous, but reasoning by analogy can be dangerous if the old and new situations are only superficially similar. An example of a society that suffered disastrous consequences from reasoning by false analogy was the society of Norwegian Vikings who immigrated to Iceland beginning in the year A.D. 871. Their familiar homeland of Norway has heavy clay soils ground up by glaciers. Those soils are sufficiently heavy that, if the vegetation covering them is cut down, they are too heavy to be blown away. Unfortunately for the Viking colonists of Iceland, Icelandic soils are as light as talcum powder. They arose not through glacial grinding, but through winds carrying light ash blown out in volcanic eruptions. The Vikings cleared the forests over those soils in order to create pasture for their animals. Unfortunately, the ash that was light enough for the wind to blow in was light enough for the wind to blow out again once the covering vegetation had been removed. Within a few generations of the Vikings' arrival in Iceland, half of Iceland's topsoil had eroded into the ocean.

Other examples of reasoning by false analogy abound.

The
second step in my road map, after a society has anticipated
or failed to anticipate a problem before it arises, involves
a society's failing to perceive a problem that has actually
arrived. There are at least three reasons for such failures,
all of them common in the business world and in academia.
First, the origins of some problems are literally imperceptible.
For example, the nutrients responsible for soil fertility
are invisible to the eye, and only in modern times measurable
by means of chemical analysis. In Australia, Mangareva,
parts of the U.S. Southwest, and many other locations,
most of the nutrients had already been leached out of the
soil by rainfall. When people arrived and began growing
crops, those crops quickly exhausted the remaining nutrients,
so that agriculture rapidly failed. Yet such nutrient-poor
soils often bear lush-appearing vegetation; it's just that
most of the nutrients in the ecosystem are contained in
the vegetation rather than in the soil, so that the nutrients
are removed when one cuts down the vegetation. There was
no way that the first colonists of Australia and Mangareva
could perceive that problem of soil nutrient exhaustion.

Another reason for failure to perceive a problem after it has arrived is that the problem may take the form of a slow trend concealed within wide up-and-down fluctuations. Politicians use the term "creeping normalcy" to refer to such slow trends concealed within noisy fluctuations. If a situation is getting worse only slowly, it is difficult to recognize that this year is worse than last year: each successive year is only slightly worse than the year before, so one's baseline standard for what constitutes "normalcy" shifts only gradually and almost imperceptibly. It may take a few decades of a long sequence of such slight year-to-year changes before someone suddenly realizes that conditions were much better several decades ago, and that what is accepted as normalcy has crept downwards.

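To see how a damaging trend can hide inside ordinary year-to-year variation, here is a minimal illustrative sketch in Python. It is not part of Diamond's talk; the one-percent annual decline and the noise level are assumed numbers chosen only for illustration. The year-to-year changes it prints look like noise, while the change over four decades is unmistakable.

    import random

    # Illustrative simulation of "creeping normalcy": a slow one-percent-per-year
    # decline hidden inside noisy year-to-year fluctuations. All numbers are
    # assumptions chosen only to make the point visible.
    random.seed(0)

    years = 40
    trend = [100.0 * (0.99 ** t) for t in range(years)]     # the slow decline
    observed = [x + random.gauss(0, 3.0) for x in trend]    # what people notice each year

    # Typical change from one year to the next: small, easily dismissed as a bad year.
    steps = sorted(abs(observed[t + 1] - observed[t]) for t in range(years - 1))
    print("median year-to-year change: %.1f" % steps[len(steps) // 2])

    # Change over the whole period: large and obvious once someone looks back.
    print("decline over %d years: %.1f" % (years, observed[0] - observed[-1]))
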
The remaining frequent reason for failure to perceive a problem
after it has arrived is distant managers, a potential problem
in any large society. For example, today the largest private
landowner and the largest timber company in the state of
Montana is based not within the state but in Seattle, Washington.
Not being on the scene, company executives may not realize
that they have a big weed problem on their forest property.

The third step in my road map of failure is perhaps the commonest and most surprising one: a society's failure even to try to solve a problem that it has perceived. Such failures frequently arise because of what economists term "rational behavior" arising from clashes of interest between people. Some people may reason correctly that they can advance their own interests by behavior that is harmful to other people. Economists term such behavior "rational," even while acknowledging that morally it may be naughty. The perpetrators are often motivated and likely to get away with their rational bad behavior, because the winners from the bad status quo are typically concentrated (few in number) and highly motivated, because they receive big, certain, immediate profits, while the losers are diffuse (the losses are spread over large numbers of individuals) and are unmotivated, because they receive only small, uncertain, distant profits from undoing the rational bad behavior of the minority.

A typical example of rational bad behavior is "good for me, bad for you and for the rest of society," or, to put it bluntly, "selfishness." A few individuals may correctly perceive their self-interests to be opposed to the majority's self-interest. For example, until 1971, mining companies in Montana typically just dumped their toxic wastes of copper and arsenic directly into rivers and ponds, because the state of Montana had no law requiring mining companies to clean up after abandoning a mine. After 1971, the state of Montana did pass such a law, but mining companies discovered that they could extract the valuable ore and then just declare bankruptcy before going to the expense of cleaning up. The result has been billions of dollars of clean-up costs borne by the citizens of the United States or of Montana. The miners had correctly perceived that they could advance their interests and save money by making messes and leaving the burden to society.

One particular form of such clashes of interest has received the name "the tragedy of the commons." That refers to a situation in which many consumers are harvesting a communally owned resource (such as fish in the ocean, or grass in common pastures), and in which there is no effective regulation of how much of the resource each consumer can draw off. Under those circumstances, each consumer can correctly reason, "If I don't catch that fish or graze that grass, some other fisherman or herder will anyway, so it makes no sense for me to be careful about overfishing or overharvesting." The correct rational behavior is to harvest before the next consumer can, even though the end result is depletion or extinction of the resource, and hence harm for society as a whole.

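The individual reasoning described here, in which restraint by one consumer simply leaves more for the others, can be made concrete with a small toy model. The sketch below is my own illustration, not part of the talk; the stock size, regrowth rate, and catch levels are arbitrary assumptions. It compares shared restraint, universal overfishing, and one fisherman restraining alone.

    # Toy model of the tragedy of the commons (illustrative assumptions only):
    # a shared fish stock regrows 20% per year, and ten fishermen each choose
    # how much to try to catch every year.

    def run(rates, stock=1000.0, years=30, regrowth=0.20):
        """Simulate the fishery; rates[i] is fisherman i's attempted yearly catch.
        Returns each fisherman's total catch over the period."""
        totals = [0.0] * len(rates)
        for _ in range(years):
            for i, rate in enumerate(rates):
                take = min(rate, stock / len(rates))   # cannot take what is not there
                stock -= take
                totals[i] += take
            stock += stock * regrowth                  # natural regrowth of what is left
        return totals

    # Everyone shows restraint: the stock survives and long-run catches stay high.
    print("all restrain:   fisherman 0 catches %.0f" % run([15.0] * 10)[0])

    # Everyone grabs what they can: the stock collapses and everyone ends up with less.
    print("all overfish:   fisherman 0 catches %.0f" % run([60.0] * 10)[0])

    # One fisherman restrains while the rest overfish: the stock still collapses,
    # and the restrained fisherman does worst of all -- which is why each individual
    # "correctly" concludes that restraint does not pay.
    print("restrain alone: fisherman 0 catches %.0f" % run([15.0] + [60.0] * 9)[0])
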
Rational behavior involving clashes of interest also arises when the consumer has no long-term stake in preserving the resource. For example, much commercial harvesting of tropical rainforests today is carried out by international logging companies, which lease land in one country, cut down all the rainforest in that country, and then move on to the next country. The international loggers have correctly perceived that, once they have paid for the lease, their interests are best served by clear-cutting the rainforest on their leased land. In that way, loggers have destroyed most of the forests of the Malay Peninsula, then of Borneo, then of the Solomon Islands and Sumatra, now of the Philippines, and coming up soon of New Guinea, the Amazon, and the Congo Basin. In that case, the bad consequences are borne by the next generation, but that next generation cannot vote or complain.

A
further situation involving rational behavior and conflicts
of interest arises when the interests of the decision-making
elite in power conflict with the interests of the rest
of society. The elite are particularly likely to do things
that profit them but hurt everybody else, if the elite
are able to insulate themselves from the consequences of
their actions. Such clashes are increasingly frequent in
the modern U.S., where rich people tend to live within
their gated compounds and to drink bottled water. For example,
executives of Enron correctly calculated that they could
gain huge sums of money for themselves by looting the company
coffers and harming the rest of society, and that they
were likely to get away with their gamble.

Irrational failures to try to solve perceived problems also frequently arise from clashes between the short-term and long-term motives of the same individual. Billions of people in the world today are desperately poor and able to think only of food for the next day. Poor fishermen in tropical reef areas use dynamite and cyanide to kill and catch reef fish, in full knowledge that they are destroying their future livelihood, but they feel that they have no choice because of their desperate short-term need to obtain food for their children today. Governments, too, regularly operate on a short-term focus: they feel overwhelmed by imminent disasters, pay attention only to those problems on the verge of explosion, and feel that they lack the time or resources to devote to long-term problems. For example, a friend of mine who is closely connected to the current federal administration in Washington, D.C. told me that, when he visited Washington for the first time after the year-2000 national elections, the leaders of our government had what he termed a "90-day focus": they talked about only those problems with the potential to cause a disaster within the next 90 days.

Economists rationally justify these irrational focuses on short-term profits by "discounting" future profits. That is, they argue that it may be better to harvest a resource today than to leave some of the resource for harvesting tomorrow, because the profits from today's harvest could be invested, and the accumulated interest between now and a future harvest of exactly the same quantity of the resource would make today's harvest more valuable than the future harvest.

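This description of discounting can be restated as a short worked calculation. The sketch below is my own illustration, not from the talk; the five-percent interest rate, the price, and the ten-year horizon are arbitrary assumptions. It compares harvesting today and investing the proceeds against leaving the same quantity to be harvested ten years from now.

    # Illustrative arithmetic behind "discounting" future profits (assumed numbers).

    interest_rate = 0.05   # annual return if today's profit is invested
    years = 10             # how long we would otherwise wait before harvesting
    quantity = 1000        # units of the resource
    price = 3.0            # profit per unit, assumed the same now and later

    # Harvest everything today and invest the proceeds for ten years:
    value_of_harvesting_now = quantity * price * (1 + interest_rate) ** years  # about 4887

    # Leave the resource alone and harvest the same quantity in ten years:
    value_of_waiting = quantity * price                                        # 3000

    # Equivalently, discount the future harvest back to a present value:
    present_value_of_future_harvest = value_of_waiting / (1 + interest_rate) ** years  # about 1842

    print("harvest now and invest: %.0f" % value_of_harvesting_now)
    print("wait ten years:         %.0f" % value_of_waiting)
    print("future harvest in today's money: %.0f" % present_value_of_future_harvest)

With any positive interest rate the same comparison favors harvesting now, which is the economists' reasoning summarized above.
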
The last reason that I shall mention for irrational failure
to try to solve a perceived problem is psychological denial.
This is a technical term with a precisely defined meaning
in individual psychology, and it has been taken over into
the pop culture. If something that you perceive arouses
an unbearably painful emotion, you may subconsciously suppress
or deny your perception in order to avoid the unbearable
pain, even though the practical results of ignoring your
perception may prove ultimately disastrous. The emotions
most often responsible are terror, anxiety, and sadness.
Typical examples include refusing to think about the likelihood
that your husband, wife, child, or best friend may be dying,
because the thought is so painfully sad, or else blocking
out a terrifying experience. For example, consider a narrow
deep river valley below a high dam, such that if the dam
burst, the resulting flood of water would drown people
for a long distance downstream. When attitude pollsters
ask people downstream of the dam how concerned they are
about the dam's bursting, it's not surprising that fear
of a dam burst is lowest far downstream, and increases
among residents increasingly close to the dam. Surprisingly,
though, when one gets within a few miles of the dam, where
fear of the dam's breaking is highest, as you then get
closer to the dam the concern falls off to zero! That is,
the people living immediately under the dam who are certain
to be drowned in a dam burst profess unconcern. That is
because of psychological denial: the only way of preserving
one's sanity while living immediately under the high dam
is to deny the finite possibility that it could burst.

Finally, the last of the four items in my road map is the failure to succeed in solving a problem that one does try to solve. There are obvious possible explanations for this outcome. The problem may just be too difficult, and beyond our present capacities to solve. For example, the state of Montana loses hundreds of millions of dollars per year in attempting to combat introduced weed species, such as spotted knapweed and leafy spurge. That is not because Montanans don't perceive these weeds or don't try to eliminate them, but simply because the weeds are too difficult to eliminate at present. Leafy spurge has roots 20 feet deep, too long to pull up by hand, and specific weed-control chemicals cost up to $800 per gallon.

Often, too, we fail to solve a problem because our efforts are too little, begun too late. For example, Australia has suffered tens of billions of dollars of agricultural losses, as well as the extinction or endangerment of most of its native small mammal species, because of introductions of European rabbits and foxes, for which there was no close native counterpart in the Australian environment. Foxes as predators prey on lambs and chickens and kill native small marsupials and rodents. Foxes have been widespread over the Australian mainland for over a century, but until recently they were absent from the Australian island state of Tasmania, because foxes could not swim across the wide, rough seas between the Australian mainland and Tasmania.

Unfortunately, two or three years ago some individuals surreptitiously and illegally released 32 foxes into Tasmania, either for their fox-hunting pleasure or to spite environmentalists. Those foxes represent a big threat to Tasmanian lamb and chicken farmers, as well as to Tasmanian wildlife. When Tasmanian environmentalists became aware of this fox problem around March of 2002, they begged the government to exterminate the foxes quickly while it was still possible. The fox breeding season was expected to begin around July. Once those 32 foxes had produced litters and once those litters had dispersed, it would be far more difficult to eradicate 128 foxes than 32 foxes. Unfortunately, the Tasmanian government debated and delayed, and it was not until around June of 2002 that the government finally decided to commit a million dollars to eliminating foxes. By that time, there was considerable risk that the commitment of money was too little and too late, and that the Tasmanian government would find itself faced with a far more expensive and less soluble problem. I have not yet heard what happened to that fox eradication effort.

Thus,
my reason for discussing failures of human decision-making
is not my desire to depress you. Instead, I hope that,
by recognizing the signposts of failed decision-making,
we may become more consciously aware of how others have
failed, and of what we need to do in order to get it right.
John Brockman,
Editor and Publisher