2013: WHAT *SHOULD* WE BE WORRIED ABOUT?

Gary Marcus
Professor of Psychology, Director, NYU Center for Language and Music; Author, Guitar Zero
Unknown Unknowns

There are known knowns and known unknowns, but what we should be worried about most is the unknown unknowns. Not because they are the most serious risks we face, but because psychology tells us that unclear risks in the distant future are the risks we are least likely to take seriously enough.

At least four distinct psychological mechanisms are at work. First, we are moved more by vivid information than by abstract information (even when the abstract information should in principle dominate). Second, we discount the future, rushing for the dollar now as opposed to the two dollars we could have a year later if we waited. Third, the focusing illusion (itself perhaps driven by the more general phenomenon of priming) tends to make us dwell on our most immediate problems, even if more serious problems loom in the background. Fourth, we have a tendency to believe in a just world, in which nature naturally rights itself.

These four mechanisms likely derive from different sources, some stemming from systems that govern motivation (future discounting), others from systems that mediate pleasure (belief in a just world), others from the structure of our memory (the focusing illusion, and the bias from vividness). Whatever their source, the four together create a potent psychological drive for us to underweight distant future risks that we cannot fully envision. Climate change is a case in point. In 1975, the Columbia University geochemist Wallace S. Broecker wrote an important and prescient article called "Climatic Change: Are We on the Brink of a Pronounced Global Warming?", but his worries were ignored for decades, in part because many people presumed, fallaciously, that nature would somehow automatically set itself right. (And, in keeping with people's tendency to draw their inferences primarily from vivid information, a well-crafted feature film on climate change played a significant role in gathering public attention, arguably far more so than the original article in Science.)

Oxford philosopher Nick Bostrom has pointed out that the three greatest unknowns we should worry about are biotechnology, nanotechnology, and the rise of machines that are more intelligent than human beings. Each sounds like science fiction, and has in fact been portrayed in science fiction, but each poses genuine threats. Bostrom posits "existential risks": possible, if unlikely, calamities that would wipe out our entire species, much as an asteroid appears to have extinguished the dinosaurs.

Importantly, many of these risks, in his judgment, exceed the existential risk posed by other concerns that occupy a considerably greater share of public attention. Climate change may be more likely, and is certainly more vivid, but it is less likely to lead to the complete extinction of the human species (even though it could conceivably kill a significant fraction of us).

The truth is that we simply don't know enough about the potential of biotechnology, nanotechnology, or future iterations of artificial intelligence to calculate what their risks are, yet compelling arguments have been made that in principle any of the three could lead to human extinction. These risks may prove manageable, but I don't think we can manage them if we don't take them seriously. In the long run, biotech, nanotech, and AI are probably significantly more likely to help the species, by increasing productivity and limiting disease, than they are to destroy it. But we need to invest more in figuring out exactly what the risks are, and to better prepare for them. Right now, the US spends more than $2.5 billion a year studying climate change, but (by my informal reckoning) less than 1% of that total studying the risks of biotech, nanotech, and AI.

What we really should be worried about is that we are not quite doing enough to prepare for the unknown.