2015: WHAT DO YOU THINK ABOUT MACHINES THAT THINK?

Sterling Professor of Social and Natural Science, Yale University; Co-author, Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives
Human Culture As The First Artificial Intelligence

For me, AI is not about complex software, humanoid robots, Turing tests, or hopes and fears regarding kind or evil machines. I think the central issue with respect to AI is whether thoughts can exist outside minds. And manufactured machines are not the only example of such a possibility: when I think of AI, I think of human culture and of other forms of (un-self-aware) collective ideation.

Culture is the earliest sort of intelligence outside our own minds that we humans created. Like the intelligence of a machine, culture can solve problems. Moreover, as with the intelligence in a machine, we create culture, interact with it, are affected by it, and can even be destroyed by it. Culture applies its own logic, has a memory, endures after its makers are gone, can be repurposed in supple ways, and can induce action.

So I oxymoronically see culture as a kind of natural artificial intelligence. It is artificial because it is made, manufactured, produced by humans. It is natural in that it is everywhere that humans are, and it comes organically to us. In fact, it's even likely that our biology and our culture are deeply intertwined, and have co-evolved, so that our culture shapes our genes and our genes shape our culture.

Humans are not the only animals to have culture. Many bird and mammal species evince specific cultures related to communication and tool use—ranging from song in birds to sponge use among dolphins. Some animal species even have pharmacopeias. And recent evidence, in fact, shows how novel cultural forms can be experimentally prompted to take root in species other than our own.

We and other animals can evince a kind of thought outside minds in additional ways. Insect and bird groups perform computations by combining the information of many individuals to identify the locations of nests or food. One of the humblest organisms on earth, the amoeboid slime mold Physarum, can, under the right laboratory conditions, exhibit a kind of intelligence, solving mazes and performing other computational feats.

These thinking properties of groups that lie outside individual minds, this natural artificial intelligence, can even be experimentally manipulated. A team in Japan has used swarms of soldier crabs to make a simple computer circuit. They exploited particular elements of crab behavior to construct a laboratory system in which the crabs gave (usually) predictable responses to inputs, so that the swarm could serve as a kind of computer, bending crab behavior to a wholly new purpose. Analogously, Sam Arbesman and I once used a quirk of human behavior to fashion a NOR gate and build a (ridiculously slow) human computer, in a kind of synthetic sociology. We gave humans computer-like properties, rather than giving computers human-like properties.
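The logic behind such "living computers" rests on a standard fact about digital circuits: the NOR gate is functionally complete, so any other gate, and in principle any computation, can be assembled from NOR gates alone, however slow the underlying substrate. The sketch below is a generic Python illustration of that idea, not a description of the actual crab or human experiments.

```python
# A generic illustration (not the actual crab or human experiment) of why
# a network of NOR gates suffices for computation: NOR is functionally
# complete, so every other logic gate can be built from it.

def nor(a: bool, b: bool) -> bool:
    """The only primitive: True only when both inputs are False."""
    return not (a or b)

# Standard constructions of other gates from NOR alone.
def not_(a: bool) -> bool:
    return nor(a, a)

def or_(a: bool, b: bool) -> bool:
    return not_(nor(a, b))

def and_(a: bool, b: bool) -> bool:
    return nor(not_(a), not_(b))

def xor_(a: bool, b: bool) -> bool:
    return and_(or_(a, b), not_(and_(a, b)))

def half_adder(a: bool, b: bool) -> tuple:
    """Add two bits using only NOR-derived gates; returns (sum, carry)."""
    return xor_(a, b), and_(a, b)

if __name__ == "__main__":
    # Whatever plays the role of the NOR gate, silicon, crabs, or people,
    # composing gates this way yields arithmetic.
    for a in (False, True):
        for b in (False, True):
            s, c = half_adder(a, b)
            print(f"{int(a)} + {int(b)} -> sum {int(s)}, carry {int(c)}")
```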

What is the point of this extended analogy between AI and human culture? An examination of our relationship to culture can provide insights into what our relationship to machine AI might be like. We have a love-hate relationship with culture. We fear it for its force—as when religious fundamentalism or fascism whips small or large numbers of people into dangerous acts. But we also revere it because it can do things we cannot do as individuals, like fostering collective action or making life easier by providing unspoken assumptions on which we can base our lives. Moreover, we typically take culture for granted too, just as we already take nascent forms of AI for granted, and just as we will likely take fuller forms of AI for granted. Finally, gene-culture co-evolution might even provide a model for how we and thinking machines might get along over many centuries—mutually affecting each other and co-evolving.

When I think about machines that think, I am therefore just as awestruck by them as I am by culture, and I am no more, or less, afraid of AI than I am of human culture itself.