We need to trust other people. In an information-dense society, you cannot count on your own means alone. You need to trust other people, and what does it mean to trust other people? Does it mean to become gullible? Does it mean to become credulous? I started to work in some specific domains to try to understand what it means to trust other people in order to acquire knowledge, to acquire some reliable information. What do we do? Are we entitled to do this? Is this an appropriate way of using our minds, of making inferences?
I ended up with some interesting empirical research, and also with some interesting normative claims about when we are entitled to trust. When are we rational in trusting, and when are we gullible? There are heuristics, there are biases, and there are different circumstances in which we can be more or less gullible.
My research mixes up a French way of thinking—social sciences and philosophy—and a more analytical and Anglo-Saxon approach, so, in a sense, I'm a foreigner everywhere. I have the privilege of being able to mix up all the possible traditions without feeling that I'm contaminating myself. This is a big freedom, not to belong to any place. I'm an Italian. I have studied analytical philosophy and cognitive science in France, and I was exposed, living in Paris, to the French culture. I feel quite free to use Foucault and Darwin in the same article without feeling intimidated by disciplinary or ideological boundaries. It was more difficult at times to be heard, but it was a big freedom for me to be able to use any possible corpus of knowledge with a lot of personal freedom, which is an important value for me.
I started to mix different literatures and tried to understand how, in certain domains, we make sense of our trust. For example, in the academic domain. That was perfect because it is my field: I work in the academy; I am a researcher at the Centre National de la Recherche Scientifique in Paris. There is a tradition in sociology in France of taking the academy itself as an object of study, of being self-reflexive. Most notably, Pierre Bourdieu, who has been a very influential sociologist in France, worked on how the credibility of knowledge in the academy is created.
I started to work on this, and I discovered many interesting things. In general, we have two kinds of constraints, and in my work I'm interested in explaining and interpreting these two constraints on how we make sense of a certain corpus of knowledge. One kind is structural constraints on that corpus of knowledge. We know, for example, that academic publications form a network of citations. Citation networks have certain structural properties. Citation networks have the tendency of being aristocratic, which means that the rich get richer: the more citations you receive, the more you will receive in the future. That is an effect of the network. It doesn't have anything to do with your will or with your cognitive system. It is just how the network is organized.
This is one kind of constraint that I'm interested in analyzing. It was the famous American sociologist Robert Merton, who was teaching at Columbia University, who called this the Matthew effect. In the New Testament you have the four Gospels (Matthew, Mark, Luke, and John), and Matthew says, roughly, that to everyone who has, more will be given, while from those who have nothing, even what they have will be taken away. Merton used the Matthew effect to describe citation networks and academic prestige. If you're prestigious in academia, you will gain more prestige, and if you're marginal, the tendency is that you will become more and more marginal.
These kinds of effects are structural effects that depend on the shape of the network. Another network that works in this way is PageRank. PageRank is an aristocratic network, so it has the tendency to reward the people who are at the top. Other networks, like illness and epidemics, are sadly very democratic. Contagion is very democratic; it's not aristocratic. Everybody can be contagious. The shape of a phenomenon's network is important for understanding the structural constraints on that phenomenon.
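The rich-get-richer dynamic of an aristocratic network can be sketched as a small preferential-attachment simulation. This is a minimal sketch, not a model from the talk: the paper count, the +1 smoothing that gives uncited papers a small chance, and the seed are all illustrative assumptions.

```python
import random

def preferential_attachment(n_papers, seed=0):
    """Simulate citations where each new paper cites one existing
    paper with probability proportional to the citations that
    paper has already received (plus one, so an uncited paper
    still has a small chance of being found)."""
    random.seed(seed)
    citations = [0]  # the first paper starts with no citations
    for _ in range(1, n_papers):
        weights = [c + 1 for c in citations]  # rich get richer
        target = random.choices(range(len(citations)), weights=weights)[0]
        citations[target] += 1
        citations.append(0)  # the new paper enters uncited
    return citations

counts = preferential_attachment(5000)
# Share of all citations held by the top 1% of papers.
top_share = sum(sorted(counts, reverse=True)[:50]) / sum(counts)
print(f"Top 1% of papers hold {top_share:.0%} of all citations")
```

Under a uniform (democratic) citation rule the top 1% would hold roughly 1% of citations; under this aristocratic rule they hold a far larger share, which is Merton's Matthew effect in miniature.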
PageRank, Facebook, and eBay are all a little different. Their structural constraints are different, so you can predict who goes up and down in these different networks by taking the constraints into account. On the other hand, you have our minds, and all the constraints on the way we deal with a corpus of knowledge that have to do with our biases, our heuristics, and also with our previous knowledge and the fact that we are going to privilege some information. We have an enormous literature today on these cognitive biases. Daniel Kahneman, Gerd Gigerenzer, and so many other people in behavioral economics, social psychology, and cognitive psychology have made us aware of the enormous number of biases we bring to certain phenomena. They can distort the way in which we perceive those phenomena.
My idea is that if you want to understand what knowledge is and what we can extract in a reliable way from this complex bundle of information that invades us, we have to deal with these two aspects: the cognitive constraints on the way in which we perceive a certain corpus of knowledge—the heuristics and the mistakes that we make—and on the other hand, the structural constraints that exist on the corpus.
I don't end up being constructivist or relativistic about knowledge. I don't end up saying, "Well, knowledge is constructed by our psychological biases and by the structural constraints of a certain network or a certain organization of knowledge." I'm not a skeptic about knowledge, but the way in which we construct our knowledge institutions matters a lot in order to understand what will be filtered as knowledge in a certain time span, in a certain era, for a certain society. It is important to know the constraints.
Also, there is a normative part of my work: people should be aware of some of the biases they have in such an informationally dense environment as the Internet. This is something that could be good for education. We sometimes trust chunks of information on the basis of very poor heuristics. This can be easily corrected, for example, just by teaching children to look at how a URL is written: it contains a lot of information about the reliability of the site. It is something that you should be able to do automatically.
I like fieldwork in social science, but I mix up epistemology and social science. My domain, in academic terms, is called social epistemology—how social constraints influence our way of processing knowledge. My own method in social epistemology over the last six or seven years, influenced by the huge impact of the advent of the information society, has been to try to develop a second-order epistemology. What do I mean by second-order epistemology?
First-order epistemology tells you that in order to distinguish knowledge, to pry knowledge apart from belief, you need to check some constraints on a chunk of information. Is it logically structured in the appropriate way? What inferential consequences can be drawn from that chunk of information? Do these consequences, for example, contradict previous knowledge? If they do, be careful. This is first-order epistemology: you try to check the reliability of a chunk of information by some method. Many theories have been developed in the history of thought to check the reliability of information, such as the scientific method.
Second-order epistemology is something that we need today, given that you often cannot check these details of the chunk of information itself. You can check indirect indices of its reliability: its authority, its reputation, where it comes from, who said it, the strength or the weight the person who said it gives to that chunk of information. All this is something new. We have spontaneous ways of attributing authority to other people: "This must be an authoritative guy, because he's on the Edge site." We have a lot of intuitions about how to assess these indirect indices of reliability. In my work I try to make these intuitions a little more controllable. I try to see how these intuitions are sometimes just wrong and hopeless, and how sometimes we can use them as heuristics in order to get information.
Take, for example, the reputation of doctors. This is one of the most interesting examples that I like to cite. I don't know if it's the same in the United States, but it is surely a fact in France and in Italy that if you ask people about their doctor, they will reply that theirs is the best doctor in town. Everybody has the best doctor, which is clearly paradoxical, because we can't all have the best doctor. The way in which we select doctors is very mysterious, because you don't have explicit ratings of doctors. There are websites now that rate doctors, but health is a very sensitive issue, and you give trust to someone for many, many different reasons. In the end, everybody ends up convinced that they have the best doctor.
I try to understand why. What are the cues? What are the heuristics? What are the biases that make us react in this way? When you are in a weak position, you attribute a higher weight to authority, when it should be the opposite: if you're in a weak position, as with your health, you should be more careful. The routine that we use is exactly the opposite, and there are many of these biases that we use every day in order to allocate authority to some sources of information.
The proximity bias is something that I have studied. Just because someone is next to someone more important, he receives part of that person's reputation, which leaks onto him and illuminates him. There is a halo effect that transmits authority from one person to another. Of course you can justify in some ways why two people who are next to each other might share authority, but in many cases this can lead us to wrong conclusions.
That is basically what interests me—the double question of understanding our own biases, but also understanding the potential of using this indirect information, these indirect cues of quality and reputation, in order to navigate this enormous amount of knowledge. What is interesting about the Internet, and especially about the Web, is that it is not only an enormous reservoir of information; it is a reputational device. It accumulates tons of evaluations by other people, so the information you get is pre-evaluated. This makes you go much faster. It relies on an evolutionary heuristic that we have had, probably, since the birth of the human mind.
Follow the people who know how to treat information. Don't go for the solution yourself; follow those who have the solution. This is a super strong drive—to learn faster. Children know this drive very well. Of course it can lead to conformism and have very negative side effects, but it can also make you learn faster. We learn faster, not because there is a lot of information around, but because the information that is around is evaluated; it has a reputational label on it.
My interest in social epistemology is related to my previous interest in cognitive science, the philosophy of cognitive science, and cognitive epistemology. It is a natural transition, because the collective dimension of our knowledge is so huge that you need to know how social structures are organized. Our mind is a piece of a puzzle in which there are many minds connected in many different ways. The way in which we are connected is not like in a brain—neuron connected to neuron—we are connected through social networks. That was just empirical evidence.
When the Internet took off in the '90s, there were a lot of metaphors for what the Internet is. What is this bundle of links? What is this connectivity? One was the brain. Maybe we could use connectionist models of the brain to understand how the Internet works and how it develops, because it was growing and growing. That was a possibility we were all very excited about, because we were working in cognitive science, neuroscience, and the philosophy of neuroscience. Then the computer scientist Jon Kleinberg published a result showing that the Internet is a social network, like the network of people we invite to cocktail parties, or the social networks in our work environment.
We live in many different social networks, and he showed—the work dates from 1999; the article came out in 2000—that the Internet is a social network. In those years, Brin and Page were students at Stanford, and they took up this idea in designing PageRank, which works exactly like a social network. You have several levels: all the possible nodes of the network, let's say the websites, and then you have authorities, a middle layer of nodes that are more authoritative, which doesn't simply mean, in social-network terms, that they receive a lot of interest. These authorities point to some of the websites and make them go up.
What is special about this seems very banal, but the structure is the structure of a social network. What was important for them was that there is a huge difference, in terms of authority and ranking, between a link from the webpage of Gloria Origgi to the webpage of Harvard University and vice versa. If Harvard, which is an authority, points toward me, it makes me go up a lot. If I point to Harvard, my weight is not enough to make such a difference. This asymmetry of the network doesn't exist in the brain. That's why I decided in the early 2000s to move toward the social sciences and to take them seriously. When people discuss trust, even in philosophy, I wanted to understand how social scientists treat this notion and the research around it.
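This asymmetry can be seen in a toy PageRank computation. A minimal sketch of the standard power-iteration algorithm: the six-node graph, the 0.85 damping factor, and the "Harvard"/"my webpage" labels are illustrative assumptions, not the real web graph.

```python
def pagerank(links, n, damping=0.85, iters=100):
    """Basic PageRank by power iteration.
    links maps each node (0..n-1) to the nodes it points to;
    every node must appear as a key."""
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1 - damping) / n] * n
        for src, targets in links.items():
            if targets:
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # dangling node: spread its rank evenly over all nodes
                for t in range(n):
                    new[t] += damping * rank[src] / n
        rank = new
    return rank

# Node 0 plays "Harvard", cited by nodes 1-4; node 5 plays "my webpage".
links = {1: [0], 2: [0], 3: [0], 4: [0], 0: [], 5: []}
base = pagerank(links, 6)

rank_a = pagerank({**links, 0: [5]}, 6)  # Harvard links to my page
rank_b = pagerank({**links, 5: [0]}, 6)  # my page links to Harvard

print(f"gain when Harvard cites me:      {rank_a[5] - base[5]:+.3f}")
print(f"gain for Harvard when I cite it: {rank_b[0] - base[0]:+.3f}")
```

With these numbers, the personal page gains far more from the authority's link than the authority gains from the personal page's link, which is exactly the asymmetry described above.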
I think I was right. We are facing a big change in business life, in which trust and reputation become commodities that we can exchange and trade, on websites like eBay or Airbnb. These are reputational systems: you have to have a good reputation in order to take part in this exchange of houses, apartments, etc., so this is important. But what does it mean?
What does it mean to have a good reputation? What does it mean in general to have a reputation? What do I lose when I lose a reputation? It's not like having a toothache. What did Volkswagen lose when it apparently lost its reputation a month ago because of the scandal about devices built into some of its cars to trick emissions controls? What did Volkswagen lose? What is reputation? Is it only social information, as it seems to be on websites like eBay?
On eBay, reputation is purely social information. You gather evaluations from the social group, and they change your position in the social dynamic; they make you go up and down. But is it the same with "likes," for example? What does it mean to "like" something on Facebook? How should you count the likes? What are you doing when you retouch your pictures on Facebook, or try to have a better reputation online than your actual one? How important is it today to have an e-reputation, a good reputation? Very few people try to understand what's really there. Reputation doesn't exist. It's not like money. It's not like a headache. It just doesn't exist. It exists because someone attributes a reputation to you. It is a social property, the kind of property philosophers talk about: it can change things in your life even though it doesn't exist as a thing. You can also be unaware of having, or not having, a reputation. We are becoming super aware of our own reputation, and we are excited by the fact that we have so many devices today that give us the freedom to manipulate our reputation.
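The idea that a reputation like eBay's is nothing but aggregated social information can be sketched in a few lines. This is only a sketch of the principle; eBay's actual feedback formula differs and has changed over time.

```python
def feedback_summary(ratings):
    """ratings: +1 (positive), 0 (neutral), -1 (negative).
    Returns a net score and the share of positives among
    non-neutral ratings, in the spirit of eBay-style feedback."""
    positives = sum(1 for r in ratings if r > 0)
    negatives = sum(1 for r in ratings if r < 0)
    score = positives - negatives
    judged = positives + negatives
    pct_positive = positives / judged if judged else None
    return score, pct_positive

print(feedback_summary([1, 1, 1, 0, -1]))  # → (2, 0.75)
```

The point of the sketch is that the "reputation" here is nothing over and above other people's evaluations: change the list of ratings and the reputation changes with it.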
How can we manipulate our own reputation? By using the social web in a strategic way. On the other hand, the more freedom we have in dealing with our own reputation, the more freedom other people have to manipulate our reputation, to do things with what we have done.
There is very little literature in the social sciences about reputation, and I have tried to go through it and understand where reputation is treated. You have the notion of social capital, which is somewhat related, but it is not exactly the same. The problem with reputation is that it is not only the opinion of others; it is what you think the opinion of others is. It is something a little more complex.
Many phenomena that are interesting today concern how our minds deal with the social world, so you have to be aware of how our minds are structured, but also of how the social world is structured. What is special about reputation in this era, in this particular configuration of societies invaded by social networks and communication technologies? It is that reputation is a communicative phenomenon; it is something that we transmit.
We find an article that we feel could be interesting, and instead of reading it we just forward it to someone else, or put it on Facebook. The willingness to communicate, to transmit information, is stronger than the willingness to acquire information, which is itself a remarkable phenomenon. You would expect: I am a self-interested guy, or girl—an instance of homo economicus—so if I find a piece of interesting information, I keep it for myself! Why should I share it? Yet we all know that the common behavior today is that if you find something interesting, you share it before reading it.
What does this mean in terms of cognitive science and social science? It means that some social configurations that are around today, made of social institutions, technologies, etc., may highlight certain dispositions of our brains in a special way. We probably have a disposition to share information immediately that was less exploited by earlier social configurations, such as the way knowledge and information circulated fifty years ago, and that is highlighted and made more explicit by today's configuration.
These things are there in our minds; probably all the competences we need to deal with the information around us are already there. Social configurations are complex devices that put together politics, economics, the ways in which social institutions stabilize, cultural phenomena, technology, and the development of technology.
These complex devices change from one generation to another. Some are more stable, some change very rapidly. In a sense they can foster some of our cognitive competences, and conceal some others for a while and then change. That's why I'm so interested in the interplay between these two dimensions.
The communication between the cognitive sciences and the social sciences is a little easier today because of the adoption, by a group of social scientists, of experimental methods, something that was not present in some other traditions of social science. Still, I came from philosophy and the humanities. With all the possible cultural traditions, we have an enormous privilege in these times. Everything is available and everything can be remixed: every part of the culture, from literature to mathematics, astrophysics, and pop music. It can be repackaged in a new way in order to see phenomena that you cannot see within only one discipline.
I'm attached to a pluralistic method of work, and this is the way in which I work. I also write literature in Italy. I'm attached to this creative dimension of mixing up traditions and things. Today, the way in which a part of the social sciences tries to tackle some phenomena is more compatible with the experimental tradition, so it is easier to communicate.
There are traditions in social sciences like economics, and part of sociology, which are highly formalized and that use models, so this is not just poetry. Models can predict something. But what I find interesting today is that you can talk to people who use models and try to say, "Well, this is an interesting prediction, can we just test it experimentally?" This is something that is also creative; it mixes up so many different competences. Take what I'm working on these days. I have the feeling—this is an intuition, and of course I have written about it—that we tend to evaluate better, to give a better reputation to people who reciprocate a little bit, to people who like us, at least a little bit.
We are suspicious of pure reputational free riders, who want to go up in the hierarchies and never reciprocate. We can adore them, but at a certain point we let them down. There are models in the social sciences that formalize the way hierarchies are created. Why does a person go up in a hierarchy? Because many people defer to this person. The idea is that you have many people below and one person at the top. If you're an important person on Twitter, you have many followers, but you don't follow many people. The idea that is around in some of the literature, and that I want to test, is that you prefer those who reciprocate a little bit: you defer to them, but they also defer a little to you. This is an adjustment that we make.
For example, when I submit a paper to the highest-ranked academic journal in philosophy and I am rejected, then I submit a second time and my paper is rejected again, at a certain point what do I do? As we all do in the academic business, you try a less prestigious journal. At a certain point down the hierarchy you find a journal that publishes you. At this point people sometimes start to adjust and say, "Well, this journal is not as bad as people say. In a sense, it's much more accessible. The standards of those guys up there are becoming a little obsolete." That's one way in which hierarchies are adjusted. We adjust a lot in order to prefer those who we think are going to prefer us.
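One way to make this hypothesis concrete is as a toy scoring rule in which esteem combines a target's prestige with a bonus for reciprocation. This is entirely a hypothetical sketch: the weighting and all the numbers are invented for illustration and are not from any tested model.

```python
# Toy sketch of the reciprocity hypothesis: my esteem for a target
# combines the target's prestige with a bonus for how much the
# target reciprocates toward me. All weights are invented.

def esteem(prestige, reciprocation, w_reciprocity=0.6):
    """prestige and reciprocation are scores in [0, 1]."""
    return (1 - w_reciprocity) * prestige + w_reciprocity * reciprocation

# A pure reputational free rider: very prestigious, never reciprocates.
free_rider = esteem(prestige=0.9, reciprocation=0.0)
# A slightly less prestigious journal or person that reciprocates a bit.
reciprocator = esteem(prestige=0.7, reciprocation=0.4)

print(free_rider < reciprocator)  # the reciprocator is preferred
```

On this toy rule, a modest amount of reciprocation outweighs a sizable prestige gap, which is the pattern the experiment described below is meant to test.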
This is something that exists in some formal models in the social sciences; it exists in the formal study of hierarchies. It hasn't been tested, and I'm now testing it with a student. There is now a small tradition of experimental philosophy, and in my lab in Paris there are many experimental philosophers—people who take philosophical questions and try to test them experimentally. We have a lot of fun. What is important to say, and this is the spirit of experimental work, is that sometimes you're wrong. Sometimes your intuitions are completely wrong. Maybe I'm wrong; maybe now that we are testing it, we will discover that people just don't care about being reciprocated. And that is important, because we also have this self-evident authority of scientists. For every positive result that you find published in an academic journal, you have tons of negative results in the graveyards of our labs that are very interesting too.
My approach is, don't think that the truth is only in a special place. The more modest you are and the more pluralistic you are, the better it is for research. That's the way I avoid any possible prejudice. That is also because of my background. I have a very mixed background. I have studied formal logic and Nietzsche and Husserl in my undergraduate studies. Then I studied cognitive science in Paris. It's very important to keep your mind open.
I write literature and I write a lot for newspapers in Italian, because it was my way to keep contact with my mother tongue. Europe is in a very different situation from the United States. We are twenty-seven countries with twenty-seven languages. I sometimes work in Brussels as an expert on advisory boards; in order to start a conversation, you have to wait for the greetings in twenty-seven different languages. Europe is a very different scene, in which trying to keep your difference is something that is important for us. It's like keeping your tradition. We pay an enormous price in Europe for protecting our folklore, in a sense.
I studied humanities and formal logic in my youth, when I was at university in Milano, which was a very good place. We didn't have this idea that there was science on one side and cultural studies on the other. We thought we would be able to navigate culture and navigate science in the same boat. It turned out to be very difficult, even in Europe, where cultural studies have been a big disappointment, in a sense a vulgarization of the values of the humanities, which I still think are important in education. On the other hand, scientific methods were perceived as a tool of domination—a tool for dominating the discourse: the public discourse, the scientific discourse, the discourse about truth.
There is a famous piece on this, the inaugural lecture of Michel Foucault at the Collège de France, called "The Order of Discourse," in which he argues that you should be careful, when a certain discourse is used, whether the scientific, the literary, or the political discourse, about who is controlling it. The scientific discourse was seen not as a tool for thought but as a tool for domination. That is also because the relationship between power and science has changed a lot in the last twenty years. We should be frank about this. This is the case in Europe; I don't know what the scene is in the United States, but research now has a political dimension. What we researchers are doing has become important. States display their reputation through their scientific research. They want to be present in the international rankings. They compete for prestige and for innovation.
Twenty or thirty years ago we were a community in an ivory tower, in which you did your research, and using the scientific method was something very independent of the real world. Now we are pushed to produce research that is useful, and that has changed a little the positive vision of science as an activity of freedom that researchers used to have twenty years ago. The political dimension, at least in Europe, the way in which the political discourse has entered the scientific discourse, has changed the atmosphere a little.
But even if serious people hate post-modernism, because it was laissez-faire—and I don't want to defend post-modernism here—I just want to say something provocative: we are in a post-modern era. Our knowledge is structurally post-modern, which means that there are no more clear filters, no more clear canons that establish what is high and what is low. This gives us an enormous cacophony of possible ways of dealing with culture and knowledge, but also an enormous freedom, a freedom of methods. You can find things; you can mix literature with big data. We have a freedom that we didn't have twenty or thirty years ago.
In a sense we are more uncertain about the results of research, but, at least in intellectual debate, we should avoid polarization. What does that mean? The political debate is so polarized; in the intellectual debate we should be pluralistic. Let's be pluralistic. We can access knowledge in so many different ways, and we don't know which one is going to change the world.