All Videos

The Genomic Ancient DNA Revolution

A New Way to Investigate the Past
[2.1.16]

My experience collaborating with Svante since 2007 has been that the data from the incredible samples they have has yielded surprise after surprise. Nobody had ever gotten to look at data like this before. First there were the Neanderthals, and then there was this pinky bone from southern Siberia. At the end of the Neanderthal project, Svante told me they had this amazing genome-wide data from another archaic human, from a little pinky bone of a little girl from a southern Siberian cave, and asked if I'd like to get involved in analyzing it.

When we analyzed it, it was an incredible surprise: this individual was not a Neanderthal. She was in fact much more distantly related to Neanderthals than any two humans today are to each other, and she was not a modern human. She was some very distant cousin of the Neanderthals, living in Siberia, in Central Asia, at the time this girl lived.

When we analyzed the genome of this little girl, we saw that she was related to people in New Guinea and Australia. A population related to her had contributed about 5 percent of the genomes of people in New Guinea, Australia, and related groups, an interbreeding event nobody had known about before. It was completely unexpected. It wasn't in anybody's philosophy or anybody's prediction. It was a new event that was driven by the data and not by people's presuppositions or previous ideas.

This is what ancient DNA does for us. When you look at the data, it doesn't always just play into one person's theory or another; it doesn't just play into the Indo-European steppe hypothesis or the Anatolian hypothesis. Sometimes it raises something completely new, like the Denisovan finger bone and the gene flow from Denisovans into Australians and New Guineans.
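
For readers curious how a figure like "about 5 percent" is arrived at, here is a deliberately schematic Python sketch of the f4-ratio idea used for this kind of admixture estimate. Everything below is invented for illustration: the allele frequencies are simulated, and the sketch assumes, unrealistically, that the sampled Denisovan genome itself is the mixing source. Real analyses (for example qpF4ratio in the Reich lab's ADMIXTOOLS package) use carefully chosen reference populations and block-jackknife standard errors.

    import numpy as np

    def f4(a, b, c, d):
        # f4 statistic: average over SNPs of (freq_a - freq_b) * (freq_c - freq_d)
        return float(np.mean((a - b) * (c - d)))

    # Simulated allele-frequency arrays, one value per SNP (illustration only).
    rng = np.random.default_rng(1)
    n_snps = 50_000
    outgroup  = rng.uniform(0.05, 0.95, n_snps)                        # e.g. a chimpanzee outgroup
    denisovan = np.clip(outgroup + rng.normal(0, 0.15, n_snps), 0, 1)
    african   = np.clip(outgroup + rng.normal(0, 0.10, n_snps), 0, 1)

    # Build a "New Guinean-like" test population as a 95/5 mixture.
    true_alpha = 0.05
    new_guinean = (1 - true_alpha) * african + true_alpha * denisovan

    # f4-ratio: how much of the Denisovan/African contrast shows up in the test group.
    alpha_hat = (f4(outgroup, denisovan, african, new_guinean)
                 / f4(outgroup, denisovan, african, denisovan))
    print(round(alpha_hat, 3))   # close to 0.05 on this toy data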

DAVID REICH is a geneticist and professor in the Department of Genetics at Harvard Medical School. David Reich's Edge Bio Page


 

The Crusade Against Multiple Regression Analysis

[1.21.16]

A huge range of science projects are done with these multiple regression things. The results are often somewhere between meaningless and quite damaging. ...

I hope that in the future, if I'm successful in communicating with people about this, there'll be a kind of upfront warning in New York Times articles: These data are based on multiple regression analysis. This would be a sign that you probably shouldn't read the article because you're quite likely to get non-information or misinformation.
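
Nisbett's target is causal conclusions drawn from regressions on observational data. A minimal, invented Python simulation (not from the interview) shows the standard failure mode he has in mind: when an unmeasured confounder drives both the predictor and the outcome, the regression coefficient on the predictor looks large even though its true effect is zero.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical setup: an unmeasured trait (say, "motivation") drives both
    # a behavior (say, taking a prep course) and an outcome (a test score).
    # The behavior itself has zero true effect on the outcome.
    confounder = rng.normal(size=n)
    behavior = confounder + rng.normal(size=n)        # who opts in
    outcome = 2.0 * confounder + rng.normal(size=n)   # true effect of behavior = 0

    # Regression that omits the confounder (the situation Nisbett worries about).
    X = np.column_stack([np.ones(n), behavior])
    beta_naive, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    print(round(beta_naive[1], 2))   # ~1.0: a sizable "effect" that is not real

    # The same regression with the confounder included recovers ~0.
    X_full = np.column_stack([np.ones(n), behavior, confounder])
    beta_full, *_ = np.linalg.lstsq(X_full, outcome, rcond=None)
    print(round(beta_full[1], 2))    # ~0.0

The practical difficulty is that in real observational studies you rarely know whether every relevant confounder has been measured, which is why the naive regression above can pass for a finding.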

RICHARD NISBETT is a professor of psychology and co-director of the Culture and Cognition Program at the University of Michigan. He is the author of Mindware: Tools for Smart Thinking and The Geography of Thought. Richard Nisbett's Edge Bio Page


 

What is Reputation?

[11.5.15]

That is basically what interests me: the double question of understanding our own biases, but also understanding the potential of using this indirect information, these indirect cues of quality and reputation, in order to navigate this enormous amount of knowledge. What is interesting about the Internet, and especially about the Web, is that the Internet is not only an enormous reservoir of information, it is a reputational device. It accumulates tons of evaluations from other people, so the information you get is pre-evaluated. This makes you go much faster. This is an evolutionary heuristic that we have probably had since the birth of the human mind.

Follow the people who know how to treat information. Don't go looking for the solution yourself; follow those who have the solution. This is a super strong drive: to learn faster. Children know this drive very well. Of course it can lead you to conformism and have very negative side effects, but it can also make you know faster. We know faster not because there is a lot of information around, but because the information that is around is evaluated; it has a reputational label on it.

GLORIA ORIGGI is a researcher at the Centre National de la Recherche Scientifique in Paris and a journalist. She is a best-selling novelist in the Italian language, a respected philosopher in French, a cognitive scientist in English, and the person you want to sit next to at a dinner party. Her latest book, La Réputation, was recently published in France. Gloria Origgi's Edge Bio Page


 

Choosing Empathy

[10.20.15]

If you believe that you can harness empathy and make choices about when to experience it versus when not to, it adds a layer of responsibility to how you engage with other people. If you feel like you're powerless to control your empathy, you might be satisfied with whatever biases and limits you have on it. You might be okay with not caring about someone just because they're different from you. I want people to not feel safe empathizing in the way that they always have. I want them to understand that they're doing something deliberate when they connect with someone, and I want them to own that responsibility.

JAMIL ZAKI is an assistant professor of psychology at Stanford University and the director of the Stanford Social Neuroscience Lab.  Jamil Zaki's Edge Bio Page


 

Edge Master Class 2015: A Short Course in Superforecasting, Class V

Condensing it All Into Four Big Problems and a Killer App Solution
[9.22.15]

The beauty of forecasting tournaments is that they’re pure accuracy games that impose an unusual monastic discipline on how people go about making probability estimates of the possible consequences of policy options. It’s a way of reducing escape clauses for the debaters, as well as reducing motivated reasoning room for the audience.

Tournaments, if they’re given a real shot, have the potential to raise the quality of debates by incentivizing competition to be more accurate and reducing the functionalist blurring that makes it so difficult to figure out who is closer to the truth.
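
As a concrete illustration of what a "pure accuracy game" means, here is a small Python sketch of Brier scoring, the squared-error metric standardly used to score probability forecasts in tournaments like the Good Judgment Project. The binary-question version shown here is simplified, and the forecasters and numbers are invented.

    import numpy as np

    def brier_score(forecasts, outcomes):
        # Mean squared gap between predicted probabilities and what happened
        # (1 = event occurred, 0 = it did not). Lower is better: 0 is perfect,
        # and 0.25 is what always forecasting 50% earns on binary questions.
        forecasts = np.asarray(forecasts, dtype=float)
        outcomes = np.asarray(outcomes, dtype=float)
        return float(np.mean((forecasts - outcomes) ** 2))

    # Two hypothetical forecasters on the same five binary questions.
    outcomes = [1, 0, 0, 1, 1]
    print(round(brier_score([0.9, 0.2, 0.1, 0.8, 0.7], outcomes), 3))  # 0.038, sharp and calibrated
    print(round(brier_score([0.5, 0.5, 0.5, 0.5, 0.5], outcomes), 3))  # 0.25, hedging everything at 50%

Because every forecast is a number and every outcome eventually resolves, there is no room for the rhetorical escape clauses the excerpt describes: the score settles who was closer to the truth.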


[29:26 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class V

Condensing it All Into Four Big Problems and a Killer App Solution
[9.22.15]

The beauty of forecasting tournaments is that they’re pure accuracy games that impose an unusual monastic discipline on how people go about making probability estimates of the possible consequences of policy options. It’s a way of reducing escape clauses for the debaters, as well as reducing motivated reasoning room for the audience.

Tournaments, if they’re given a real shot, have the potential to raise the quality of debates by incentivizing competition to be more accurate and reducing the functionalist blurring that makes it so difficult to figure out who is closer to the truth.


[24:43 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class IV

Skillful Backward and Forward Reasoning in Time: Superforecasting Requires "Counterfactualizing"
[9.15.15]

A famous economist, Albert Hirschman, had a wonderful phrase, "self-subversion." Some people, he thought, were capable of thinking in self-subverting ways. What would a self-subverting liberal or conservative say about the Cold War? A self-subverting liberal might say, "I don’t like Reagan. I don’t think he was right, but yes, there may be some truth to the counterfactual that if he hadn’t been in power and doing what he did, the Soviet Union might still be around." A self-subverting conservative might say, "I like Reagan a lot, but it’s quite possible that the Soviet Union would have disintegrated anyway because there were lots of other forces in play."

Self-subversion is an integral part of what makes superforecasting cognition work. It’s the willingness to tolerate dissonance. It’s hard to be an extremist when you engage in self-subverting counterfactual cognition. That’s the first example. The second example deals with how regular people think about fate and how superforecasters think about it, which is, they don’t. Regular people often invoke fate, "it was meant to be," as an explanation for things.


[33:47 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class IV

Skillful Backward and Forward Reasoning in Time: Superforecasting Requires "Counterfactualizing"
[9.15.15]

A famous economist, Albert Hirschman, had a wonderful phrase, "self-subversion." Some people, he thought, were capable of thinking in self-subverting ways. What would a self-subverting liberal or conservative say about the Cold War? A self-subverting liberal might say, "I don’t like Reagan. I don’t think he was right, but yes, there may be some truth to the counterfactual that if he hadn’t been in power and doing what he did, the Soviet Union might still be around." A self-subverting conservative might say, "I like Reagan a lot, but it’s quite possible that the Soviet Union would have disintegrated anyway because there were lots of other forces in play."
        
Self-subversion is an integral part of what makes superforecasting cognition work. It’s the willingness to tolerate dissonance. It’s hard to be an extremist when you engage in self-subverting counterfactual cognition. That’s the first example. The second example deals with how regular people think about fate and how superforecasters think about it, which is, they don’t. Regular people often invoke fate, "it was meant to be," as an explanation for things.


[34:14 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class III

Counterfactual History: The Elusive Control Groups in Policy Debates
[9.1.15]

There's a picture of two people on slide seventy-two, one of whom is one of the most famous historians in the 20th century, E.H. Carr, and the other of whom is a famous economic historian at the University of Chicago, Robert Fogel. They could not have more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence, they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did—almost a prescription for hindsight bias.

Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. You had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who’s one of the pivotal people in economic history; he won a Nobel Prize. But there’s this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is, is something that is worth exploring.
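
Fogel's point is essentially the potential-outcomes view of causal inference. In the usual notation (a standard formulation, not something from the class), the causal effect of a putative cause C on an outcome Y for a single historical case i is

    \tau_i = Y_i(C = 1) - Y_i(C = 0)

and history only ever lets us observe one of the two terms. The other, the counterfactual, has to be supplied by assumption, by an explicit model, or by a comparison case, which is exactly the elusive control group of this class's title.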


[46:31 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 

Edge Master Class 2015: A Short Course in Superforecasting, Class III

Counterfactual History: The Elusive Control Groups in Policy Debates
[9.1.15]

There's a picture of two people on slide seventy-two, one of whom is one of the most famous historians in the 20th century, E.H. Carr, and the other of whom is a famous economic historian at the University of Chicago, Robert Fogel. They could not have more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence, they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did—almost a prescription for hindsight bias.

Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. You had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who’s one of the pivotal people in economic history; he won a Nobel Prize. But there’s this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is, is something that is worth exploring.


[31:07 minutes]

PHILIP E. TETLOCK is the Annenberg University Professor at the University of Pennsylvania, with appointments in Wharton, psychology and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study, the author of Expert Political Judgment and (with Aaron Belkin) Counterfactual Thought Experiments in World Politics, and co-author (with Dan Gardner) of Superforecasting: The Art & Science of Prediction (forthcoming, September 2015). Philip Tetlock's Edge Bio Page.


 
