(ED. NOTE: In 2015, Edge presented "A Short Course in Superforecasting" with political and social scientist Philip Tetlock. Superforecasting is back in the news this week thanks to the UK news coverage of comments by Boris Johnson's chief adviser Dominic Cummings, who urged journalists to "read Philip Tetlock's Superforecasters [sic], instead of political pundits who don't know what they're talking about.")
PHILIP E. TETLOCK, political and social scientist, is the Annenberg University Professor at the University of Pennsylvania, with appointments in the Wharton School, psychology, and political science. He is co-leader of the Good Judgment Project, a multi-year forecasting study; author of Expert Political Judgment; co-author of Counterfactual Thought Experiments in World Politics (with Aaron Belkin); and co-author of Superforecasting: The Art & Science of Prediction (with Dan Gardner). Further reading on Edge: "How to Win at Forecasting: A Conversation with Philip Tetlock" (December 6, 2012). Philip Tetlock's Edge Bio Page.
CLASS I — Forecasting Tournaments: What We Discover When We Start Scoring Accuracy
It is as though high-status pundits have learned a valuable survival skill, and that survival skill is mastering the art of appearing to go out on a limb without actually going out on a limb. They say dramatic things, but vague verbal quantifiers are attached to those dramatic things. It sounds as though they're saying something very compelling and riveting. A scenario has been conjured up in your mind of something either very good or very bad. It's vivid, easily imaginable.
It turns out, on close inspection, they're not really saying that's going to happen. They're not specifying the conditions, or a time frame, or a likelihood, so there's no way of assessing accuracy. You could say these pundits are just doing what a rational pundit would do, because they know they live in a somewhat stochastic world. They know it's a world that will frequently throw surprises at them, so to maintain their credibility with their community of co-believers they need to be vague. It's an essential survival skill. There is considerable truth to that, and forecasting tournaments are a very different way of proceeding. Forecasting tournaments require people to attach explicit probabilities to well-defined outcomes in well-defined time frames so you can keep score.
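For readers who want to see what "keeping score" looks like in practice, here is a minimal sketch, not part of the talk, using the Brier score, a standard accuracy measure for probability forecasts of the kind used in Tetlock's tournaments. The questions, probabilities, and outcomes below are invented for illustration, and the simple one-sided form of the score is used.

```python
# Minimal sketch of Brier scoring for a forecasting tournament.
# All forecasts and outcomes are made-up illustrative numbers.

def brier_score(forecast_prob: float, outcome: int) -> float:
    """Squared error between a probability forecast and what actually happened.
    One-sided convention: (p - outcome)^2, so 0.0 is perfect and 1.0 is worst."""
    return (forecast_prob - outcome) ** 2

# Each entry: (well-defined question, probability assigned, resolved outcome 1/0)
forecasts = [
    ("Event A resolves YES by 31 Dec", 0.80, 1),
    ("Event B resolves YES by 31 Dec", 0.30, 0),
    ("Event C resolves YES by 31 Dec", 0.60, 0),
]

scores = [brier_score(p, y) for _, p, y in forecasts]
mean_brier = sum(scores) / len(scores)

for (question, p, y), s in zip(forecasts, scores):
    print(f"{question}: forecast={p:.2f}, outcome={y}, Brier={s:.3f}")
print(f"Mean Brier score: {mean_brier:.3f}")  # lower is better
```

Lower is better: a fully confident correct forecast scores 0, a fully confident wrong one scores 1 under this convention. A pundit who never commits to an explicit probability on a well-defined question simply cannot be scored at all, which is the point of the paragraph above.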
CLASS II — Tournaments: Prying Open Closed Minds in Unnecessarily Polarized Debates
Tournaments have a scientific value. They help us test a lot of psychological hypotheses about the drivers of accuracy, and they help us test statistical ideas; there are a lot of ideas we can test in tournaments. Tournaments also have value inside organizations and businesses. A more accurate probability helps to price options better on Wall Street, so they have value there too.
I wanted to focus more on what I see as the wider societal value of tournaments and their potential value in depolarizing unnecessarily polarized policy debates. In short, making us more civilized. ...
There is a well-developed research literature on how to measure accuracy. There is not such a well-developed research literature on how to measure the quality of questions. The quality of questions is going to be absolutely crucial if we want tournaments to be able to play a role in tipping the scales of plausibility in important debates, and if we want tournaments to play a role in incentivizing people to behave more reasonably in debates.
CLASS III — Counterfactual History: The Elusive Control Groups in Policy Debates
There's a picture of two people on slide seventy-two, one of whom is one of the most famous historians of the 20th century, E.H. Carr, and the other of whom is a famous economic historian at the University of Chicago, Robert Fogel. They could not have had more different attitudes toward the importance of counterfactuals in history. For E.H. Carr, counterfactuals were a pestilence; they were a frivolous parlor game, a methodological rattle, a sore loser's history. It was a waste of cognitive effort to think about counterfactuals. You should think about history the way it did unfold and figure out why it had to unfold the way it did: almost a prescription for hindsight bias.
Robert Fogel, on the other hand, approached it more like a scientist. He quite correctly recognized that if you want to draw causal inferences from any historical sequence, you have to make assumptions about what would have happened if the hypothesized cause had taken on a different value. That's a counterfactual. So you had this interesting tension. Many historians do still agree, in some form, with E.H. Carr. Virtually all economic historians would agree with Robert Fogel, who is one of the pivotal people in economic history; he won a Nobel Prize. But there's this very interesting tension between people who are more open or less open to thinking about counterfactuals. Why that is so is worth exploring.
CLASS IV — Skillful Backward and Forward Reasoning in Time: Superforecasting Requires "Counterfactualizing"
A famous economist, Albert Hirschman, had a wonderful phrase, "self-subversion." Some people, he thought, were capable of thinking in self-subverting ways. What would a self-subverting liberal or conservative say about the Cold War? A self-subverting liberal might say, "I don’t like Reagan. I don’t think he was right, but yes, there may be some truth to the counterfactual that if he hadn’t been in power and doing what he did, the Soviet Union might still be around." A self-subverting conservative might say, "I like Reagan a lot, but it’s quite possible that the Soviet Union would have disintegrated anyway because there were lots of other forces in play."
Self-subversion is an integral part of what makes superforecasting cognition work. It’s the willingness to tolerate dissonance. It’s hard to be an extremist when you engage in self-subverting counterfactual cognition. That’s the first example. The second example deals with how regular people think about fate and how superforecasters think about it, which is, they don’t. Regular people often invoke fate, "it was meant to be," as an explanation for things.
CLASS V — Condensing it All Into Four Big Problems and a Killer App Solution
The beauty of forecasting tournaments is that they’re pure accuracy games that impose an unusual monastic discipline on how people go about making probability estimates of the possible consequences of policy options. It’s a way of reducing escape clauses for the debaters, as well as reducing motivated reasoning room for the audience.
Tournaments, if they’re given a real shot, have the potential to raise the quality of debates by incentivizing competition to be more accurate and by reducing the functionalist blurring that makes it so difficult to figure out who is closer to the truth.
In the circle of clairvoyants: At a vineyard north of San Francisco, Philip Tetlock of the University of Pennsylvania (left) presented his findings. Nobel laureate Daniel Kahneman (third from left) was initially skeptical. Photo: John Brockman / edge.org
ATTENDEES:
Robert Axelrod, Political Scientist; Walgreen Professor for the Study of Human Understanding, U. Michigan; Author, The Evolution of Cooperation; Member, National Academy of Sciences; Recipient, National Medal of Science
Stewart Brand, Founder, The Whole Earth Catalog; Co-Founder, The Well; Co-Founder, The Long Now Foundation; Author, Whole Earth Discipline
John Brockman, Editor, Edge; Author, The Third Culture
Rodney Brooks, Panasonic Professor of Robotics (emeritus), MIT; Founder, Chairman/CTO, Rethink Robotics; Author, Flesh and Machines
Brian Christian, Philosopher, Computer Scientist, Poet; Author, The Most Human Human
Wael Ghonim, Pro-democracy leader of the Tahrir Square demonstrations in Egypt; Anonymous administrator of the Facebook page "We are all Khaled Saeed"
W. Daniel Hillis, Physicist; Computer Scientist; Chairman, Applied Minds; Author, The Pattern on the Stone
Jennifer Jacquet, Assistant Professor of Environmental Studies, NYU; Author, Is Shame Necessary?
Daniel Kahneman, Professor Emeritus of Psychology, Princeton; Author, Thinking, Fast and Slow; Winner of the 2013 Presidential Medal of Freedom; Recipient of the 2002 Nobel Prize in Economic Sciences
Salar Kamangar, Senior Vice President, Google; Former head of YouTube
Dean Kamen, Inventor and Entrepreneur, DEKA Research
Andrian Kreye, Feuilleton Editor, Süddeutsche Zeitung, Munich
Peter Lee, Corporate VP, Microsoft Research; Former Founder/Director, DARPA's technology office; Former Head, Carnegie Mellon Computer Science Department and CMU Vice Provost for Research
Margaret Levi, Political Scientist; Director, Center for Advanced Study in the Behavioral Sciences (CASBS), Stanford University
Barbara Mellers, Psychologist; George Heyman University Professor, University of Pennsylvania; Past President, Society for Judgment and Decision Making
Ludwig Siegele, Technology Editor, The Economist
Rory Sutherland, Executive Creative Director and Vice-Chairman, OgilvyOne London; Vice-Chairman, Ogilvy & Mather UK; Columnist, The Spectator
Philip Tetlock, Political and Social Scientist; Annenberg University Professor, University of Pennsylvania; Author, Expert Political Judgment; and (with Dan Gardner) Superforecasting (forthcoming)
Anne Treisman, James S. McDonnell Distinguished University Professor Emeritus of Psychology, Princeton; Recipient, National Medal of Science
D.A. Wallach, Recording Artist; Songwriter; Artist in Residence, Spotify; Hi-Tech Investor