
Philip E. Tetlock

Superforecasting Summary

12 min read · 4.1 (20,576 ratings)

Good forecasting isn't a gift but a set of skills anyone can develop with practice and the right mindset.

1. Forecasting is a Skill, Not a Superpower

Many people think forecasting is reserved for experts or those with innate talent. But Superforecasting argues that anyone can develop this ability through practice, analysis, and sound methods. Forecasting is part of daily life, whether you’re deciding on a career move, an investment, or travel plans.

Superforecasters demonstrate how deliberate effort leads to dramatic improvement in prediction skills. For example, Bill Flack, a retiree with no formal training as a forecaster, has become renowned for his precise predictions about global events. His success demonstrates that expertise isn't about titles or roles but about practices.

Another key revelation from the research behind the Good Judgment Project (GJP) is that superforecasting skills are teachable. Participants, like Flack, have consistently shown that success depends on adopting the right mindset and techniques, not prior experience. This dispels common myths that forecast accuracy is largely innate.

Examples

  • Bill Flack learned and practiced forecasting techniques, outperforming 98% of other volunteers in the GJP.
  • IARPA's forecasting tournament revealed ordinary people could make predictions 60-78% more accurate than most experts when taught specific methods.
  • The GJP’s participants came from diverse backgrounds, from homemakers to engineers, showing that anyone can excel with effort.

2. Approach Questions with Structured Thought

Superforecasters excel by logically breaking down complex questions into smaller components. They resist making hasty, instinct-driven judgments. Instead, they meticulously analyze various factors to ensure objectivity in their thinking.

For instance, when tasked with predicting whether evidence of poisoning would be found in Yasser Arafat's remains, superforecasters first verified that such a test was scientifically feasible; only then did they weigh geopolitical motives. This systematic approach helped avoid bias and anchored predictions in evidence-based reasoning.

This habit of question deconstruction applies to a range of challenges, from global events to everyday problems. By addressing queries as a combination of smaller parts, forecasters ensure no critical angle is ignored.
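The decomposition habit can be sketched numerically. The sub-questions and probabilities below are illustrative, not taken from the book; the point is only that a hard question becomes a chain of easier conditional estimates whose product is the forecast.

```python
# A minimal sketch of question decomposition (Fermi-style):
# instead of guessing "will evidence of poisoning be found?" outright,
# estimate each sub-question and combine. All numbers are illustrative.

sub_estimates = {
    "test is scientifically feasible": 0.80,
    "remains are exhumed and tested": 0.70,
    "traces survive and are reported": 0.50,
}

# If every step must succeed for the overall event, the combined
# probability is the product of the conditional estimates.
overall = 1.0
for question, p in sub_estimates.items():
    overall *= p
    print(f"{question}: {p:.0%}  (running estimate: {overall:.0%})")

print(f"Overall forecast: {overall:.0%}")  # 0.8 * 0.7 * 0.5 = 28%
```

Each sub-estimate can then be revisited independently as new information arrives, instead of re-debating the whole question from scratch.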

Examples

  • Superforecasters investigated polonium's detectability before discussing political implications in assessing Arafat’s poisoning.
  • Complex global decisions, like predicting Arctic ice levels, were tackled in manageable steps to incorporate distinct elements, such as seasonal trends and scientific reports.
  • The split-question technique helped evaluate contradictory factors, like South Africa's need to balance its decisions regarding the Dalai Lama and China.

3. Avoid Bias by Welcoming Contradictory Perspectives

While most people seek evidence that supports their beliefs, superforecasters go out of their way to find contradicting viewpoints. This curiosity allows them to challenge their assumptions and refine their predictions.

For instance, when examining a politically sensitive topic, they rephrase questions to counteract ingrained biases. In the Dalai Lama visa scenario, instead of asking, “Will South Africa allow it?” they flipped it to “Will South Africa deny it?” This prompted consideration of why the government might refuse such a visa, broadening their perspective.

This unbiased mindset is also seen in how they actively gather varied opinions or feedback. Rather than defending their original stance, they remain open to new arguments, treating predictions as hypotheses to be tested and improved.

Examples

  • Superforecasters rewrote the Dalai Lama question to consider both granting and denying the visa, avoiding one-sided analysis.
  • Forecasts on Russian territorial moves factored in motivations for annexing land and reasons not to, reducing tunnel vision.
  • Collaborative discussions in teams helped superforecasters refine ideas, identify weak spots, and balance perspectives.

4. Embrace Probabilistic Thinking

Instead of viewing outcomes as “definitely going to happen” or “definitely not,” superforecasters think in probabilities. They factor uncertainty into their decisions, weighing each possibility without overconfidence or hesitation.

President Obama’s handling of the Osama bin Laden intelligence illustrates this philosophy. Faced with estimates ranging from 30% to 95%, he concluded the odds amounted to a “fifty-fifty” call. The phrasing sounds ambiguous, but it reflected his acceptance of irreducible uncertainty and his willingness to decide under risk.

By avoiding absolute conclusions, superforecasters improve both accuracy and adaptability. Probabilistic thinking forces them to respond dynamically to updates instead of clinging to one-sided assumptions.
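The Good Judgment Project scored forecasts with Brier scores, and the arithmetic shows why nuanced probabilities pay off: being confidently wrong costs far more than being cautiously wrong. A minimal sketch (the probabilities are illustrative):

```python
# Brier score for a binary event: the squared gap between the stated
# probability and what actually happened (0 = perfect, lower is better).

def brier(forecast: float, outcome: int) -> float:
    """Squared error between stated probability and the 0/1 outcome."""
    return (forecast - outcome) ** 2

# Event happens (outcome = 1):
print(f"{brier(0.70, 1):.2f}")  # 0.09 -- hedged and right: small penalty
print(f"{brier(1.00, 1):.2f}")  # 0.00 -- certain and right: perfect
# Event does not happen (outcome = 0):
print(f"{brier(0.70, 0):.2f}")  # 0.49 -- hedged and wrong
print(f"{brier(1.00, 0):.2f}")  # 1.00 -- certain and wrong: worst score
```

A vague “maybe” cannot be scored at all, which is exactly why superforecasters commit to numbers like 20% or 70%.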

Examples

  • Obama’s balanced approach in bin Laden discussions respected diverse probability estimates, facilitating clearer decisions.
  • Robert Rubin clarified differences between an 80% chance and full certainty, preventing oversimplification in crucial financial matters.
  • Superforecasters assigned nuanced percentages, such as “20%” or “70%,” to predictions instead of relying on vague terms like “maybe.”

5. Continuously Update Predictions

Superforecasters don't stick stubbornly to their original judgments. Instead, they alter forecasts whenever new information emerges. This practice ensures accuracy evolves over time.

Bill Flack serves as a striking example. When predicting whether polonium would be found in Yasser Arafat's remains, he frequently revised his forecasts after seeing fresh news, which made his judgments more accurate over the long term.

On the other hand, failing to adjust predictions can lead to errors. In one instance, Flack misjudged whether Japan’s Prime Minister Shinzo Abe would visit a controversial shrine because he dismissed insider hints that Abe intended to go. This underscores the importance of refining beliefs incrementally.
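Tetlock links this style of incremental updating to Bayes’ theorem: revise the prior belief in proportion to the strength of each new piece of evidence. A minimal sketch in odds form, with illustrative numbers (the likelihood ratios are assumptions, not from the book):

```python
# Bayesian-style incremental updating: each piece of news is expressed
# as a likelihood ratio (LR > 1 supports the event, LR < 1 cuts against it).

def update(prior: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * LR."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

belief = 0.30                      # initial forecast: 30%
for lr in [2.0, 1.5, 0.8]:         # three news items of varying strength
    belief = update(belief, lr)
    print(f"after evidence (LR={lr}): {belief:.0%}")
```

Note how the final item (LR = 0.8) nudges the belief down only slightly: weak contrary evidence warrants a small revision, not a reversal, which anticipates the next point about overreaction.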

Examples

  • Bill Flack adjusted his forecast multiple times until his prediction regarding Arafat's remains aligned with verification timelines.
  • Unexpected events, like Abe’s shrine visit, highlighted how insufficient updates can backfire.
  • Monitoring Arctic sea ice trends, Doug Lorch revised predictions to better align with recently released reports.

6. Beware of Overreacting to New Data

While updating forecasts is critical, overreacting to incoming information can be just as harmful. Superforecasters must weigh data carefully, determining its relevance before making changes.

For example, Doug Lorch revised his prediction for Arctic ice too heavily based on outdated research. As a result, his forecast diverged from the actual outcome, highlighting how bias can arise from excessive trust in unverified details.

Learning to balance reactions to data sharpens judgment. It’s easy to overcorrect due to emotionally charged or sudden information, but accuracy depends on measured, evidence-driven updates.

Examples

  • Lorch’s premature Arctic ice correction showed the risks of relying too much on incomplete sources.
  • New controversies, from climate policy to global economic shifts, routinely test forecasters’ tendency to overreact.
  • News releases predicting countries’ military behavior pushed forecasters to recalibrate without losing objectivity.

7. Actively Seek Feedback to Improve

Feedback differentiates experienced decision-makers from inconsistent ones. Professions like meteorology thrive on constant feedback loops, allowing workers to refine methodology and advance their accuracy.

In contrast, police officers who size up suspects rarely learn whether their assessments were correct, so their errors go uncorrected. Superforecasters deliberately design their practice to secure consistent feedback and fix their weaknesses, much as a pilot fine-tunes flight performance.

This ongoing self-assessment habit ensures superforecasters maintain sharpness and adapt to challenges over time.

Examples

  • Meteorologists saw improved weather predictions after reviewing past successes and failures.
  • Bridge players excelled with instant game result reviews, sharpening strategies for tournaments.
  • Superforecasters in the GJP used detailed reviews of forecasting tournaments for consistent skill development.

8. Growth Mindset Enables Progress

Superforecasters embody a growth mindset, viewing forecasting as a skill that improves with learning, practice, and perseverance. Economist Mary Simpson, for instance, used her failure to foresee the 2007 financial crisis as motivation to rebuild her prediction skills.

This mindset starkly contrasts with a fixed mindset, where individuals believe talents are static and untouchable. As psychologist Carol Dweck outlines, embracing room for growth fosters better resilience, learning, and adaptability.

Superforecasters focus on refining strategies, not lamenting failures. Every wrong prediction serves as an opportunity to train, adapt, and try again.

Examples

  • Mary Simpson’s challenges during the financial crisis led her to become a top participant in the GJP.
  • Economist John Maynard Keynes learned from massive investment losses, emerging as a more skilled theorist and investor.
  • Superforecasters treated errors like stepping stones rather than setbacks, showing consistent improvement.

9. Practice Enhances Mastery

Every master, whether a pilot, athlete, or forecaster, improves through deliberate practice. But practice must be paired with thoughtful evaluation to uncover blind spots and opportunities for improvement.

For example, a novice can read all the physics of cycling and still fall off a bike; only guided practice teaches balance. Superforecasting is no exception: it requires firsthand experience, continuous analysis, and trial-and-error learning.

Performance improves with aligned effort and the discipline to pair mistakes with optimized correction strategies.

Examples

  • Pilots use simulators to refine flight techniques through hands-on scenarios and adjustments.
  • Police officers who never learn whether their judgments of suspects were right miss the feedback needed to improve.
  • Participating superforecasters continually self-correct after observing gaps between prediction and reality.

Takeaways

  1. Break big decisions into smaller, bite-sized questions and answer each one methodically before forming conclusions.
  2. Build the habit of revisiting your predictions regularly with new information, and keep a clear record of updates and adjustments.
  3. Actively seek diverse perspectives and feedback from others to challenge your beliefs and refine your understanding.
