“How can we make better decisions in an uncertain world? By thinking in bets, we are forced to embrace uncertainty and assess the probability of different outcomes.”

1. Decision Quality and Outcome Quality Aren’t the Same

Many people equate the quality of a decision with the quality of its outcome, but they’re not the same thing. A great decision can produce a bad result, and a poor decision a good one, because factors beyond our control also shape outcomes. Take Seattle Seahawks coach Pete Carroll in Super Bowl XLIX: his decision to pass instead of run was sound given the game’s context, but when the ball was intercepted it produced a loss and immense public backlash.

It’s natural to focus on the result without considering the facts that were available during the decision-making process. Life and poker have this in common: both are probabilistic games. Because outcomes are shaped by both skill and luck, isolating the role of decision-making is difficult, but it is necessary for improvement.

Recognizing this distinction lets us accept uncertainty as part of life. When we stop asking whether a decision was “right” or “wrong” and instead ask how reasonable it was given the circumstances, we can learn from our decisions rather than judging ourselves unfairly.

Examples

  • Pete Carroll’s controversial play call shows how sound logic can lead to a bad result.
  • People often see not crashing after driving drunk as evidence the decision to drive intoxicated was sound.
  • A poker hand with only a 24% chance of winning ends up winning; the favorable outcome doesn’t prove the call was right, just as the long odds alone don’t prove it was wrong (see the sketch after this list).
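
As a rough illustration not drawn from the book, here is a minimal Python sketch of why outcomes are a noisy signal of decision quality: a call that wins 76% of the time still loses about a quarter of individual hands, yet its long-run expected value is what makes it a good decision. The pot and bet sizes are hypothetical.

```python
import random

def simulate_average_profit(win_prob, win_amount=100, lose_amount=50,
                            trials=100_000, seed=42):
    """Repeatedly take the same bet and return the average profit per trial.

    win_prob is the chance the hand wins; win_amount and lose_amount are
    hypothetical stakes chosen only for illustration.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += win_amount if rng.random() < win_prob else -lose_amount
    return total / trials

# Judge the decision by its expected value, not by any single outcome.
analytic_ev = 0.76 * 100 - 0.24 * 50          # +64 per hand on average
print(f"Analytic EV of the 76% call: {analytic_ev:+.1f}")
print(f"Simulated long-run average:  {simulate_average_profit(0.76):+.1f}")
print("...yet roughly 24% of individual hands still lose.")
```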

2. Our Brains Are Hardwired to Believe First, Scrutinize Later

Evolution has wired humans to initially accept information as true, rather than question it. Early survival depended on quick reactions—such as fleeing from potential predators—rather than objective analysis. This influences how we form beliefs today, as initial impressions are rarely subjected to scrutiny unless we actively choose to question them.

Daniel Gilbert’s 1993 study at Harvard demonstrated how easily people make mistakes in belief evaluation, especially under mental strain. Participants were more prone to accept false messages as truth when cognitively overloaded. This automatic trust in information can lead us astray when making decisions without evidence or deeper thought.

Challenging this instinct takes effort. Asking “Wanna bet?” about our beliefs forces us to reexamine them critically. When stakes are introduced, we become more careful in evaluating information, seeking coherence and evidence to back our claims.

Examples

  • Evolution made humans prioritize quick reactions, such as running from rustling in tall grass, rather than debating its source.
  • Gilbert’s study showed that participants accepted false claims when under cognitive load.
  • Betting money on a statement’s truth encourages deeper thinking compared to casual agreement.

3. Not All Outcomes Are Equally Instructive

Outcomes have the potential to shape our future decisions, but determining whether a result was primarily due to luck, skill, or uncontrollable variables is not straightforward. Failing to pinpoint the right cause might lead us to draw the wrong lessons from past experiences.

In poker, players analyze outcomes to determine whether their strategy led to a loss or whether luck played the bigger role. Outside poker, self-serving bias hinders this kind of learning: when people get good results, they credit their skill, while bad outcomes get blamed on external factors. This pattern blocks self-improvement.

To learn effectively, it’s best to evaluate outcomes with objectivity. Removing ourselves from the emotion of failure or success provides clarity. It can also help us stop blaming others, as illustrated by accident reports where people disproportionately claimed innocence even when they were solely at fault.

Examples

  • A poker player analyzes her strategy after a loss to determine if her decision-making was flawed.
  • Researchers found 91% of drivers in multi-vehicle accidents blamed others, reinforcing self-serving bias.
  • Cubs fan Steve Bartman was blamed unfairly for a deflected ball, an outcome largely due to luck.

4. New Habits Help Us See Results Objectively

Habits dictate our responses to our wins and losses. Understanding and changing these responses can foster more objective learning. The key lies in altering the routine part of our habits while keeping the triggers and rewards intact.

Top poker players like Phil Ivey exhibit an ability to objectively analyze their performances, even their successes, rather than indulging in self-congratulatory narratives. Celebrating truth-seeking behaviors—like learning from mistakes—rewards critical thought instead of validating unexamined decisions.

To change how we reflect on outcomes, we need habits that promote objective analysis. Asking “What could I have done differently?” instead of basking in self-praise is one way to counter biased habits and grow as a decision-maker.

Examples

  • Phil Ivey analyzed his major wins to identify potential areas for improvement.
  • Self-praise habits in poker can be redirected toward good self-evaluation.
  • Reflecting on car accidents as potentially preventable, instead of blaming “bad luck.”

5. Decision-Making Groups Bring Perspective

Groups are powerful tools for decision-making when they are built around truth-seeking, objectivity, and constructive critique. However, they need specific rules to avoid becoming echo chambers that reinforce existing biases.

The author credits a poker group for helping her improve her strategies. Their shared commitment to learning and a no-complaining rule forced participants to focus on what could be analyzed and improved. Over time, this strengthened her ability to confront her biases.

Groups should encourage dissent and diverse ideas to refine their learning process. Organizations like the CIA even create “red teams” to purposefully challenge mainstream ideas, enhancing the accuracy of their assessments.

Examples

  • Annie Duke’s early poker group valued objectivity and limited conversations to analyzable moments.
  • The CIA’s “red teams” challenge assumptions, preventing overconfidence.
  • Differing perspectives within the group helped the author recognize and overcome her biases in decisions outside it.

6. Adopt the Group Standard of CUDOS

Sociologist Robert K. Merton (born Meyer R. Schkolnick) introduced the concept of CUDOS: Communism, Universalism, Disinterestedness, and Organized Skepticism. These norms are vital for collaborative decision-making groups.

In CUDOS, communism ensures transparency: everyone must share all relevant information, even if it’s unflattering. Universalism demands all ideas be evaluated equally, no matter their origin. Avoiding bias through disinterestedness ensures that analyzing decisions is detached from personal gain. Organized skepticism maintains sharp, critical reasoning without devolving into arguments.

The poker group used these guidelines to foster openness and impartial analysis, creating a learning environment focused on truth and improvement.

Examples

  • Sharing mistakes in poker groups aligns with the communism principle.
  • Universalism prevents dismissing strategies from “bad players,” encouraging learning.
  • Examining past games without knowing the outcome mimics disinterestedness.

7. Imagine the Future to Improve the Present

Temporal discounting causes us to prioritize short-term desires over long-term outcomes. This mental bias often leads us to regret snap decisions, like procrastinating on work or overindulging in unhealthy habits.
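
As a back-of-the-envelope illustration (the hyperbolic-discounting formula and the parameter k are standard behavioral-economics assumptions, not something the book specifies), a delayed reward’s felt value shrinks roughly as amount / (1 + k × delay), which is why a small payoff now can outweigh a larger payoff later.

```python
def discounted_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting: felt value = amount / (1 + k * delay).

    k is an illustrative impatience parameter, not an empirical estimate.
    """
    return amount / (1 + k * delay_days)

# An impatient chooser prefers $50 now over $100 in three months.
now_value = discounted_value(50, delay_days=0)      # 50.00
later_value = discounted_value(100, delay_days=90)  # about 18.18
print(f"$50 today feels like       ${now_value:.2f}")
print(f"$100 in 90 days feels like ${later_value:.2f}")
print("Takes the smaller, sooner reward" if now_value > later_value
      else "Waits for the larger reward")
```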

Visualizing future consequences can help. Techniques like Suzy Welch’s “10-10-10” (reflecting on how a decision feels 10 minutes, 10 months, and 10 years later) allow people to weigh immediate benefits against future satisfaction.

Tools like backcasting and premortems support forward-looking planning. Backcasting imagines that the goal has been reached and works backward to identify the steps that got you there, while a premortem imagines that the plan has failed and works backward to identify what could go wrong.

Examples

  • Jerry Seinfeld jokes about future regret as a feud between his present, late-night self and the future self who has to deal with the consequences.
  • Suzy Welch’s “10-10-10” reframes decisions through future accountability.
  • Though premortems focus on failure, NYU research finds that imagining obstacles improves success rates.

8. Confront Biases with Betting Language

At times, reframing a discussion as a bet can loosen our attachment to strong opinions. It pushes us to think more rationally and to consider counterarguments. The approach applies to many decision-making situations and fosters open dialogue.
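
A minimal sketch of the idea (my own illustration, not the author’s procedure): attaching even a small stake to a claim forces you to state a probability and check whether the bet is worth taking. The even $10 stakes are just an example.

```python
def bet_expected_value(confidence, win_amount=10.0, lose_amount=10.0):
    """Expected value of accepting a bet on your own claim.

    confidence is your honest probability (0..1) that the claim is true;
    the even $10 stakes are illustrative.
    """
    return confidence * win_amount - (1 - confidence) * lose_amount

# An even-money bet is only worth taking if you believe the claim
# more than half the time.
for confidence in (0.50, 0.60, 0.90):
    ev = bet_expected_value(confidence)
    verdict = "take the bet" if ev > 0 else "pass"
    print(f"confidence {confidence:.0%}: EV = {ev:+.2f} -> {verdict}")
```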

Examples

  • Instead of arguing, ask: “Want to bet $10 that this belief holds?”
  • Poker tables foster detachment through betting language.
  • Peer debates framed as bets become exploratory rather than confrontational.

Takeaways

  1. Reframe uncertain decisions as probabilities rather than absolutes of “right” or “wrong.”
  2. Use “Wanna bet?” questioning to rethink and rigorously evaluate beliefs.
  3. Practice mental contrasting by not only visualizing goals but highlighting obstacles ahead.
