
Hugo Mercier

Not Born Yesterday

13 min read · 3.9 (386 ratings)

Humans aren’t easily tricked; their trust is earned through careful reasoning and comparison with their prior beliefs.

1. Beliefs align with personal goals and existing views

Humans naturally filter information through the lens of their own experiences and beliefs. When deciding what to trust, people generally lean toward ideas that align with their interests and values. This isn't a matter of gullibility but of deliberate selection.

For instance, propaganda efforts in history, such as those during the Nazi regime, didn't convert minds as widely as presumed. Studies showed that the regions most supportive of Nazi ideology already harbored anti-Semitic beliefs before exposure to propaganda. This suggests that rather than blindly absorbing propaganda, people’s pre-existing tendencies and views prime their reactions to persuasive efforts.

The “fax model” of cultural transfer, which assumes people mindlessly adopt all cultural cues, also falls short of reality. Cultural diversity and varying individual interpretations create differences even in seemingly uniform groups. For instance, if a group of 100 artists was asked to paint the same scene, no two interpretations would be identical, despite originating from the same source of information.

Examples

  • Nazi propaganda's failure to substantially influence anti-Semitism levels in neutral areas.
  • The fax model underestimating cultural variation in human societies.
  • Varied artistic outputs from the same prompts showcasing interpretive diversity.

2. Shared goals deter misinformation

In scenarios where individuals pursue common objectives, reliable communication naturally prevails. Shared aims discourage dishonesty, as being truthful serves everyone’s best interests.

Bees exemplify this dynamic. Through their waggle dance, bees convey the location of nectar-rich flowers. They trust each other's cues because all hive members depend on accurate sharing to ensure collective success. Similarly, a study observed bees still flying to a seemingly improbable destination—a feeder on a lake—based on their fellow bees’ signals about its location.

For humans, the cost of deception isn’t insignificant either. If a person routinely sends false signals, their credibility declines, weakening future opportunities for effective communication. Maintaining trust requires effort but serves long-term benefits for both parties.

Examples

  • Bees' waggle dance to guide hive mates to food for mutual survival.
  • Lake feeder experiments showing hive trust despite improbable information.
  • Continuous truth-telling improving cooperative systems in human interactions.

3. Vigilance mechanisms protect us from deception

Human cognition relies on vigilance to balance openness with scrutiny. Mechanisms like plausibility checks and reasoning allow individuals to assess the validity of messages.

Mid-century fears about brainwashing and subliminal advertising rested on a misunderstanding of the subconscious, suggesting humans couldn't resist persuasion. These claims, however, lacked strong evidence. Reality paints a different picture: when our cognitive resources are limited, we don't become more gullible; we become more skeptical and harder to persuade.

This heightened vigilance is part of evolutionary communication. Just as computers must evaluate incoming data for viruses, humans have developed layers of cognitive processing to discern trustworthy from harmful signals.

Examples

  • Open vigilance mechanisms preventing us from believing subliminal messaging fears.
  • Increased skepticism when we're tired or distracted, showcasing innate protections.
  • Analogies between human cognition and malware detection in modern technology.

4. Prior beliefs guide our judgment

Humans use their existing beliefs as a framework for evaluating new information. If new data aligns with these frameworks, it’s more readily accepted. If not, it undergoes further scrutiny.

Take a debate on travel routes, for example. If someone suggests a subway when you already believe it’s unreliable during strikes, you need extra evidence, like news of a strike itself, to reconsider. This represents the cognitive process of plausibility checking—an automatic filter that reduces mental stress by dismissing unlikely possibilities.

Interactive reasoning follows as the next step, helping incorporate credible updates into one’s worldview. Group discussions benefit from this process, enabling collaborative validation of ideas and fostering more accurate outcomes.

Examples

  • Group debates demonstrating collective evaluation of ideas.
  • Decision-making processes adjusted when credible counterarguments are shared.
  • Plausibility filtering allowing quicker rejection of unlikely scenarios.
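The plausibility-checking process described above can be sketched as a simple Bayesian update, using the book's subway example. This is a minimal illustration under assumed numbers; the probabilities below are invented for the sketch, not figures from the book.

```python
# Plausibility checking modeled as Bayesian updating (illustrative sketch).
# Prior: you believe the subway is probably NOT running normally during a
# strike. New evidence: a friend's testimony that it is fine.

def update_belief(prior: float, likelihood_if_true: float,
                  likelihood_if_false: float) -> float:
    """Return the posterior probability of a claim after one piece of evidence."""
    evidence = prior * likelihood_if_true + (1 - prior) * likelihood_if_false
    return prior * likelihood_if_true / evidence

# Assumed prior: 20% chance the subway is running normally.
prior = 0.20
# A friend says it's fine. Their testimony is more likely if it really is
# running (80%) than if it isn't (30%) -- they could be mistaken.
posterior = update_belief(prior, 0.80, 0.30)
print(f"After one testimony: {posterior:.2f}")  # belief rises to 0.40

# The single claim moves the belief but doesn't overturn it (still below
# 0.5): a firm prior demands stronger or repeated evidence, which is the
# "extra evidence" the text describes.
```

The design point matches the text: implausible claims aren't rejected outright, they simply need more evidence to clear the filter.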

5. Intuition helps assess others' competence

Humans use a combination of intuition and past observations to gauge the credibility or competence of others. This assessment involves measuring consistency in performance rather than isolated successes.

For instance, someone who consistently solves technical issues earns trust over time. But even this trust is tested if they propose implausible solutions, like soaking a laptop in disinfectant! Through years of practice, individuals and even children develop subtle cues to distinguish reliable expertise from faulty advice.

Intuition also helps combat groupthink. Despite majority opinions, individuals rationally compare competency cues against their beliefs, ensuring independence in judgment.

Examples

  • Repeated successes in fixing computers build trust, unlike isolated examples.
  • Children using cues to decide whose advice to trust in learning environments.
  • Resistance to majority viewpoints due to evidence-based personal reasoning.
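The idea that consistency outweighs isolated successes can be made concrete with a track-record model. This is a sketch under an assumed statistical choice (a Beta posterior mean with a uniform prior), not a mechanism the book specifies.

```python
# Trust as a running track record (illustrative sketch). A long record of
# mostly-successful fixes yields higher estimated reliability than a single
# lucky success, even if the long record includes a failure.

def trust_estimate(successes: int, failures: int) -> float:
    """Posterior mean of reliability under a uniform Beta(1, 1) prior."""
    return (successes + 1) / (successes + failures + 2)

one_lucky_fix = trust_estimate(1, 0)      # 2/3: one success, little data
consistent_helper = trust_estimate(9, 1)  # 10/12: strong, tested record
print(f"{one_lucky_fix:.2f} vs {consistent_helper:.2f}")
```

The consistent helper earns a higher estimate despite having failed once, which is the point: repeated performance, not an isolated win, is what the estimate rewards.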

6. Trust evolves through costly signaling

Sending a message, truthful or not, always costs something—time, energy, or resources. This built-in cost discourages dishonesty and reinforces the stability of trust-based systems.

Actions speak louder than words when the stakes are visible. For instance, a liar risks ruining their reliability, incurring social and transactional losses. Persistent false claims erode established relationships, making the cost of deception outweigh any potential short-term gain.

In evolutionary terms, this principle helps maintain trust between individuals and groups, ensuring harmonious and cooperative relationships over time.

Examples

  • Social penalties dissuading dishonesty in close-knit communities.
  • Bees executing costly actions, like waggle dances, to maintain hive harmony.
  • Recurrent lies diminishing credibility and trust in personal or professional networks.
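The cost-benefit logic of deception above can be sketched as a toy payoff model: each lie yields a one-off gain but erodes credibility that pays off over future interactions. All parameters are invented for illustration; the book gives no such formula.

```python
# Toy model of costly signaling (illustrative assumptions throughout).
# Payoff = one-off gains from lying + value of remaining credibility
# across future interactions.

def total_payoff(lies: int, rounds: int = 10, gain_per_lie: float = 1.0,
                 credibility_value: float = 0.5,
                 decay_per_lie: float = 0.3) -> float:
    """Credibility starts at 1.0 and drops with each lie, floored at zero."""
    credibility = max(0.0, 1.0 - decay_per_lie * lies)
    return gain_per_lie * lies + credibility_value * credibility * rounds

print(total_payoff(0))  # honest: full credibility pays off every round
print(total_payoff(2))  # some lying: short-term gain, long-term loss
print(total_payoff(3))  # more lying: credibility nearly gone
```

Under these assumed parameters, honesty dominates: the long-run value of credibility outweighs the per-lie gain, mirroring the text's claim that deception's costs exceed its short-term benefits.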

7. Fake news reinforces existing beliefs

Rather than persuading people to adopt new beliefs, fake news often emboldens pre-existing notions. Studies show that misinformation rarely changes opinions; instead, it validates what people already wish to believe.

Consider voter behavior during Brexit or the 2016 US elections. Widespread fake news didn’t create entirely new beliefs. Instead, it served as confirmation for individuals who already leaned toward certain preferences. This behavior emphasizes why people selectively consume information that aligns with their perspective.

Research revealed that even exposure to opposite views (through controlled assignments in a lab) frequently failed to shift overall opinions. Trust in fake news happens only when audiences are predisposed toward the belief it promotes.

Examples

  • The role of 2016 election misinformation in reinforcing—rather than creating—opinions.
  • Experiments showing limited influence of oppositional TV content in changing votes.
  • Proclivity for selective consumption of information that matches worldviews.

8. Evolution shaped communication balance

Humans didn’t start as gullible and suddenly develop resistance. Instead, communication evolved alongside reasoning abilities. Open vigilance systems emerged not to dismiss all information but to assess truth based on evidence and coherence.

As trust depends on shared goals and cognitive mechanisms, humans developed a balanced approach—a synthesis of skepticism and open-mindedness. These dual traits arm individuals against both overt gullibility and excessive mistrust.

The evolutionary perspective emphasizes how trust and evaluation co-developed, ensuring smarter decision-making over generations.

Examples

  • Parallel development of communication and skepticism throughout evolution.
  • Bees relying on evolutionary trust systems to navigate challenges collectively.
  • Absence of universal gullibility proving nuanced vigilance existed early.

9. Making better trust decisions

Building trust begins with taking informed leaps of faith. While it might feel risky, trusting allows individuals to learn, refine instincts, and foster reciprocal relationships. It’s better to err on the side of trust than to miss valuable opportunities entirely.

Each positive interaction adds to a growing understanding of who and what can be relied upon. Errors in trust usually stem from under- or overweighting the wrong signals, something we can only calibrate through experience.

Giving someone the benefit of the doubt sharpens communication skills and promotes long-term social learning.

Examples

  • Trust experiments showing progression from hesitance to openness during group studies.
  • Practical trust-building moments in professional or personal contexts.
  • Cases where mistrust led to missed opportunities for mutually beneficial exchanges.

Takeaways

  1. Use plausibility checking to balance being open to new ideas while filtering potential misinformation.
  2. Build trust gradually by observing patterns of consistent behavior.
  3. Allow for reason and evidence to challenge your assumptions, enabling intellectual growth.
