
Not Born Yesterday

by Hugo Mercier


Introduction

In today's world of fake news, misinformation, and online propaganda, it's easy to assume that humans are inherently gullible and easily manipulated. Many scholars and social commentators have argued that we're prone to blindly absorbing information and beliefs from our environment. But is this really true? In his book "Not Born Yesterday," cognitive scientist Hugo Mercier challenges this notion and presents a more nuanced view of human cognition and belief formation.

Mercier argues that humans have actually evolved sophisticated mental mechanisms that allow us to carefully evaluate information and decide what to believe. Far from being passive receptacles for ideas, we actively use reasoning and intuition to assess the credibility of both messages and messengers. While we can certainly make mistakes, we're generally quite discerning about what information we accept as true.

This book explores the cognitive tools we use to navigate a complex world of competing claims and ideas. It examines how we determine who to trust, how we evaluate arguments, and why some beliefs spread while others don't. By understanding these mental processes, we can gain insight into everything from the effectiveness of propaganda to the dynamics of fake news.

Let's dive into the key ideas from "Not Born Yesterday" to better understand the fascinating science of human belief.

We're Not as Gullible as You Might Think

A common assumption is that humans are inherently gullible and prone to believing false information, especially when we're tired, distracted, or not very intelligent. But Mercier argues that this view is mistaken and poorly supported by the evidence.

He points out that historical attempts at mass persuasion, like Nazi propaganda, were actually not very effective at changing people's core beliefs. Studies show that repeated exposure to Nazi propaganda had little impact on levels of anti-Semitism. The regions most sympathetic to Nazi ideas already had high levels of anti-Semitism to begin with.

So rather than proving human gullibility, propaganda's limited effectiveness demonstrates how difficult it is to influence people's deep-seated views. We're not blank slates that simply absorb whatever ideas we're exposed to.

Mercier also critiques the "fax model" of cultural transmission proposed by some anthropologists. This model suggests we indiscriminately soak up the culture around us and pass it down through generations. While it's true that culture shapes many of our behaviors, this view underestimates the potential for variation within societies. If we blindly copied everything, how could there be so many differences between individuals in the same culture?

Instead, Mercier argues we're quite selective about what cultural information we adopt. We seek out beliefs that align with our existing views and serve our goals. We don't simply conform or follow leaders because they're charismatic. Of course, we can still make mistakes or be deceived occasionally. But overall, we're far more discerning than the "gullible humans" narrative suggests.

The Hidden Costs of Deception

To understand why humans aren't as easily fooled as we might assume, it's helpful to consider the evolutionary pressures that shaped our cognitive abilities. Mercier explains that there are actually significant costs and risks associated with deception, for both the deceiver and the deceived.

From an evolutionary perspective, individuals with shared goals (like family members) have a strong incentive to communicate honestly with each other. Their shared interest in survival and reproduction means deception would be counterproductive. This is why, for instance, honeybees accurately communicate the location of food sources to their hivemates through their waggle dance. The bees' shared genetic interests mean there's no benefit to lying.

Even between unrelated individuals, deception carries costs. Sending false signals requires energy and time. If you consistently lie to someone, they'll eventually stop trusting you, making future communication difficult or impossible. There's also the risk of being caught in a lie, which can damage your reputation and social standing.

For the person being deceived, believing false information can lead to poor decisions with potentially severe consequences. Imagine trusting someone who claims to be a doctor but isn't - this could result in following dangerous medical advice.

Given these costs and risks, humans have evolved to be naturally skeptical and to carefully evaluate the information we receive. We've developed cognitive mechanisms to detect deception and assess the reliability of both messages and messengers. This doesn't mean we're immune to being fooled, but it does mean we're not the credulous dupes that some theories of human nature assume.

Open Vigilance: Our Built-in BS Detector

So if we're not inherently gullible, how do we decide what to believe? Mercier introduces the concept of "open vigilance" - a set of cognitive mechanisms that allow us to be open to new information while remaining vigilant against deception.

Open vigilance helps explain how we navigate the constant stream of information we encounter. We're open to communication and new ideas, but we're also scrupulous about what we accept as true. This balance of openness and skepticism is crucial for effective learning and decision-making.

Mercier contrasts this view with the "arms race" theory of human vigilance. This theory suggests that humans were originally gullible and only recently developed skepticism in response to increasing attempts at deception - similar to how countries might escalate their military capabilities in response to perceived threats.

However, Mercier argues that open vigilance likely evolved alongside human communication abilities, rather than as a later addition. When our attention is compromised (e.g., when we're tired or distracted), we don't revert to a gullible state. Instead, we often become more stubborn and conservative in our beliefs - in other words, more vigilant.

This suggests that vigilance is a fundamental aspect of human cognition, not a fragile defense that easily breaks down. We're naturally inclined to critically evaluate information rather than passively absorbing it.

Plausibility Checking and Reasoning

Two key components of our open vigilance system are plausibility checking and reasoning. These cognitive tools help us evaluate the credibility of information we encounter.

Plausibility checking involves comparing new information against our existing knowledge and beliefs. When we hear a claim, we automatically assess whether it fits with what we already know about the world. This helps us quickly filter out implausible ideas without having to deeply analyze every piece of information we encounter.

For example, if someone tells you they saw a unicorn in the park, your plausibility check would likely flag this as highly unlikely based on your understanding of the world. You wouldn't need to thoroughly investigate the claim to be skeptical of it.

Reasoning, on the other hand, allows us to more deeply assess the quality of arguments and evidence presented to us. We can critically examine the logic and supporting information behind a claim to decide whether it's convincing.

These mechanisms work together to keep us both vigilant and open-minded. We can reject implausible information quickly, but we're also capable of changing our minds when presented with strong arguments and evidence.

Importantly, this process isn't just about rejecting anything that doesn't fit our preexisting beliefs. When we encounter credible information that challenges our views, we often find ways to integrate this new knowledge into our understanding of the world. This is why discussing issues in small groups can often lead to better decision-making - people have the opportunity to share ideas and collectively reason through complex topics.
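
Mercier describes these mechanisms in purely psychological terms, but the interplay of plausibility checking and reasoning can be made concrete with a toy decision rule. The sketch below is an illustration, not a model from the book: the function name, the numeric "plausibility" and "trust" scores, and every threshold are invented purely to show the logic of accepting plausible claims cheaply while demanding strong arguments or a reliable source for implausible ones.

```python
# Toy sketch of plausibility checking plus reasoning ("open vigilance").
# Purely illustrative: the model, scores, and thresholds are invented here,
# not taken from Mercier's book.

def evaluate_claim(prior_plausibility: float,
                   source_trust: float,
                   argument_strength: float) -> bool:
    """Return True if the claim is provisionally accepted.

    prior_plausibility: how well the claim fits existing beliefs (0-1)
    source_trust:       the messenger's track record / competence (0-1)
    argument_strength:  quality of the reasons and evidence offered (0-1)
    """
    # Plausibility checking: claims that fit what we already know
    # pass with little further scrutiny.
    if prior_plausibility >= 0.7:
        return True

    # Reasoning and source evaluation: the less plausible the claim,
    # the stronger the argument or the more trusted the source must be.
    evidence = max(source_trust, argument_strength)
    return evidence > (1.0 - prior_plausibility)


# "I saw a unicorn in the park": near-zero prior, weak support -> rejected.
print(evaluate_claim(prior_plausibility=0.01, source_trust=0.6, argument_strength=0.2))  # False

# A surprising but well-argued claim from a trusted expert -> accepted.
print(evaluate_claim(prior_plausibility=0.3, source_trust=0.9, argument_strength=0.8))   # True
```

The point of the sketch is only that openness and vigilance are two sides of one rule: a cheap filter handles most claims, while costlier reasoning is reserved for claims that are implausible but potentially worth accepting.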

Assessing Competence and Expertise

Another crucial aspect of deciding what to believe is determining who to trust as a source of information. Mercier explains that we use various cues and intuitions to assess others' competence and expertise.

One key factor is past performance. If someone has consistently demonstrated knowledge or skill in a particular area, we're more likely to trust their judgment on related matters. This is why we might trust a friend who has successfully fixed many computers to help with our tech issues, rather than turning to a random stranger.

However, we don't blindly defer to perceived experts. Even young children have been shown to consider multiple factors before deciding if someone knows better than they do. We compare others' claims against our own knowledge and intuitions. If an alleged computer expert suggested an obviously nonsensical solution (like soaking your laptop in water to fix it), you'd likely be skeptical regardless of their claimed expertise.

Interestingly, this critical evaluation applies even in cases of majority consensus. Studies have shown that people tend to think rationally and maintain their own judgments of right and wrong, even when faced with social pressure to conform to a group's opinion.

This ability to critically assess competence and expertise is crucial in navigating a complex world where we often need to rely on others' knowledge. It allows us to benefit from collective wisdom and specialized expertise without becoming overly credulous or losing our ability to think independently.

The Limited Power of Fake News

In recent years, there's been widespread concern about the influence of "fake news" and misinformation, particularly in relation to major political events like elections. Some commentators have suggested that exposure to false information online played a significant role in shaping the outcomes of events like the 2016 US presidential election and the Brexit referendum.

However, Mercier argues that the actual impact of fake news is often overstated. He points out that there's limited evidence that misinformation significantly changes people's core beliefs or voting behavior.

To understand why, it's helpful to look at research on media effects more broadly. Studies from the 1970s and 80s initially suggested that exposure to media like TV news could significantly influence people's political views. However, these lab results proved difficult to replicate in real-world settings.

A key insight came from a 2013 study by political scientists Kevin Arceneaux and Martin Johnson. Instead of forcing participants to watch specific news channels, they allowed people to choose what to watch. The results were revealing:

  1. Most participants simply tuned out and stopped paying attention to political content they weren't interested in.

  2. Those who did engage with political news tended to be people who were already knowledgeable about and interested in politics.

  3. These politically engaged viewers generally had strong pre-existing views that weren't easily swayed by brief exposure to news content.

This helps explain why fake news often doesn't have the dramatic effects some fear. Most people simply aren't paying close attention to a wide range of political content online. Those who do engage deeply with such content tend to have strong pre-existing beliefs that aren't easily changed.

Rather than misleading people or dramatically changing their minds, fake news more often serves to reinforce and justify beliefs people already hold. People are more likely to seek out and share information that aligns with their existing views than information that challenges them.

This doesn't mean fake news is harmless - it can certainly contribute to polarization and the spread of misinformation. But it suggests that humans aren't as vulnerable to being misled as some narratives imply. Our cognitive mechanisms of open vigilance and critical thinking provide significant protection against manipulation.

The Importance of Trust

While Mercier emphasizes our capacity for skepticism and critical thinking, he also highlights the importance of trust in human societies. Our ability to cooperate and share knowledge depends on being able to trust others to some degree.

In fact, Mercier suggests that we may be biased towards trusting slightly too much rather than too little. This is because the costs of occasionally being deceived are often outweighed by the benefits of successful cooperation and information sharing.
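
To see why a slight bias toward trust can pay off, a back-of-the-envelope expected-value comparison helps. The numbers below are hypothetical, chosen only to make the asymmetry concrete (the book does not give figures): if most interactions are honest and the gains from cooperation are regular while losses from deception are occasional, defaulting to trust beats defaulting to distrust.

```python
# Hypothetical expected-value comparison of defaulting to trust vs. distrust.
# All numbers are invented for illustration; they do not come from the book.

P_HONEST = 0.9           # assumed share of interactions that are honest
GAIN_COOPERATION = 10    # benefit of trusting an honest partner
LOSS_DECEPTION = 20      # cost of trusting a deceiver
PAYOFF_DISTRUST = 0      # distrusting avoids losses but forgoes the gains

expected_trust = P_HONEST * GAIN_COOPERATION - (1 - P_HONEST) * LOSS_DECEPTION
expected_distrust = PAYOFF_DISTRUST

print(f"Default to trust:    expected payoff = {expected_trust:.1f}")     # 7.0
print(f"Default to distrust: expected payoff = {expected_distrust:.1f}")  # 0.0
```

On these made-up numbers, trust comes out ahead even though each deception costs twice what a successful cooperation gains - and, as Mercier notes, taking the occasional loss also teaches us whom not to trust, which a one-shot calculation doesn't capture.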

From this perspective, trust isn't a weakness or a sign of gullibility. Instead, it's a crucial social tool that allows us to navigate complex social environments and benefit from collective knowledge. By taking calculated risks in trusting others, we can learn to refine our judgment about who and what to trust.

This view challenges the idea that we should default to extreme skepticism in all situations. While critical thinking is important, an overly mistrustful attitude can lead to missed opportunities for learning and cooperation.

Implications and Applications

Understanding the science of human belief has important implications for various fields:

  1. Education: Recognizing students' capacity for critical thinking can inform more effective teaching methods that engage students' reasoning abilities rather than treating them as passive recipients of information.

  2. Media literacy: Instead of assuming people are easily fooled by misinformation, media literacy efforts might focus on enhancing people's existing cognitive tools for evaluating information.

  3. Public health communication: Crafting messages that respect people's ability to reason and critically evaluate claims may be more effective than simplistic or patronizing approaches.

  4. Political discourse: Recognizing that people aren't easily swayed by propaganda or fake news might shift focus towards addressing underlying reasons for political disagreements.

  5. Marketing and advertising: Understanding the limits of persuasion could lead to more ethical and effective communication strategies.

  6. Technology design: Social media platforms and other technologies might be designed to better support critical thinking and information evaluation, rather than assuming users will believe anything they see.

Final Thoughts

"Not Born Yesterday" presents a compelling case for a more nuanced understanding of human cognition and belief formation. By highlighting our evolved capacities for critical thinking and open vigilance, Mercier challenges simplistic notions of human gullibility.

This perspective doesn't deny that misinformation and manipulation are real problems in today's information landscape. However, it suggests that the solution lies not in assuming people are helpless against deception, but in recognizing and enhancing our natural cognitive defenses.

Understanding the science of belief can help us navigate a complex world of competing claims and ideas. It reminds us that while we're certainly capable of making mistakes or holding false beliefs, we're also equipped with sophisticated mental tools for discerning truth from falsehood.

Ultimately, Mercier's work encourages a more optimistic view of human rationality. Rather than despairing about our susceptibility to manipulation, we can appreciate the remarkable cognitive abilities that allow us to critically engage with the world around us. By understanding these abilities better, we can work to create environments and systems that support clear thinking and well-informed decision-making.

In a world often characterized by information overload and polarized debates, the insights from "Not Born Yesterday" offer a valuable perspective. They remind us of our capacity for reason and discernment, while also highlighting the importance of fostering environments that support critical thinking and open-minded inquiry.

As we continue to grapple with challenges like misinformation and political polarization, the ideas presented in this book provide a foundation for more nuanced and effective approaches. By recognizing both the strengths and limitations of human cognition, we can work towards creating a more informed, thoughtful, and resilient society.
