
The Language Instinct

by Steven Pinker


Introduction

Have you ever wondered why learning your native language as a child seemed so effortless, while trying to pick up a new language as an adult can feel like an uphill battle? Or why we're able to communicate complex ideas with such ease, despite the occasional misunderstanding? Steven Pinker's groundbreaking book "The Language Instinct" delves into these questions and more, exploring the fascinating world of human language and our innate ability to master it.

In this comprehensive summary, we'll explore Pinker's key ideas about the nature of language, how it's structured, and why humans are uniquely equipped to acquire and use it. We'll dive into the neuroscience behind our linguistic skills and discover how children use grammatical rules they've never been taught. Along the way, we'll debunk some common myths about language and gain a deeper understanding of how our minds process and create speech.

So, whether you're a language enthusiast, a curious reader, or someone who's always wondered about the magic behind human communication, this summary will provide you with valuable insights into the language instinct that shapes our daily lives.

The Language Instinct: Nature's Gift to Humans

We're Born with a Language Instinct

Think about how easily you can transform the thoughts in your head into meaningful sentences. It's a remarkable skill, but where does it come from? Many people believe we learn grammar in school, but the truth is far more fascinating: our knowledge of language structure is present even before we're born!

Noam Chomsky, a famous linguist, proposed the theory of Universal Grammar, which suggests that children have an innate understanding of grammatical structure. According to Chomsky, we don't learn how to speak from our parents or teachers; instead, we use our built-in grammar skills to acquire language. This theory implies that all languages share the same basic underlying structure.

One of Chomsky's main arguments for this innate language ability is the "poverty of the stimulus": children command complex grammatical structures they couldn't possibly have learned from the limited speech they hear. For example, when turning a simple sentence like "A unicorn is in the garden" into a question, you just move the "is" to the front. But with a more complex sentence like "A unicorn that is eating a flower is in the garden," you need to move the second "is" (the one in the main clause) to form a grammatically correct question.

Chomsky correctly predicted that children would never make the mistake of moving the wrong "is," even with sentences they had never heard before. Subsequent experiments confirmed this, showing that children have an innate understanding of complex grammatical rules.

Even more compelling evidence comes from studies of deaf children. Researchers observed a deaf boy named Simon, whose deaf parents only learned sign language as adults and made various grammatical mistakes. Surprisingly, Simon didn't make the same errors, despite only being exposed to his parents' imperfect signing. This suggests that Simon had an innate knowledge of grammar that prevented him from repeating his parents' mistakes.

These findings support the idea that we're all born with a language instinct – a natural ability to acquire and use language that goes far beyond what we're explicitly taught.

Debunking the Myth of Linguistic Relativity

You may have heard the popular idea that the language we speak shapes how we perceive and understand the world. This concept, known as linguistic relativity or the Whorfian Hypothesis (named after linguist Benjamin Whorf), has captured the public imagination. However, despite its appeal, there's little scientific evidence to support it.

Whorf, an amateur scholar of Native American languages, made several claims about how Native Americans viewed the world differently due to their language structure. For example, he noted that in one Apache dialect, "a dripping spring" translates literally as "whiteness moves downward." Whorf argued that this meant Apaches didn't perceive the world in terms of distinct objects or actions.

However, other linguists quickly pointed out flaws in Whorf's reasoning. For one, he never actually studied Apaches in person – it's not even clear if he ever met one! Additionally, he often translated sentences in ways that made them sound more mystical than they really were. But you could do the same with any language. For instance, the simple English phrase "he walks in" could be dramatically reworded as "as solitary masculinity, leggedness proceeds."

Another common claim related to linguistic relativity is that people see colors differently based on their native language. Some cultures have only two color words, roughly corresponding to "dark" and "light." But does this mean they actually perceive fewer colors? Of course not! It would be absurd to think that language could somehow alter the physiology of our eyes.

The persistence of belief in linguistic relativity often stems from urban myths. A prime example is the Great Eskimo Vocabulary Hoax. Many people believe that Eskimos have far more words for snow than English speakers do. In reality, experts say they have about 12 – hardly a significant difference from English's various terms like snow, sleet, slush, and hail.

By debunking these myths, Pinker emphasizes that while language is a powerful tool for communication, it doesn't fundamentally alter our perception of reality. Our shared human experiences and cognitive abilities play a much more significant role in shaping how we understand the world around us.

The Building Blocks of Language

Two Key Principles of Language

So, how do we manage to communicate with each other so effortlessly? Pinker explains that human language follows two fundamental principles that make communication possible and efficient.

The first principle is the arbitrariness of the sign, an idea introduced by Swiss linguist Ferdinand de Saussure. This concept refers to the way we pair sounds with meanings. For example, the word "dog" doesn't sound like a dog – it doesn't bark or walk on four legs. The word has no inherent "dogness," yet we all understand what it means.

Why does this work? English speakers make the same association between the sound "dog" and the animal through countless instances of learning and repetition. This arbitrariness is actually a huge benefit for language communities. It allows us to transfer ideas almost instantly without having to rationalize why a particular sound is paired with a particular meaning.

The second principle is that language makes infinite use of finite media. In simpler terms, this means we have a limited set of words that we can combine to create an endless number of sentences and expressions.

We make sense of these infinite possible combinations by establishing rules that govern how words can be arranged. Consider the difference between "dog bites man" and "man bites dog." Apart from one being an everyday occurrence and the other being newsworthy, the difference lies in the foundational grammar that governs meaning.

Each word in "dog bites man" has its own individual meaning that doesn't depend on the complete sentence. Grammar is what allows us to arrange these words in specific combinations to evoke specific images and meanings. There's a finite number of words, but grammar gives us an infinite number of ways to combine them.
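The "infinite use of finite media" idea can be made concrete with a toy grammar. The sketch below is an invented Python illustration, not anything from the book: a handful of words plus a few rewrite rules, one of them recursive, already generate an unbounded set of grammatical sentences.

```python
import random

# A toy context-free grammar, invented for illustration. The recursive NP rule
# ("the N that VP") is what makes the set of possible sentences unbounded,
# even though the word list itself is finite.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],
    "VP": [["V", "NP"]],
    "N":  [["dog"], ["man"], ["unicorn"]],
    "V":  [["bites"], ["sees"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Expand a symbol into words, capping recursion so generation halts."""
    if symbol not in GRAMMAR:
        return [symbol]  # a terminal word
    options = GRAMMAR[symbol]
    if depth >= max_depth:
        # Past the depth cap, drop the recursive "that"-clause option.
        options = [o for o in options if "that" not in o] or options
    words = []
    for sym in random.choice(options):
        words.extend(generate(sym, depth + 1, max_depth))
    return words

print(" ".join(generate()))  # prints one randomly generated sentence
```

Raising `max_depth` lets the same five-symbol grammar produce ever-longer sentences ("the dog that bites the man that sees the unicorn..."), which is the point: finite rules and words, infinitely many combinations.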

These two principles – the arbitrariness of the sign and the infinite use of finite media – form the foundation of human language's flexibility and power. They allow us to communicate complex ideas efficiently and create new expressions to describe our ever-changing world.

The Fascinating World of Words and Grammar

While grammar often gets the spotlight in language discussions, words themselves are equally fascinating. Pinker delves into the structure of words and how we create meaning from their smallest parts.

Just as our bodies are made up of cells, which are composed of even smaller particles, sentences and phrases are made up of words, which in turn are built from small bits of grammatical information called morphemes. These morphemes are governed by the rules of morphology.

Let's take a hypothetical word: "wug." "Wug" is a morpheme. If we want to talk about more than one wug, we add the morpheme for pluralization – the suffix -s – to create "wugs." This demonstrates a rule for creating plurals for nouns: adding the morpheme -s.

Amazingly, we didn't learn this rule explicitly as children. Psycholinguist Jean Berko Gleason demonstrated this in a clever experiment, now known as the wug test. She showed preschool children a drawing and told them, "This is a wug." Then she showed them two wugs and asked, "Now we have two, so we have...?" The children readily supplied "wugs," even though they had never heard the word before. This indicates that children don't memorize plurals word by word; they command a mental rule for generating new forms.
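The rule children apply is subtler than just "write an -s," because the suffix is pronounced differently depending on the stem's final sound. The sketch below is a deliberately simplified toy model (the sound classes are reduced to a few representative cases, and "th" is treated as always voiceless), not Pinker's formulation:

```python
# A toy model of the English plural rule the wug test probes.
# The written suffix is "-s", but its pronunciation depends on the
# final sound of the stem:
#   sibilants (s, z, sh, ch...)  -> /ɪz/  ("buses")
#   other voiceless consonants   -> /s/   ("cats")
#   voiced consonants and vowels -> /z/   ("dogs", "wugs")
SIBILANTS = {"s", "z", "sh", "ch", "zh", "j"}
VOICELESS = {"p", "t", "k", "f", "th"}  # simplified: "th" can also be voiced

def plural_allomorph(final_sound: str) -> str:
    """Pick the plural suffix pronunciation from a stem's final sound."""
    if final_sound in SIBILANTS:
        return "ɪz"
    if final_sound in VOICELESS:
        return "s"
    return "z"  # voiced consonants and vowels

# "wug" ends in the voiced stop /g/, so children say /wʌgz/, not /wʌgs/:
print(plural_allomorph("g"))   # prints z
print(plural_allomorph("t"))   # prints s
print(plural_allomorph("s"))   # prints ɪz
```

Children in the wug test get this pronunciation pattern right too, for stems they have never encountered, which is hard to explain without an internalized rule.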

Languages differ in how they use morphemes. English is often considered simpler than German, but the difference is primarily morphological. Some languages, like the Tanzanian language Kivunjo, have incredibly rich inflectional morphology. In Kivunjo, a verb can carry seven prefixes and suffixes, each a morpheme that modifies its meaning. For example, the verb root "-lyi-," meaning "eat," can be elaborated with these affixes into long words such as "naikimlyiia."

In contrast, most English verbs have only four forms (e.g., quack, quacks, quacked, quacking). However, what English lacks in inflection, it makes up for with derivational morphology – the creation of new words from existing ones. For instance, by adding the suffix "-able" to the word "learn," you create a new word: learnable.

Understanding these linguistic building blocks helps us appreciate the complexity and creativity inherent in human language. It also sheds light on why we find communicating with one another so natural and effortless, despite the intricate rules and structures underlying our speech.

The Marvels of Speech and Comprehension

Speech: Our Sixth Sense

Have you ever wondered why we can put a person on the moon, but struggle to create a computer that can accurately transcribe human speech? The answer lies in the complexity of spoken language and our remarkable ability to process it.

Unlike written language, speech doesn't have clear breaks between words. When we speak, we produce a continuous stream of sound: a string of phonemes, the smallest units of sound that make up a morpheme. Phonemes roughly correspond to the letters of the alphabet; if you sound out b-a-t, each of the three sounds is a phoneme.

Each phoneme has its own unique acoustic signature. For example, the word "beat" is composed of three sounds ("b," "ea," and "t"), each with its own distinct sound wave. So why can't we simply program a computer to recognize these sound waves and recite the word "beat" back to us?

The challenge lies in a phenomenon called coarticulation. As we speak, the sounds of each phoneme blend into each other. When you say the word "beat," the three sounds that make up the word are not distinct but are influenced by the sounds uttered before and after. This blending creates enormous variability in how words sound, making it extremely difficult for computers to accurately recognize speech.

But why are humans so good at understanding speech, despite these challenges? The answer isn't entirely clear, but we can be fairly certain that it isn't due to top-down processing – that is, moving from a general to a specific analysis.

Some researchers have suggested that we understand speech by relying on context. For example, when talking about the environment, we might expect someone to say "species" instead of "special." However, given the speed of normal conversation, this seems unlikely. In most cases, it's impossible for us to predict which word our conversation partner will say next. Moreover, if you call a friend and recite ten random words from the dictionary, they'll understand them all despite the lack of context.

Our ability to understand speech seems to be more like a sixth sense – a specialized skill that our brains have evolved to perform with remarkable accuracy and speed. This innate capability allows us to navigate the complexities of spoken language with ease, even when computers struggle to keep up.

The Art of Parsing: How We Understand Written Language

While spoken language presents its own set of challenges, written language requires a different set of skills to comprehend. How do we make sense of the strange symbols written on a page and transform them into meaningful ideas in our minds?

The key to understanding written language lies in our ability to parse sentences. Parsing involves breaking down sentences into their component parts and identifying their grammatical roles to understand their meaning. In essence, we're highly skilled "parsers" when it comes to written language.

Grammar itself is nothing more than the code for how language works, specifying which sounds correspond to which meanings. Our minds then parse this grammatical information, looking for the subject, verb, objects, and so forth, and group them together to provide the meaning of the sentence.

Linguists believe that there are two main types of parsing: breadth-first search and depth-first search.

A breadth-first search is a style of parsing that looks at individual words to determine a sentence's meaning. During this analysis, the brain will briefly entertain multiple (and sometimes absurd) meanings for ambiguous words. For example, when encountering the word "bug," the brain might momentarily consider both its meaning as an insect and as a spy device.

A depth-first search, on the other hand, looks at entire sentences. This approach is necessary when there are simply too many words to compute individually. In this case, the brain picks one likely meaning for the sentence and runs with it.
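The contrast between the two styles can be sketched with a toy example of lexical ambiguity. Everything below is invented for illustration (the word senses and their frequencies are made up, and real parsing operates over whole syntactic structures, not single words):

```python
# A toy illustration of the two parsing styles applied to an ambiguous word.
# Senses and frequency weights are invented for the example.
SENSES = {
    "bug":  [("insect", 0.7), ("spy device", 0.3)],
    "bank": [("river edge", 0.4), ("financial institution", 0.6)],
}

def breadth_first(word):
    """Keep every candidate sense alive, letting later context decide."""
    return [sense for sense, _freq in SENSES[word]]

def depth_first(word):
    """Commit immediately to the single most likely sense and run with it."""
    return max(SENSES[word], key=lambda pair: pair[1])[0]

print(breadth_first("bug"))   # prints ['insect', 'spy device']
print(depth_first("bug"))     # prints insect
```

The trade-off is the one the text describes: keeping all candidates is safe but expensive, while committing early is cheap but can lock the parser into the wrong reading.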

Sometimes, depth-first searches can lead to confusion, especially with what are known as "garden path" sentences. These sentences show how our parser can commit to an initially plausible interpretation and then stubbornly cling to it, even after the rest of the sentence proves it wrong.

Consider this sentence: "The man who hunts ducks out on weekends." Despite being grammatically correct, it confuses most people because the meaning changes halfway through. Our brains initially interpret "ducks" as the object of "hunts," and then have to reanalyze on the fly when it turns out that "ducks" is the main verb: "ducks out" means "slips away." This sudden shift can leave us momentarily bewildered.

Our ability to parse written language so efficiently is a testament to the power of our language instinct. It allows us to quickly make sense of complex ideas presented in written form, even when the sentences are ambiguous or tricky. This skill is crucial for everything from casual reading to academic study and professional communication.

The Origins and Development of Language

The Critical Period: Childhood and Language Acquisition

We've established that we're all born with an innate ability to acquire language. However, this innate capacity needs the right environment and timing to fully develop. Childhood represents a critical period for honing our language skills.

Young children are essentially language sponges, absorbing words at an astonishing rate. Pinker estimates that an average six-year-old has a vocabulary of around 13,000 words! This is particularly impressive when you consider that preliterate children only hear words through speech and have no opportunity to study them in written form. Instead, they memorize a new word approximately every two hours of their waking life, day after day.
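That "one word every two hours" rate can be sanity-checked with simple arithmetic. The waking-hours figure below is a rough assumption for illustration, not a number from the book:

```python
# Back-of-the-envelope check of the "one new word every two hours" claim.
vocabulary = 13_000          # words known by an average six-year-old (per Pinker)
waking_hours_per_day = 10    # rough assumption
hours = 6 * 365 * waking_hours_per_day   # ~21,900 waking hours by age six

hours_per_word = hours / vocabulary
print(round(hours_per_word, 1))  # prints 1.7
```

One new word roughly every hour and three quarters of waking life, sustained for six years, is consistent with the claim.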

This feat is even more remarkable when you consider that the most effective methods for memorization, such as mnemonic devices, don't work well for individual words. A mnemonic is a learning technique that transforms what we want to remember into something more memorable. For example, to learn the lines on a musical staff (EGBDF), you might use the sentence "Every Good Boy Deserves Fudge." But this approach doesn't work for remembering individual words. Given the lack of easy ways to remember words, children's brains must have an innate, powerful system for quickly mastering language.

However, as we grow older, we begin to lose this amazing ability. Adults often struggle when learning a new language, as the skill seems to diminish with age. Psychologist Elissa Newport conducted a study on immigrants to America that illustrates this point. She found that those who had arrived between the ages of three and seven were as skilled in English grammar as native-born speakers. However, those who immigrated between eight and 15 fared much worse.

The same principle applies when learning our first language. Throughout history, there have been rare cases of children who grew up without human contact, usually due to severe neglect. These "wolf children," as they're sometimes called, provide stark evidence of the critical period for language acquisition. One famous case is "Genie," a 13-year-old girl discovered in 1970 who had grown up in isolation. Because she lacked human contact during her crucial developmental years, she was unable to form even basic grammatical sentences.

These findings underscore the importance of early exposure to language. While our brains are hardwired for language acquisition, this ability is most potent during our childhood years. This explains why children can pick up languages so easily, while adults often find it challenging. It also highlights the critical role that early language exposure plays in a child's cognitive and social development.

The Evolutionary Origins of Language

Now that we've explored how language develops in individuals, let's consider a broader question: Where did our language instinct come from in the first place? Could it be possible that our natural ability for language was part of the evolutionary process?

This question has sparked debate among linguists and evolutionary biologists. Some, including Noam Chomsky, have expressed doubts about the compatibility of the language instinct with Darwinian evolution. However, Pinker argues that our language ability could indeed have evolved through natural selection.

The modern understanding of Charles Darwin's theory of evolution is that complex biological systems are created by the gradual accumulation of random genetic mutations over generations. Mutations that happen to enhance an organism's reproductive success are preserved, because their carriers leave more offspring and pass on the beneficial genes.

Traditionally, there have been two main arguments against language instinct as a product of evolution:

  1. Language is unnecessarily powerful and complex. Critics argue that such a sophisticated system wouldn't have provided enough of an advantage in reproductive success to evolve.

  2. Language is unique to humans. Our closest relatives, chimpanzees, don't have language. Since chimps and humans evolved from a common ancestor, shouldn't chimps and other primates also have languages like ours?

Pinker counters these arguments effectively. Regarding the first point, he notes that this critique is akin to saying a cheetah is faster than it "needs" to be. Over time, small advantages accumulate into significant changes. Even a one-percent reproductive advantage in a particular trait could, over thousands of generations, lead to dramatic evolutionary changes.

As for the second argument, Pinker reminds us that evolution isn't a linear hierarchy where all organisms stem from the same source. Instead, evolution is more like a bush, with different branches developing unique traits. Chimpanzees and humans evolved from a common ancestor that is now extinct, so it's entirely possible for humans to have developed language without chimps ever having to have it.

Pinker suggests that our language instinct likely came about through natural selection. Our ancestors probably benefited in some way from an ability to communicate with each other, which gave them an adaptive advantage necessary for surviving in their environment. Over time, this ability became more sophisticated, eventually developing into the complex language systems we use today.

This evolutionary perspective helps explain why language is so deeply ingrained in human nature. It's not just a cultural invention or a learned skill, but a fundamental part of our biological makeup. This view also aligns with the observation that all human societies, no matter how isolated or technologically primitive, have fully developed languages.

Understanding language as an evolved trait also sheds light on its universal features. While languages differ greatly in their specific words and grammatical rules, they all share certain basic properties, such as the ability to express complex ideas, use abstract concepts, and generate an infinite number of sentences from a finite set of elements. These commonalities suggest a shared evolutionary origin for all human languages.

Implications and Applications

The Myth of "Correct" Grammar

In recent decades, there's been a growing obsession with grammatical rules. Self-proclaimed "grammar Nazis" are quick to point out errors like confusing "their" and "there," or using split infinitives. But Pinker argues that this focus on "correct" grammar is often misguided and based on arbitrary rules rather than the actual nature of language.

There's a significant difference between how we're "supposed" to talk and how we actually talk. Linguists and language scientists distinguish between two types of grammar rules:

  1. Prescriptive rules: These are the rules we learn in school and struggle with in formal writing. They govern how we're "supposed" to speak and write according to certain standards.

  2. Descriptive rules: These are the rules that linguists study, which describe how people actually use language in real life.

Scientists are more concerned with descriptive rules because prescriptive rules alone are not enough to build a language. For example, the prescriptive rule that you shouldn't start a sentence with "because" wouldn't make sense without the descriptive rules that define what a sentence is, categorize "because" as a conjunction, and explain how conjunctions function in sentences.

In the best light, prescriptive rules can be seen as refinements or decorations of the more fundamental descriptive rules. It's entirely possible to speak grammatically in a descriptive sense while breaking prescriptive rules. This is similar to how a taxi can obey the laws of physics while simultaneously breaking traffic laws.

So who decides what constitutes "correct" English? The answer is not straightforward. Prescriptive rules often come and go with changes in fads and societal norms. For instance, the rule against splitting infinitives (not putting words between "to" and a verb) that many of us were taught in school has its roots in 18th-century England. At that time, there was a movement to elevate the status of English compared to Latin. Since Latin infinitives are single words and can't be split, grammarians decided that English infinitives shouldn't be split either.

But this rule often leads to awkward constructions and doesn't reflect how people naturally speak. The famous Star Trek phrase "to boldly go where no one has gone before" is a split infinitive, yet it sounds far more natural and powerful than the technically "correct" version: "to go boldly where no one has gone before."

Pinker's point is not that grammar rules are entirely meaningless, but rather that many of the rules we obsess over are arbitrary and don't reflect the true nature of language. Instead of fixating on these prescriptive rules, we should appreciate the incredible complexity and flexibility of natural language use.

This perspective can be liberating for language learners and writers. While it's important to understand standard grammar for formal contexts, it's equally important to recognize that language is a living, evolving system. The most effective communication often comes from understanding your audience and context, rather than rigidly adhering to every prescriptive rule.

Language and the Brain: New Frontiers in Neuroscience

Our understanding of language as an instinct offers fascinating insights into how the brain is structured and functions. Recent advances in neuroscience, combined with our knowledge of language as an innate ability, are helping to unlock some of the mysteries of the human brain.

Key areas of the brain have now been identified as being associated with language processing. For instance, the left perisylvian area is now considered to be the brain's "language organ." In 98 percent of cases where brain damage results in language impairment, the left perisylvian area is affected. This suggests a strong localization of language functions in the brain.

While the relationship between brain structure and function is complex and not yet fully understood, it does appear that certain cognitive faculties are housed in specific areas of the brain, called modules. Different aspects of language, such as speech production, comprehension, and grammar processing, all involve areas of the brain that are located close to one another in the left hemisphere.

This modular view of brain function aligns well with the idea of language as an instinct. Just as we have specialized brain regions for processing visual information or controlling motor functions, we seem to have specialized areas for language. This specialization likely contributed to our species' remarkable linguistic abilities.

Our knowledge of the language instinct also allows us to speculate about other potential hardwired instincts we might have. For example, anthropologist Brent Berlin has proposed the idea of an innate "folk biology." This suggests that humans have an intuitive understanding that plants and animals belong to different species or groups, even without formal education in biology.

Psychologist Frank Keil demonstrated the plausibility of folk biology in an experiment with children. He showed children a picture of a raccoon that was transformed to look like a skunk, and then a coffee pot that was transformed into a bird feeder. The children readily accepted that the coffee pot had become a bird feeder but couldn't accept that the raccoon had turned into a skunk. This suggests an intuitive understanding of the difference between natural kinds (like animals) and artificial objects.

These findings have significant implications for our understanding of human cognition and learning. They suggest that our brains come pre-equipped with certain ways of organizing and understanding the world around us. This innate knowledge forms the foundation upon which we build more complex understanding through experience and education.

Moreover, the study of language and the brain is opening up new avenues for treating language disorders and improving language learning. By understanding how the brain processes language, researchers are developing more effective therapies for conditions like aphasia (language impairment due to brain damage) and more efficient methods for teaching second languages.

The intersection of linguistics and neuroscience is a rapidly evolving field, and new discoveries are constantly reshaping our understanding of how language works in the brain. As we continue to unravel these mysteries, we gain not only a deeper appreciation for the complexity of human language but also valuable insights into the nature of human cognition itself.

Conclusion

Steven Pinker's "The Language Instinct" offers a compelling exploration of the nature of human language and our innate ability to acquire and use it. Through a blend of linguistic theory, cognitive science, and evolutionary biology, Pinker presents a comprehensive view of language as a fundamental aspect of human nature.

Key takeaways from the book include:

  1. Language is an instinct: We are born with an innate capacity for language, which explains why children can acquire complex grammatical structures without explicit instruction.

  2. Universal Grammar: All languages share a common underlying structure, supporting the idea of a genetically determined language faculty.

  3. The arbitrariness of language: The connection between words and their meanings is largely arbitrary, allowing for efficient communication within language communities.

  4. Infinite use of finite media: Language allows us to create an infinite number of expressions from a finite set of elements.

  5. The critical period for language acquisition: Childhood represents a crucial window for language development, explaining why children learn languages more easily than adults.

  6. Language and evolution: Our language ability likely evolved through natural selection, providing our ancestors with adaptive advantages.

  7. The brain and language: Specific areas of the brain are dedicated to language processing, supporting the idea of language as a specialized cognitive function.

  8. Prescriptive vs. descriptive grammar: Many of the grammar rules we obsess over are arbitrary and don't reflect how language naturally functions.

Pinker's work challenges us to reconsider many common assumptions about language. It reminds us that language is not just a cultural invention or a set of rules to be memorized, but a fundamental part of what makes us human. Understanding language as an instinct helps explain its universality, its complexity, and the ease with which children acquire it.

Moreover, this perspective on language has far-reaching implications. It informs our approaches to education, particularly language teaching. It provides insights into cognitive development and brain function. And it offers a new lens through which to view human evolution and the development of our species' unique cognitive abilities.

As we continue to unravel the mysteries of language and the brain, we gain not only a deeper understanding of how we communicate but also valuable insights into the nature of human cognition itself. The language instinct, as Pinker describes it, is a window into the remarkable capabilities of the human mind and a testament to the complex biological heritage that shapes our ability to think, communicate, and understand the world around us.
