Introduction
In our modern world, we are surrounded by an ever-increasing amount of technology and data. The internet, social media, and advanced computing systems were once hailed as the harbingers of a new era of enlightenment and progress. However, as James Bridle argues in his thought-provoking book "New Dark Age," these technologies may actually be plunging us into a new dark age: a time when we have access to more information than ever before, yet seem to understand less and less about the world around us.
Bridle takes readers on a journey through the complex and often unseen ways that technology shapes our lives, our societies, and even our planet. From the military origins of modern computing to the environmental impact of our digital infrastructure, from the rise of conspiracy theories to the disturbing world of algorithmic children's entertainment, "New Dark Age" lays bare the unexpected consequences of our technological advancement.
This book challenges us to think critically about the tools and systems that have become an integral part of our daily lives. It urges us to examine where these technologies came from, how they function, and who they truly serve. By doing so, Bridle hopes to equip readers with the knowledge and perspective needed to navigate our increasingly complex and confusing world.
As we delve into the key ideas presented in "New Dark Age," we'll explore how computation, big data, artificial intelligence, and other technological innovations are affecting our understanding of the world – and why it's crucial that we learn to see beyond the surface of these systems to grasp their true impact on our lives and our future.
The Military Origins of Modern Computation
One of the most surprising revelations in "New Dark Age" is the deep connection between modern computing and military endeavors, particularly attempts to control the weather. This link dates back to World War I, when the mathematician Lewis Fry Richardson, while serving on the Western Front, first attempted to calculate atmospheric conditions in order to predict the weather.
Richardson's work led him to envision what could be considered the first conceptual "computer" – a vast hall filled with thousands of human mathematicians, each calculating weather conditions for a specific area of the world and communicating their results to make further calculations. This idea, while seemingly far-fetched at the time, laid the groundwork for future developments in computational weather prediction.
The real breakthrough in machine computation came during World War II, driven by massive military research spending. Projects like the Manhattan Project, which led to the creation of the atomic bomb, were closely tied to the development of the first computers. These early machines, such as the Electronic Numerical Integrator and Computer (ENIAC) from 1946, were primarily used to perform automated calculations simulating the impact of bombs and missiles under various weather conditions.
Interestingly, the military origins and purposes of these early computers were often concealed from the public. Bridle gives the example of IBM's Selective Sequence Electronic Calculator (SSEC), which was installed in a New York shop window in 1948. While the public was told it was calculating astronomical positions, it was actually working on a secret program called Hippo, simulating hydrogen bomb explosions.
This history reveals how, from the very beginning, the complex and hidden workings of computers provided a convenient way to obscure their actual functions. It also highlights the longstanding relationship between technological advancement and military objectives – a relationship that continues to shape our digital world today.
However, these early computers were far from perfect. Bridle recounts numerous anecdotes illustrating how computers' oversimplified view of the world, their inability to distinguish between reality and simulation, and bad data could lead to serious consequences. One famous example is the US computer network SAGE, used during the Cold War to integrate atmospheric and military data. SAGE was known for its near-fatal errors, such as mistaking a flock of migrating birds for an incoming Soviet bomber fleet.
These historical insights serve to remind us that while computers have come a long way since their early days, they are still prone to errors and misinterpretations. As we rely more and more on computational systems to understand and interact with the world, it's crucial to remember their limitations and the potential consequences of blindly trusting their outputs.
The Inextricable Link Between Technology and Climate Change
In "New Dark Age," Bridle explores the complex and often overlooked relationship between new technologies and climate change. He introduces the concept of climate change as a "hyperobject" – something so vast and pervasive that we can't fully comprehend it, but can only witness its effects on the world around us.
One striking example of these effects is the Syrian conflict, which some observers have described as the first climate war in history. Bridle explains how rising global temperatures led to unprecedented droughts in the Syrian countryside between 2006 and 2011. This environmental crisis caused massive crop failures and livestock deaths, forcing farmers to flee to the cities. The resulting demographic pressure, and resentment at the government's handling of the crisis, ultimately contributed to the armed conflict whose fallout the West came to know chiefly as the refugee crisis.
But it's not just traditional sectors like agriculture that are affected by changing weather patterns. Bridle points out that new technologies, including the internet, are also vulnerable to climate change. While we often think of the World Wide Web as an intangible "cloud," it relies on a vast physical infrastructure of fiber-optic cables, antennas, and servers. This infrastructure is highly susceptible to extreme weather conditions. For instance, WiFi strength and effectiveness decrease with higher temperatures, and many computational devices fail completely in extreme heat.
The relationship between digital technologies and climate change is bidirectional. Not only are our technologies affected by changing climate conditions, but they also contribute significantly to the climate crisis. Bridle highlights that the world's physical data centers alone use about 3 percent of the world's electricity, accounting for about 2 percent of global carbon emissions. As our digital culture becomes increasingly data-hungry, these energy requirements are expected to skyrocket. For context, Bridle notes that streaming just one hour of Netflix a week over the course of a year consumes more electricity than two new refrigerators use in the same period.
Perhaps most alarmingly, Bridle suggests that climate change itself might impair our ability to understand and address the crisis. As atmospheric carbon levels rise, human cognitive ability is known to decrease. At 1000 parts per million of CO2 – a level regularly exceeded in some indoor urban areas – human cognitive ability drops by 21 percent. This creates a frightening feedback loop: as we produce more CO2, our ability to comprehend and solve the problem diminishes.
These insights underscore the urgent need to consider the environmental impact of our digital lives. While technology often promises to solve our problems, Bridle's analysis shows that it can also exacerbate them in unexpected ways. As we continue to develop and rely on new technologies, we must also find ways to mitigate their environmental impact and prepare for the ways in which a changing climate will affect our digital infrastructure.
The Big Data Fallacy and the Crisis in Scientific Research
Bridle challenges the widespread belief that more data and faster computation automatically lead to better understanding and progress. He introduces Moore's law, the observation that the number of transistors on a chip, and with it the raw computing power of our devices, doubles roughly every two years, and describes how it has fueled a kind of "computational optimism" in many fields, particularly scientific research.
This optimism has led to a research system that prioritizes automated testing and massive data generation over more traditional, human-centered empirical methods. Bridle uses the example of drug research to illustrate this shift. In many pharmaceutical companies, the role of human scientists has been reduced to programming and overseeing machines engaged in High-Throughput Screening, a process in which automated systems test the effects and interactions of thousands of chemical compounds every day in the hope of stumbling upon useful combinations for treating diseases.
However, Bridle argues that this approach is not yielding the expected results. He introduces "Eroom's law" (Moore's law spelled backward), which observes that every nine years since the 1960s, the number of new drugs approved for human use per billion US dollars spent on research and development has halved. This suggests that despite the increase in data and computational power, drug discovery is becoming less efficient, not more.
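To make the contrast vivid, here is a rough back-of-the-envelope sketch in Python. The growth and decay rates are simply the stylized ones just described, not figures taken from Bridle's book, and the numbers are illustrative only:

```python
# Stylized comparison of Moore's law (computing power doubling every 2 years)
# with Eroom's law (drugs approved per billion R&D dollars halving every 9 years).
# Illustrative arithmetic only; not data from "New Dark Age".

def moores_law(years: float, base: float = 1.0) -> float:
    """Relative computing power after `years`, doubling every 2 years."""
    return base * 2 ** (years / 2)

def erooms_law(years: float, base: float = 1.0) -> float:
    """Relative drug-discovery efficiency after `years`, halving every 9 years."""
    return base * 0.5 ** (years / 9)

for decade in (0, 10, 20, 30, 40, 50):
    print(f"after {decade:2d} years: compute x{moores_law(decade):,.0f}, "
          f"drug-discovery efficiency x{erooms_law(decade):.2f}")
```

Run over a fifty-year span, the same exponential arithmetic that multiplies computing power many millions of times over quietly cuts drug-discovery efficiency to a few percent of its starting value, which is precisely the divergence Bridle finds so troubling.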
This problem isn't limited to pharmaceutical research. Bridle points out that across all scientific fields, while the number of studies, journals, and papers has been steadily increasing, so has the number of mistakes, instances of plagiarism, and cases of fraud. He discusses the "replication crisis" in modern science, where many scientific studies fail to produce the same results when conducted by different researchers.
To illustrate this point, Bridle mentions a 2011 study by the University of Virginia that attempted to replicate five landmark cancer studies. Of these, only two experiments could be successfully reproduced, two were inconclusive, and one failed completely. This lack of reproducibility undermines the credibility and reliability of scientific research, despite the vast amounts of data being collected and analyzed.
Bridle argues that instead of enhancing our understanding of the world, the current overflow of information is actually hindering our ability to process and make sense of what's happening around us. The sheer volume of data being generated and the speed at which it's being processed can create a false sense of progress, masking the fact that genuine scientific discovery and understanding may be slowing down.
This critique of the "big data fallacy" serves as a cautionary tale about the limits of computational approaches to complex problems. It suggests that while data and computation are powerful tools, they cannot replace human insight, creativity, and critical thinking in the scientific process. Bridle's analysis calls for a more balanced approach to research, one that combines the power of computational methods with the nuanced understanding that comes from human expertise and interpretation.
Technology as a Tool of Capitalism and Driver of Inequality
In "New Dark Age," Bridle explores how technology, far from being the great equalizer it's often portrayed as, can actually exacerbate existing inequalities and concentrate power in the hands of a few. He uses vivid examples to illustrate how the digital world, despite its apparent intangibility, has very real and often hidden physical infrastructure that plays a crucial role in global finance and commerce.
Bridle takes us to Slough, a seemingly unremarkable town outside London, which houses some of the most important parts of our digital world in vast, anonymous warehouses. One such warehouse, known as LD4, contains the data servers of the London Stock Exchange. The fiber optic cables connecting LD4 to other financial centers around the world have given rise to high-frequency trading, a form of ultra-fast, algorithm-driven financial transactions.
This system allows traders to react almost instantly to market changes, using complex algorithms and bots to monitor prices, make mock offers, and even interpret news headlines to predict economic effects. However, Bridle points out that even financial insiders struggle to keep up with the logic of their computers in this hyper-accelerated world. He cites the example of the "flash crash" of May 6, 2010, when the Dow Jones experienced an unprecedented 600-point plunge and recovery within minutes, an event that no human could fully explain or control.
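To give a sense of the timescales involved, here is a deliberately toy sketch in Python. The price feed is simulated, the "strategy" is a naive momentum rule invented for illustration, and nothing here resembles a real trading system; the point is only how many evaluations an algorithm can make in a single second of market time:

```python
# Toy illustration of algorithmic trading timescales (simulated data only).
import random

random.seed(42)

TICK_INTERVAL_US = 50      # one simulated price update every 50 microseconds
N_TICKS = 20_000           # about one second of simulated market time

price, position, decisions = 100.0, 0, 0

for _ in range(N_TICKS):
    price += random.gauss(0, 0.01)            # random-walk price movement
    drift = price - 100.0
    if drift > 0.05 and position <= 0:        # naive rule: buy on upward drift
        position, decisions = 1, decisions + 1
    elif drift < -0.05 and position >= 0:     # ...and sell on downward drift
        position, decisions = -1, decisions + 1

elapsed_s = N_TICKS * TICK_INTERVAL_US / 1_000_000
print(f"{N_TICKS:,} price updates evaluated and {decisions} trades made "
      f"in {elapsed_s:.1f} simulated seconds")
```

Even this crude loop weighs twenty thousand price updates in a single simulated second, far beyond anything a human trader could follow in real time, let alone audit after the fact.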
Beyond finance, Bridle examines how technology is changing the nature of work, often to the detriment of workers. He uses Amazon as an example, describing how the company uses robots for many warehouse tasks and treats its human "pickers" essentially like robots. These workers are guided and monitored via handheld devices that maximize efficiency and minimize human interaction, turning the workplace into a dehumanizing environment.
Bridle argues that politicians and companies offer little vision of what social safety nets might replace full-time employment as more and more jobs are automated. This lack of foresight and planning threatens to leave many workers vulnerable and to widen the gap between the wealthy and the poor even further.
Through these examples, Bridle illustrates how technology, when driven by capitalist imperatives, tends to concentrate wealth and power rather than distribute it more evenly. The speed and complexity of technological systems like high-frequency trading create a world that's increasingly difficult for humans to understand or control, while automation threatens to eliminate jobs without clear alternatives for displaced workers.
This analysis challenges the common narrative that technological progress inevitably leads to societal improvement. Instead, Bridle suggests that without careful consideration and regulation, our advancing technologies may simply reinforce and amplify existing power structures and inequalities. His insights call for a more critical examination of how we develop and implement new technologies, and a greater focus on ensuring that technological progress benefits society as a whole, not just a privileged few.
The Bias in Machine Learning and AI
Bridle delves into the complex world of machine learning and artificial intelligence (AI), challenging the common perception that these technologies are inherently objective or unbiased. He uses several compelling examples to illustrate how AI systems can inadvertently encode and perpetuate human biases, often with troubling consequences.
The author begins with an anecdote about a US Army experiment to train an AI to recognize camouflaged tanks in forests. The AI seemed to perform perfectly in tests but failed completely in real-world applications. It was later discovered that the AI hadn't learned to identify tanks at all – it had simply learned to distinguish between sunny and cloudy days, as all the training photos with tanks were taken on sunny days, and all those without tanks on cloudy days.
This story serves to illustrate a crucial point: when we train machines to think, they don't necessarily think like us. The way AI systems process information and reach conclusions can be fundamentally different from human reasoning, and often incomprehensible to us. This opacity can be problematic, especially when these systems are used to make important decisions.
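The failure in the tank story is what machine-learning practitioners call a spurious correlation: the system latches onto whatever feature most easily separates the training examples, whether or not it has anything to do with the real target. The sketch below reproduces the effect with entirely synthetic data (a hypothetical "brightness" cue and a hypothetical "tank signal," nothing from the original experiment): a classifier that scores perfectly while the confound holds collapses as soon as it is reversed.

```python
# Spurious correlation on synthetic data: the label is confounded with an
# easy, irrelevant feature (brightness) during training, so the model learns
# the confound rather than the weak, genuine signal.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1_000

def make_data(tank_on_sunny_days: bool):
    tank = rng.integers(0, 2, n)                       # 1 = tank present
    sunny = tank if tank_on_sunny_days else 1 - tank   # lighting confounded with label
    brightness = sunny + rng.normal(0, 0.1, n)         # strong but irrelevant cue
    tank_signal = 0.2 * tank + rng.normal(0, 1.0, n)   # weak but genuine cue
    return np.column_stack([brightness, tank_signal]), tank

X_train, y_train = make_data(tank_on_sunny_days=True)
X_test, y_test = make_data(tank_on_sunny_days=False)   # confound reversed

clf = LogisticRegression().fit(X_train, y_train)
print("accuracy when the confound holds:   ", clf.score(X_train, y_train))  # near 1.0
print("accuracy when the confound reverses:", clf.score(X_test, y_test))    # near 0.0
```

The model looks flawless on data that shares the quirk of its training set and fails the moment that quirk disappears, which is exactly the pattern the Army's evaluators are said to have encountered.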
Bridle then discusses a controversial 2016 study where researchers claimed to have developed software that could distinguish between criminal and non-criminal faces. When criticized for potential bias, the researchers defended their work by claiming that machine learning is inherently "free of bias." Bridle argues that this belief in the objectivity of algorithms is dangerously misguided.
The author explains that machine learning systems are trained on data, and the only data we have is from our past. Since our history is rife with violence, injustice, and racism, machines trained on this data inevitably replicate these biases and project them into the future. He gives the example of Asian-Americans struggling to take family photos with a Nikon camera that repeatedly displayed the error message "Did someone blink?" – a clear case of racial bias encoded into a seemingly neutral technology.
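A minimal sketch of that mechanism, using entirely synthetic data rather than any example from the book: a model fitted to historically skewed decisions gives two identically qualified applicants different scores, simply because past decision-makers treated their groups differently.

```python
# Bias inherited from historical training labels (synthetic data only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000

qualification = rng.normal(0, 1, n)       # the only legitimate signal
group = rng.integers(0, 2, n)             # a protected attribute (0 or 1)

# Historical decisions: qualified applicants were approved, but members of
# group 1 were systematically penalized by past decision-makers.
past_approved = (qualification - 0.8 * group + rng.normal(0, 0.3, n)) > 0

model = LogisticRegression().fit(
    np.column_stack([qualification, group]), past_approved)

# Two new applicants with identical qualifications, differing only in group:
applicants = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(applicants)[:, 1])   # group 1 receives the lower score
```

Nothing in the code is malicious; the model simply learns the pattern it was shown, and in doing so carries yesterday's discrimination forward into tomorrow's decisions.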
Bridle's analysis reveals how machine learning and AI, far from being objective arbiters of truth, can actually reinforce and amplify existing societal biases. This is particularly concerning as these technologies are increasingly used in high-stakes decision-making processes, from criminal justice to loan approvals to hiring decisions.
Moreover, the author points out that the complexity and opacity of these systems can make it difficult to identify and correct these biases. When an AI reaches a conclusion, it's often impossible for humans to understand the exact reasoning process, making it challenging to spot and address unfair or discriminatory outcomes.
Bridle's exploration of bias in machine learning serves as a wake-up call for both developers and users of AI technologies. It underscores the need for diverse teams in AI development, rigorous testing for bias, and ongoing monitoring of AI systems in real-world applications. It also highlights the importance of maintaining human oversight and the ability to challenge and override AI decisions when necessary.
Ultimately, Bridle's analysis reminds us that while AI and machine learning offer powerful capabilities, they are not infallible or inherently fair. As these technologies become more prevalent in our lives, it's crucial that we approach them with a critical eye, always questioning their outputs and striving to ensure they serve to reduce, rather than reinforce, societal inequalities and biases.
Government Control of Technology and Data
Bridle sheds light on the often-hidden ways in which governments and intelligence agencies exert control over technology and data. He traces this phenomenon back to World War II, revealing how intelligence agencies have long been at the forefront of technological development, often keeping their innovations secret for decades.
The author gives the example of the CIA's development of the first drones, years before they became a staple in modern warfare. This illustrates how cutting-edge technologies are often developed in secret, with their existence or true purpose only becoming known to the public long after the fact – if ever.
But it's not just futuristic technologies that are kept under wraps. Bridle points out that huge chunks of our history are progressively disappearing into secret vaults. He notes that the US government classifies about 400,000 new documents as top secret every year, a number that's steadily rising. This increasing secrecy extends beyond the US. Bridle cites the shocking revelation in 2011 that around 1.2 million documents on British concentration camps in Kenya were locked away in a secret government facility. Many of these documents were "destruction certificates" attesting to an even larger number of missing records, effectively erasing a significant part of history.
This concealment of historical documents, Bridle argues, prevents nations from fully reckoning with their past. In the case of the Kenyan concentration camps, the British government's suppression of important records has hindered a proper accounting of the country's colonial legacy.
Beyond controlling historical narratives, Bridle explores how intelligence agencies exert control through mass data collection. He discusses the extensive system of mass surveillance by the NSA, which came to light with Edward Snowden's revelations in 2013. Similar programs were subsequently uncovered in other major countries across Europe and the Americas.
Interestingly, Bridle notes that public outrage over these revelations cooled quickly. The USA FREEDOM Act, passed in 2015 in response, left the NSA's surveillance powers largely intact. Bridle suggests that, like climate change, mass surveillance seems too vast and complex a threat for many people to grapple with effectively.
This section of the book serves as a stark reminder of the power dynamics at play in our technological landscape. Governments and intelligence agencies, through their control of both historical records and current data, shape our understanding of the past and present. This control extends to the development of new technologies, many of which are created in secret for purposes unknown to the general public.
Bridle's analysis challenges us to think critically about the information we receive and the technologies we use. It raises important questions about transparency, privacy, and the balance of power between governments and citizens in the digital age. As our lives become increasingly intertwined with technology, understanding these dynamics becomes crucial for maintaining individual freedom and democratic principles.
The Appeal of Conspiracy Theories in a Complex World
In "New Dark Age," Bridle explores the phenomenon of conspiracy theories, examining why they've become so prevalent in our information-saturated era. He argues that these theories provide simple narratives that help people make sense of an increasingly complex and confusing world.
Bridle begins by acknowledging that humans have always been inclined to spin complex events into simple stories to understand the world around them. However, he suggests that in our current networked, information-rich environment, many of these narratives seem further from the truth than ever before.
The author uses the example of "chemtrails" – one of the oldest and most pervasive conspiracy theories of the internet age. Proponents of this theory believe that the condensation trails left by aircraft are actually chemical or biological agents deliberately sprayed for sinister purposes like population control or weather modification. Bridle points out that while human-made chemical clouds (in the form of exhaust fumes and condensation) are indeed real, chemtrail theorists literalize their general anxiety about environmental issues into a neat, albeit unfounded, theory of governmental mind control.
Another example Bridle discusses is the phenomenon of "gang stalking," where individuals believe they're being surveilled and harassed by organized groups. Given what we now know about mass surveillance programs, this perception isn't entirely divorced from reality. However, conspiracy theorists tend to personalize and simplify these complex systems into clear-cut, black-and-white narratives centered around themselves.
Bridle argues that the internet's echo chamber effect contributes significantly to the spread of these theories. Online communities provide supportive environments where like-minded individuals reinforce each other's beliefs, often leading to increasingly extreme views.
The author also points out how right-wing populists and religious fundamentalists exploit people's desire for simple explanations in complex times. He cites examples of Donald Trump tweeting about climate change being a Chinese conspiracy against American business, and how many of Trump's campaign promises seemed inspired by prominent online conspiracy theorists.
Bridle's analysis suggests that conspiracy theories, while often factually incorrect, serve an important psychological function. They provide a sense of order and understanding in a world that often feels chaotic and incomprehensible. In an age where we're constantly bombarded with information, much of it contradictory or confusing, conspiracy theories offer clear villains, straightforward causes, and simple solutions.
However, Bridle warns that while these theories might provide comfort, they can be just as frightening as the complex realities they seek to explain. Moreover, they can lead people to ignore real, pressing issues in favor of imagined threats.
This exploration of conspiracy theories serves as a reminder of the challenges we face in the information age. It highlights the need for critical thinking skills, media literacy, and the ability to navigate complex information landscapes. Bridle's analysis suggests that addressing the appeal of conspiracy theories isn't just about debunking false claims, but about helping people develop the tools to grapple with complexity and uncertainty in more productive ways.
The Dark Side of Algorithmic Content Creation
In one of the most unsettling sections of "New Dark Age," Bridle delves into the world of algorithmic content creation, particularly focusing on its impact on children's entertainment. He reveals how the combination of financial incentives and algorithms has led to the production of disturbing and potentially harmful content, especially on platforms like YouTube.
Bridle begins by explaining the financial motivation behind content creation on YouTube. Successful videos can generate significant revenue from advertising, with viral hits like "Gangnam Style" earning millions of dollars. This potential for profit has led to an influx of content creators, including both human vloggers and, increasingly, automated bots.
The author points out that children's entertainment has become a particularly lucrative sector on YouTube. With young children spending more time online and often watching videos repeatedly, they're an easy target for content creators looking to maximize views and, consequently, ad revenue.
Bridle describes how many of these so-called children's videos are actually produced by bots created by companies seeking quick profits. He gives the example of the YouTube channel "Little Baby Bum," which has churned out thousands of bot-created animated sing-along videos, all following similar patterns and melodies.
However, it's not just the mass-produced nature of this content that's concerning. Bridle reveals how content creators often use algorithms to game YouTube's recommendation system, resulting in nonsensical video titles stuffed with popular keywords. He provides an example of such a title: "150 Giant Surprise Eggs Kinder CARS StarWars Marvel Avengers LEGO Disney Pixar Nickelodeon Peppa."
Even more alarming is the content of some of these videos. Bridle describes videos where familiar characters are placed in disturbing situations, such as one where the heads of characters from Aladdin float around the screen, attaching to different bodies. When a mismatch occurs, a character lets out an automated wail.
Perhaps most troubling is Bridle's observation that YouTube's algorithms can't distinguish between genuine children's content and parody or adult-oriented content featuring children's characters. He gives the example of a video where the beloved character Peppa Pig is shown being tortured at the dentist, having all her teeth violently removed.
Bridle emphasizes that while many of these videos aren't specifically targeted at children, YouTube's lack of effective content control means they inevitably reach young viewers. This creates a situation where children are exposed to potentially traumatizing content disguised as entertainment.
This section of the book serves as a stark warning about the unintended consequences of algorithmic content creation and distribution. It highlights how the pursuit of profit, combined with the scalability of automated systems, can lead to the mass production of low-quality, and sometimes harmful, content.
Bridle's analysis calls attention to the need for better regulation and moderation of online content, especially on platforms frequented by children. It also underscores the importance of parental oversight and media literacy education to help children navigate the often bewildering landscape of online content.
More broadly, this exploration of algorithmic content creation serves as a cautionary tale about the potential dangers of letting algorithms dictate our cultural products. It reminds us of the importance of human judgment and ethical considerations in content creation and distribution, especially when it comes to vulnerable audiences like children.
Embracing Complexity in the New Dark Age
In the concluding sections of "New Dark Age," Bridle offers his thoughts on how we can navigate the challenges of our increasingly complex technological world. He argues for a shift away from what he calls "computational optimism" – the belief that more data and faster processing will automatically solve our problems – towards a more nuanced understanding of technology's role in our lives.
Bridle begins by critiquing the tech industry's tendency to oversimplify complex issues. He cites an example from Google's 2013 Zeitgeist conference, where Eric Schmidt, then the company's executive chairman, claimed that if camera phones had existed in 1994, the Rwandan genocide would never have happened, because people could have filmed and shared news of the atrocities. Bridle points out the flaw in this thinking: investigations have shown that several NGOs, foreign embassies, and the UN were closely monitoring the situation in Rwanda. The problem wasn't a lack of information, but a lack of action.
This example illustrates a broader point: contrary to tech optimists' beliefs, making something visible doesn't automatically fix it. Our current information overload often leads to apathy and inaction rather than understanding and positive change. Bridle argues that rather than helping us make sense of the world, the deluge of data and computation has often made things more complicated and confusing.
To navigate this new dark age, Bridle suggests we need to develop a more critical and conscious approach to technology. He references British mathematician Clive Humby's phrase "data is the new oil," but emphasizes the often-overlooked part of Humby's statement: like oil, data needs to be refined to be useful. Simply collecting more data isn't enough; we need to learn how to analyze and interpret it effectively.
Bridle advocates for a shift in focus. Instead of trying to collect ever more data to predict increasingly complex events, we should learn to think critically about where our data comes from, what it's being used for, and who owns it. He emphasizes the importance of examining the global technological networks that produce and use this data, and considering ways we can change them for the better.
This approach, Bridle argues, is crucial for bringing meaning to our technologically saturated world. By understanding the systems that shape our lives, we can begin to use them more effectively and ethically, rather than being passively shaped by them.
Bridle's conclusion serves as a call to action. He urges readers to embrace the complexity of our world rather than seeking oversimplified explanations or solutions. This means developing a more nuanced understanding of technology – recognizing both its potential benefits and its limitations, and being willing to question its applications and implications.
Ultimately, Bridle suggests that navigating the new dark age requires a combination of technological literacy and critical thinking. We need to understand how our technologies work, but also be able to step back and consider their broader impacts on our society, our environment, and our ways of thinking. Only by doing so can we hope to use technology in ways that truly benefit humanity and help us address the complex challenges we face.
Final Thoughts
James Bridle's "New Dark Age" presents a compelling and often unsettling exploration of how technology is shaping our world in ways we don't always see or understand. Through a series of interconnected essays, Bridle challenges many of our assumptions about technological progress and its impacts on society, the environment, and our own thinking.
The book's key message is that while we have access to more information than ever before, we seem to understand less about the world around us. This paradox is at the heart of what Bridle calls the "new dark age" – a time of technological advancement coupled with increasing confusion and uncertainty.
Bridle's work touches on a wide range of topics, from the military origins of modern computing to the environmental impact of our digital infrastructure, from the rise of conspiracy theories to the disturbing world of algorithmic children's entertainment. Through these diverse examples, he illustrates how technology often has unintended consequences that can be difficult to predict or control.
One of the book's strengths is its ability to make visible the hidden systems and structures that underpin our digital world. Bridle reveals the physical infrastructure behind our seemingly intangible online experiences, the secretive government programs that shape our technological landscape, and the complex algorithms that increasingly influence our daily lives.
Another key theme is the way technology can reinforce and amplify existing power structures and inequalities. Bridle shows how AI and machine learning can encode and perpetuate human biases, how financial technologies can concentrate wealth in the hands of a few, and how mass surveillance can threaten individual privacy and freedom.
Throughout the book, Bridle emphasizes the importance of critical thinking and technological literacy. He argues that to navigate this new dark age, we need to question our technologies, understand their origins and functions, and consider their broader impacts on society and the environment.
"New Dark Age" is not a luddite rejection of technology, but rather a call for a more thoughtful and nuanced approach to technological development and use. Bridle encourages readers to embrace complexity, to resist oversimplified narratives, and to strive for a deeper understanding of the systems that shape our world.
In conclusion, "New Dark Age" offers a thought-provoking and timely analysis of our relationship with technology in the 21st century. It challenges readers to look beyond the surface of our digital tools and platforms, to consider their hidden costs and unintended consequences, and to take a more active role in shaping the technological future we want to see. While the picture Bridle paints can be daunting, his work ultimately empowers readers by providing the knowledge and perspective needed to navigate our increasingly complex technological landscape.