Amidst a sea of information, we seem to be plunging into a new dark age: a time when we know more but understand less.
1. The Military's Quest to Control the Weather Birthed Modern Computing
The roots of modern computation lie in military endeavors to control and predict the weather. During World War I, Lewis Fry Richardson imagined a giant "computing machine" operated by humans to predict weather patterns. His concept was visionary but required technology far beyond its time.
This vision took shape during World War II, when military funding propelled advances in computational machines like ENIAC. These machines ran both weather and weapons simulations, binding the interests of meteorology and warfare together. Some machines, like IBM's SSEC, were displayed to the public as curiosities while quietly running classified hydrogen-bomb calculations.
Despite these advances, the machines had significant flaws. Their crude models of the world could produce errors with devastating potential. During the Cold War, for instance, the US SAGE air-defense system mistook migrating birds for Soviet bombers, illustrating how a technological misstep could nearly lead to catastrophe.
Examples
- Lewis Fry Richardson's vision of weather computation during World War I.
- ENIAC’s role in simulating bomb impact scenarios.
- SAGE’s erroneous identification of bird migrations as enemy planes.
2. Technology and Climate Change Are Inescapably Linked
Our technology has environmental consequences, while a changing climate threatens the networks we depend on. The Syrian conflict, often dubbed the first "climate war," shows the interplay of unpredictable weather, societal tensions, and eventual chaos: droughts, aggravated by rising temperatures, drove farmers into cities, fostering discontent that escalated into violence.
Digital platforms, often imagined as weightless clouds, rely on energy-intensive data centers and physical infrastructure vulnerable to extreme weather; even high temperatures degrade everyday tools like WiFi signal strength. At the same time, maintaining digital services contributes significantly to carbon emissions, consuming energy on a par with major appliances like refrigerators.
Alarmingly, climate change may also limit human cognition as CO2 levels rise. Indoor workspaces with high CO2 concentrations show measurably reduced mental function, an invisible but growing challenge for technology-reliant societies.
Examples
- The Syrian drought (2006-2011) linked to civil instability.
- Streaming an hour of Netflix each week for a year has been estimated to use more energy than two new refrigerators do annually.
- CO2 levels over 1,000 ppm have been found to reduce human cognitive capability by 21%.
3. Overreliance on Big Data Disrupts Science
The promise that "more data equals better results" is undermining scientific progress. Moore's law holds that computing power doubles roughly every two years, yet this computational growth hasn't produced proportional discoveries. In drug research, for example, High-Throughput Screening (HTS) uses machines to generate massive datasets, sidestepping slower, hypothesis-driven experimentation.
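To make the divergence concrete, here is a minimal Python sketch contrasting the two trends. The doubling and halving periods (two years for Moore's law, roughly nine years for Eroom's law) are rough published estimates, and the numbers are purely illustrative:

```python
# Contrast the two exponential trends: computing power grows while
# drug-research productivity declines. Periods are rough estimates.

def moores_law(years: float, base: float = 1.0) -> float:
    """Relative computing power, doubling every 2 years."""
    return base * 2 ** (years / 2)

def erooms_law(years: float, base: float = 1.0) -> float:
    """Relative drugs approved per $1B of R&D, halving every ~9 years."""
    return base * 0.5 ** (years / 9)

for years in (0, 10, 20, 30):
    print(f"after {years:2d} years: "
          f"compute x{moores_law(years):8.1f}, "
          f"drugs per $1B x{erooms_law(years):.2f}")
```

After 30 years the sketch shows compute up more than 30,000-fold while research productivity has fallen to about a tenth, which is the paradox the next paragraph examines.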
Paradoxically, this surplus of data has fed a replication crisis, in which many studies fail to produce consistent results. A reexamination of landmark cancer studies, led by researchers at the University of Virginia, exposed this reality: only two out of five proved reproducible.
Additionally, the overload of information has slowed down discoveries. While the number of papers and studies balloons, plagiarism, errors, and fraud rise, illustrating science’s struggle to handle its own data deluge.
Examples
- Researchers nicknamed the diminishing return on drug research investment "Eroom's law" (Moore's law in reverse).
- Only 40% of attempted cancer study replications succeeded.
- The rise in papers corresponds with more scientific misconduct.
4. Capitalistic Technology Worsens Inequality
Instead of equalizing opportunity, technological tools often amplify disparities. Financial algorithms, for example, drive high-frequency trading, executing transactions at speeds unattainable for ordinary traders. This gives corporations unfair advantages and produces events like the "flash crash" of May 6, 2010, which briefly erased nearly $1 trillion in market value.
In physical labor sectors, companies like Amazon leverage technology for efficiency, reducing humans to machine-like roles. Workers in Amazon warehouses follow instructions from devices that optimize productivity at the cost of personal well-being.
The lack of governmental responses to this digital upheaval leaves most workers vulnerable. As automation spreads, meaningful employment opportunities diminish. Far from being allies of equality, these technologies have become tools that deepen disparity.
Examples
- High-frequency trading algorithms causing sudden flash crashes.
- Amazon’s warehouse pickers being monitored like robots.
- Politicians failing to address future social safety net issues caused by automation.
5. Machines Learn Our Biases and Carry Them Forward
Artificial intelligence (AI) does not think like humans; it amplifies patterns found in flawed datasets from the past. A famous US Army experiment with an AI trained to spot camouflaged tanks showed this flaw: the system had learned to recognize sunny-day photos, not actual tanks, because of how the training images happened to be taken.
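A minimal sketch of this failure mode, using entirely synthetic data: a toy classifier latches onto image brightness (a stand-in for sunny versus cloudy weather), which spuriously correlates with the label in the training set, and collapses the moment that correlation breaks:

```python
# Toy reconstruction of the "tank detector" failure. The classifier
# learns brightness, which only coincidentally tracks the label.
import numpy as np

rng = np.random.default_rng(0)

def make_photos(n: int, sunny: bool) -> np.ndarray:
    """Fake 8x8 'photos' whose brightness depends only on the weather."""
    base = 0.8 if sunny else 0.3  # sunny photos are brighter
    return rng.normal(base, 0.05, size=(n, 8, 8))

# Training confound: every tank photo was shot on a cloudy day,
# every tank-free photo on a sunny day.
train_x = np.concatenate([make_photos(100, sunny=False),   # tanks
                          make_photos(100, sunny=True)])   # no tanks
train_y = np.concatenate([np.ones(100), np.zeros(100)])

# "Training": pick a brightness threshold separating the two classes.
threshold = train_x.mean(axis=(1, 2)).mean()

def predict(imgs: np.ndarray) -> np.ndarray:
    """Dark image -> 'tank'. The model has learned weather, not tanks."""
    return (imgs.mean(axis=(1, 2)) < threshold).astype(int)

print("train accuracy:", (predict(train_x) == train_y).mean())  # ~1.0

# Test: tanks photographed on a sunny day -> the detector fails.
test_x = make_photos(100, sunny=True)   # all contain tanks
test_y = np.ones(100)
print("test accuracy: ", (predict(test_x) == test_y).mean())    # ~0.0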
Algorithms are perceived as impartial, but their outputs often reflect human prejudice. A Shanghai experiment claiming that AI could distinguish criminals by their facial features merely disguised its own embedded injustices. Likewise, Nikon cameras that wrongly flagged Asian faces as blinking show how these biases extend into the technologies we use daily.
AI reflects historical inequalities rather than solving them. Without addressing this, machine learning risks creating a future that preserves and enlarges past biases.
Examples
- The tank-spotting AI identified weather conditions, not tanks.
- Nikon cameras misread Asian faces as blinking.
- Shanghai researchers' biased criminal identification software.
6. Technology Increases State Power and Secrecy
Intelligence agencies drive the development of covert technologies and hoard data, shaping history in unaccountable ways. Around 400,000 US documents are stamped secret each year, compounding a hidden history. The British government's concealment of atrocities in Kenya's colonial camps amounts to an erased narrative that hinders accountability.
Programs exposed by whistleblowers like Edward Snowden revealed mass surveillance far beyond public awareness. Yet public outrage was transient, and little systemic change followed. As complex as climate change, "big surveillance" overwhelms individuals, breeding apathy.
This power imbalance between citizens and states endures, as governments control more of the tools and records capable of documenting or determining societal futures.
Examples
- British "burn certificates" certified the destruction of records of colonial crimes in Kenya.
- The CIA developed and deployed drones long before their open military adoption.
- Snowden exposed large-scale, global citizen surveillance.
7. Conspiracy Theories Offer Comfort in Complexity
Human instinct favors simple stories over complicated truths, especially in today's chaotic social media landscape. Chemtrail conspiracies recast a visible phenomenon, ordinary plane contrails, as a government plot to spread disease, while ignoring aviation's verifiable contribution to carbon emissions.
Gang-stalking believers, similarly, inflate personal anecdotes into claims of targeted persecution, even as the mass surveillance systems actually verified by the NSA leaks go underexamined. Distorted tales thus replace deeper scrutiny of power structures.
Reinforced by echo chambers, conspiracies soothe anxiety by packaging chaos into digestible narratives. From climate change denial fueling populism to Alex Jones' influence on Trump's messaging, these stories gain worrying real-world political traction.
Examples
- Chemtrail theories misreading ordinary aircraft contrails as chemical spraying.
- NSA surveillance paranoia inspiring “gang-stalking belief groups.”
- Trump calling climate change a Chinese hoax.
8. Algorithms Spawn Disturbing Content for Profit
Algorithms, paired with YouTube's monetization system, exploit societal vulnerabilities for financial gain. Children's video content, often auto-generated by bots, floods the internet with bizarre or unsettling media designed to maximize clicks. Titles packed with trending keywords appeal to the recommendation algorithm, producing meaningless mash-ups optimized for visibility.
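As a minimal sketch of that keyword mash-up pattern (the keyword lists below are invented for illustration, not scraped data), a title generator of this kind needs only a few lines:

```python
# Toy sketch: video titles assembled at random from trending search
# terms to court the recommendation algorithm.
import random

CHARACTERS = ["Peppa Pig", "Spiderman", "Elsa", "Paw Patrol"]
HOOKS = ["Surprise Eggs", "Finger Family", "Learn Colors", "Nursery Rhymes"]
BAIT = ["Kinder", "Play Doh", "Superhero", "Baby"]

def generate_title(rng: random.Random) -> str:
    """Stitch trending keywords into an algorithm-friendly title."""
    parts = [rng.choice(CHARACTERS), rng.choice(BAIT), rng.choice(HOOKS)]
    rng.shuffle(parts)  # order is irrelevant; only the keywords matter
    return " ".join(parts) + " | Kids Video"

rng = random.Random(42)
for _ in range(3):
    print(generate_title(rng))
```

Because such titles cost nothing to produce at scale, channels can flood search results far faster than human moderators can review them.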
Even parody content, including violent reinterpretations of characters such as Peppa Pig, ends up targeting children because of a lack of platform moderation. Profit-driven systems thus undermine social judgment, potentially exposing young audiences to psychologically harmful media.
Advertisers and algorithm designers prioritize engagement, leaving vast digital spaces populated by content disconnected from human well-being.
Examples
- Randomized “surprise egg” YouTube titles for child-themed video engagement.
- Distressing Peppa Pig parodies surfacing in children's search results.
- Bots behind over 1,000 Little Baby Bum-style knock-off channels.
9. Understanding Complexity Helps Us Respond More Wisely
Rather than trusting computation to simply "fix" global problems, people must reckon with complexity. Eric Schmidt wrongly suggested that smartphones could have prevented events like the 1994 Rwandan genocide simply by making them visible. History invalidates such claims: governments actively tracked, yet ignored, early warnings of the massacre.
Clive Humby's analogy that "data is the new oil" cuts both ways: like oil, unrefined data holds little value until it is processed into understanding. A shift away from obsessive collection toward meaningful interpretation is essential.
Societies may make more constructive progress by questioning who controls, limits, and uses technology, and by reevaluating those dynamics.
Examples
- High-resolution satellite imagery of the Rwandan atrocities existed, yet triggered no global intervention.
- Schmidt's flawed optimism that technological visibility alone prevents atrocities.
- Humby's reminder that data, like oil, is worthless until refined with focus and understanding.
Takeaways
- Question who owns, processes, and oversees emerging digital tools or platforms to assess how they may bias or shift public narratives.
- Avoid over-reliance on large datasets; prioritize comprehension and refinement over raw quantity.
- Critically weigh the potential social harms of rapidly generated, algorithm-driven content online, and push for policy boundaries that address them.