
The Internet, designed to connect humanity, has ended up amplifying division and hatred, because the Palaeolithic brain of Homo sapiens did not evolve for the Internet.
For decades, technology has connected people across the globe. In the past, most people lived in small villages, with the nearest city being their farthest destination. Today, people can dream of travelling to a different continent in just a few hours. Just a few decades ago, faraway countries were little more than names; now, you can explore their geography and culture through a smart device in your palm. Sitting in India, you can video call a friend in the US, something unimaginable when we were born. Thanks to the Internet.
One might have thought that the Internet would connect different cultures like a web, fostering understanding and appreciation, and that free speech and connectivity would make the world more liberal. Capitalists hoped that the Internet would accelerate globalisation, raise people’s aspirations (read: greed), and grow the consumer market. Unfortunately, the opposite has happened. While capitalists did benefit from the growing aspirations of the young generation, divisions amongst different groups have reversed globalisation, giving birth to cultural silos and polarized online communities. The Internet hasn’t just connected us; it has exposed fault lines. Almost any online post attracts abusive comments meant to insult.
Harari thinks that technology will drive our evolution. Maybe he is right. But for now, evolution is struggling to keep pace with technology. Technology connects us with far more people than our brains evolved to comprehend.
The Dunbar number, proposed by anthropologist Robin Dunbar, is a theoretical cognitive limit to the number of people with whom an individual can maintain stable social relationships—relationships where you know who each person is and how they relate to others. It’s typically pegged at around 150, though it can range from 100 to 250 depending on context and individual differences.
In practice, the Dunbar number breaks down into layers of intimacy:
• 5: Close support group (e.g., best friends, immediate family).
• 15: Sympathetic friends (people you turn to for emotional support).
• 50: Close social circle (regular contacts, like a small community or team).
• 150: Stable relationships (people you know well enough to trust and interact with regularly).
Beyond 150, relationships tend to become less personal, requiring formalised structures (like institutions or hierarchies) to manage. Dunbar’s research suggests this limit stems from cognitive constraints—our brains can only track so many social connections before they become shallow or unmanageable.
Instead of 150, social media connects us to thousands of “friends” and millions of strangers. Apart from people, there’s a near-infinite load of information. How does the brain handle the Internet’s social and informational deluge that it’s not wired for?
Our brain knows only one way to handle huge data – it simplifies.
Kahneman’s System 1 (fast thinking) is intuitive, automatic, and relies on heuristics—mental shortcuts shaped by experience and biases. It’s designed to reduce cognitive load, letting us make quick decisions in high-pressure or complex situations. System 2 (slow thinking), by contrast, is deliberate and analytical but requires effort and energy, so we default to System 1 whenever possible.
In our evolutionary past, fast thinking was a survival tool. When encountering strangers, our Palaeolithic brains quickly assessed: friend or foe? This relied on cues like appearance, behaviour, or group affiliation, often within small communities capped at Dunbar’s number. These snap judgments worked well in tight-knit groups where trust and reciprocity were clear.
When our Palaeolithic brain encounters a large group of people today, it stops seeing individuals and clubs them into groups, friend or foe, just as it did when we were hunter-gatherers. This manifests as ideological tribes: political factions, nationalities, religions, fandoms, or cultural cliques. Grouping helps the fast-thinking brain judge a person quickly. We discredit an individual by pointing out the wrongdoings of some random member of their group.
This stereotyping is a classic System 1 heuristic. Instead of evaluating a person’s unique traits, our brain assigns them to a group based on minimal cues—a profile picture, a hashtag, or a single post on social media. For example:
• A liberal-sounding tweet? System 1 tags the user as “leftist.”
• A religious symbol in a bio? They’re slotted into a faith-based tribe.
This reduces cognitive load but sacrifices accuracy. Online, where interactions lack the depth of face-to-face encounters, these snap judgments amplify biases and fuel tribalism.
Algorithms exacerbate this, curating content that reinforces biases and triggers emotional responses like anger, which drives engagement (studies show negative emotions increase clicks and shares). A 2021 Science Advances study found that divisive content spreads faster, turning every comment thread into a potential warzone.
Instead of fostering empathy, the Internet often dehumanizes. Text-based platforms strip away nonverbal cues, making it easier to abuse someone who has been reduced to a message and a display picture.
Before the Internet, humans processed information slowly, through stories or direct experience. Now, we’re bombarded with data: news, memes, posts, ads, far exceeding our cognitive bandwidth. The brain’s prefrontal cortex, responsible for filtering and contextualizing, gets swamped. To cope, we rely on mental shortcuts: skimming, outsourcing judgment to influencers, or clinging to simple narratives. This fuels misinformation and polarization, as nuanced ideas get lost.
Online, these tribes clash in echo chambers, where algorithms amplify outrage and dehumanize opponents. Constant online conflict fuels anxiety and depression. WHO data from 2024 links social media overuse to rising mental health issues, especially among youth.
Unfortunately, bad experiences online spill into real life. Reel-life hatred leads to real-life violence. Studies, like one from PNAS (2018), show that dehumanizing language online correlates with increased hostility, which can translate into physical aggression. For example, hate speech targeting minorities on social media platforms has been linked to spikes in hate crimes, as seen in FBI data after the 2016 U.S. election. Islamic radicals recruit large numbers of followers through online propaganda. The 2019 Christchurch mosque shooter, for instance, was radicalized in part through online forums where anti-immigrant rhetoric thrived. An RPF constable shot at three Muslims inside a moving train after being brainwashed by social media. Recently, a Muslim man beat his small children because they played with Hindu kids.
Hatred is on the rise worldwide. Rising inequality, climate anxiety, and political instability amplify tensions. The Internet becomes a pressure valve, but instead of relief, it escalates conflict. Almost any post can be turned into an online battlefield.
The number of protests is increasing globally, and many end in riots. A 2023 Pew Research study showed declining trust in institutions and neighbours in polarized nations, a trend that risks societal collapse if unchecked. This cannot continue forever. Something’s gotta give.
No single fix will end online hatred, but a combination of platform reform, user-driven change, and cultural shifts is plausible. Fatigue is already pushing some users toward smaller, kinder online spaces. However, without deliberate action, the cycle could worsen before it breaks: escalating violence or authoritarian crackdowns could be the “give” instead of progress.
Maybe we need an empathy movement. Grassroots efforts to humanize online interactions, like cross-cultural dialogues or “de-escalation” influencers, might be the way forward.
To summarise: the Internet, meant to connect humanity, amplifies hatred and tribalism by overwhelming our brains, evolved for Dunbar’s ~150-person social limit, with thousands of interactions and information overload. Fast thinking (System 1), as described by Kahneman, relies on biases to group people into “us” vs. “them,” fueling online battles that spill into real-world violence, like hate crimes and riots. This escalation is unsustainable: it erodes social cohesion, mental health, and economic stability, and pushes us toward a breaking point. Potential solutions include platform reforms that prioritize empathy, digital literacy to counter biases, and rebuilding offline communities to restore human connection. Without deliberate change, the cycle of hatred risks worsening, but user-driven shifts and technological tweaks could steer the Internet toward unity.
Remember that the person on the other end of the digital platform is just another human being trying to make sense of the mad world and survive.


