
Ideological Primates

Written by theantichamberchat

Why are people on the internet so angry about politics?

By now, many people who engage with political material online are aware that exposing oneself to alternative perspectives, either by following a wider variety of news sources or participating in bipartisan discussions, rather than ensconcing oneself in an echo chamber, is critical for maintaining a balanced model of reality.

And yet, comment sections on blogs, Facebook, YouTube, and Twitter remain nearly ubiquitously bereft of etiquette and nuance.

Internet denizens have a reputation for nastiness when it comes to politics. It would appear they’ve given up on trying to reach anyone who isn’t already on board with their ideology. Far more energy is spent on vitriolic and emotional displays of frustration and revulsion toward the opposition than on honest argument or the promotion of mutual enlightenment.

There are two large problems I see standing in the way of reasoned, respectful dialogue. These are:

  1.  We don’t swiftly update or abandon our beliefs.
  2.  We don’t understand why we believe what we do well enough to argue convincingly for our positions.

Primitive man surfs the web

The leading theory on the evolutionary history of our logical faculties proposes that our brain, far from developing specifically for abstract philosophizing or data evaluation, evolved its logical capabilities primarily in response to the problems of group cooperation.

In light of this view, we can make sense of many of the cognitive biases that plague our belief-forming mechanisms.

In our early history, when all of our time was spent gathering food and fending off predators, failure to form or join a group would be the difference between life and death. Social dynamics of deception and trust likely ignited a cognitive arms race that resulted in our impressive abilities to detect inconsistencies and reason abstractly.

Under these conditions, we would evolve to be skeptical of outsiders, so as not to fall for their tricks, and acquiescent to the ingroup, for our well-being would be dependent on their trust. While the stakes have been lowered dramatically, these cognitive processes remain mostly, if not completely, intact today. This would explain behavior ranging from racism to sports fanaticism. I recommend reading about the Robbers Cave experiment for more on our obsession with factions.

We also developed sophisticated machinery for navigating complex social relations within a group. Our ancestors would have done things like fake allegiance to the group, switch sides, and compromise – behaviors that require advanced cognitive abilities.

How does the way we reach our conclusions compare to the way theories come to be accepted by science?

Whenever a groundbreaking discovery is claimed, it can take decades or generations for scientists to examine the methodology used, reproduce the results, look for holes, etc., before the claim is generally regarded as true. Nicolaus Copernicus presented his evidence and arguments for a heliocentric model of the solar system in 1514, but it took more than a century for his theory to become widely accepted.

This delay may seem burdensome, but the fact that science doesn’t automatically accept evidence and claims without a lengthy debate spanning years, if not lifetimes, is a feature of science, not a bug.

Likewise, the human brain does not reverse its core beliefs without good reason. Instead, the brain takes new information, considers how this information fares with what it already believes, and, depending on how consistent the information is with prior understandings and how disruptive it would be to existing structures, classifies it according to its level of plausibility and stores it for future reference.

If the information is easy to swallow – if it’s consistent with what I know and I have no reason to doubt it – it is quickly accommodated. But if the information contradicts what I know, instead of automatically accepting the idea, I may decide to investigate further, or to reject the information outright.

One difference between our brain and science is that if there isn’t a real perceived consequence to having bad information, our brain doesn’t bother to verify its beliefs. For example, I may have a poor understanding of how toilets work, but there isn’t a readily conceivable situation in which not knowing the truth about toilets would be detrimental.

The backfire effect

The backfire effect, in which “corrections actually increase misperceptions among the group in question,” demonstrates another of the flawed ways our brain filters information. In these studies, participants were first given a (fake) article they agreed with, then later given an article from the same (fake) source correcting and discrediting the information from the first article. But instead of accepting the corrections, their convictions grew stronger. “People on opposing sides of the political spectrum read the same articles and then the same corrections, and when new evidence was interpreted as threatening to their beliefs, they doubled down. The corrections backfired … When someone tries to correct you, tries to dilute your misconceptions, it backfires and strengthens them instead. Over time, the backfire effect helps make you less skeptical of those things which allow you to continue seeing your beliefs and attitudes as true and proper.”

If your views on a given political issue are in reality inconsistent with the facts of the matter, there usually isn’t any consequence, so that incorrect belief will not cost you anything. It’s often much more costly to question beliefs and properly investigate troubling information, so when the evidence presented doesn’t fit with what we know, our brain, in its effort to optimize the allocation of resources, makes the executive decision to rationalize it away.

Fortunately, it is still possible to change even our most foundational beliefs. It just takes being aware of the nature of our cognition.

The illusion of explanatory depth

On the other hand, we don’t really know the rationale behind our beliefs that well. The term “illusion of explanatory depth” was coined by Yale researchers in a psychological study in which participants were asked to judge their confidence in their knowledge across various domains and explain their theories for how complex phenomena work. The researchers found that people were largely overconfident about their understanding of devices and natural phenomena.

Though being overconfident in our knowledge may sound like a defect, like our sluggish process of updating belief-structures, this is, in fact, another feature of the brain. We have limited storage capacity and limited time and resources for investigating every single one of our beliefs, so, rightly, we defer to professionals, experts, or superficial mental models where we can, and the boundary between our own knowledge and outsourced knowledge is not really important in most cases.

Even if we’re incorrect about the details, we can get through the day without ever needing to question or reflect on, say, how a refrigerator works. It is more efficient to create a rudimentary mental model that provides the minimal amount of information necessary for successfully interacting with an object or idea than to know everything about everything.

A Harvard study exploring the illusion of explanatory depth in the political realm concluded that “people’s mistaken sense that they understand the causal processes underlying policies contributes to political polarization” and that people “have unjustified confidence in their understanding of policies.” When people were made to explain the implications their preferred policy would have, they realized how little they really knew and retreated to more moderate positions.

Getting into an honest discussion with someone can be a debiasing process. As you defend your beliefs, you are required to search your brain for reasons supporting that belief, and if your search comes up short, you can recalibrate your confidence in your beliefs and gain a better sense of what specifically you might want to investigate in more detail.

Arguing on the internet can be the most productive interaction you have for learning new perspectives and discovering the nature of your own beliefs, but it must be undertaken with an awareness of our primate brain’s inclinations and weaknesses. TheAntiChamber was developed in response to these sorts of concerns. A good way to sharpen your thinking is by scraping against positions you find disagreeable. Argue your beliefs against a stranger in an anonymous one-on-one chat.


About the author



TheAntiChamber was created to combat the effect of filter bubbles and echo chambers by providing a space for open-minded people to chat anonymously about critical issues pertaining to how we live our lives and how we organize our societies. Engage in live political chat.