
Isaac Kfir, Advisory Board, International Institute for Justice and the Rule of Law and Adjunct Professor, Charles Sturt University
Many Western countries are grappling with a sharp rise in support for extremist ideas and causes. It is possible that quarantines and lockdowns have spurred support for extremist views, because people spend more time online, not only for work and education, but also to look for ways to make sense of the world. In doing so, many increasingly reject official explanations in favor of more outlandish ones, such as assertions of a link between 5G and Covid-19.
One explanation for extremism is Cognitive Dissonance Theory (CDT), developed in the 1950s by the social psychologist Leon Festinger. In studying Dorothy Martin’s apocalyptic cult, The Seekers, Festinger sought to explain inconsistencies between actions and beliefs. He argued that humans go to great lengths to rationalize events to fit their worldview. That CDT is visible across the violent extremist spectrum is unsurprising: extremist positions are comforting, as they lay out clear demarcations between good and evil, and between the in-group and the out-group.
Social Media Homophily
The communication revolution encouraged cognitive dissonance. At its inception, it was intended to empower people and enable them to share information (raw, unrefined data) and knowledge (information organized in a specific way to elicit understanding). That is, the advent of the internet and social media was meant to expose users to differing opinions and ideas, making them more informed and educated, and thus encouraging the marketplace of ideas.
The proliferation of social media platforms is crucial to understanding the rise of extremist narratives, because these platforms allow users to post whatever information they want, leading them to form homophilic social connections. In other words, social media users gravitate towards communities, groups and narratives that reflect their worldview.
Social media homophily stems from the business model of many social media companies. The key to their growth is a proprietary algorithm that identifies preferences. The algorithm, designed by humans, identifies patterns of behavior. Its stated purpose was to make the user’s life easier, as it tracked which sites one visited, what one posted, and what one liked or hated.
By analyzing these activities, the platforms, through their algorithms, provided individuals with information that had been determined to be of interest to them. This has given rise to algorithmic culture: the gradual abandonment of culture’s publicness and the emergence of a strange new breed of elite culture purporting to be its opposite. Simply put, the sorting, classifying and hierarchizing of people, places, objects, and ideas have been transferred to machines built to identify patterns in behavior.
These machines do not consider the value or even the veracity of the information. Consequently, because the algorithm fails to distinguish between fact-checked information and misinformation, it facilitates the emergence of echo chambers that serve as feedback loops and filter bubbles, encouraging members to see the world in a specific, often non-empirical manner that feeds off negative emotions.
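To make this dynamic concrete, the following deliberately simplified sketch (written in Python) shows how a recommender that ranks content purely on past engagement, while ignoring veracity, quickly narrows a feed into a filter bubble. It is a toy illustration under stated assumptions, not any platform’s actual, proprietary algorithm; the data, names and scoring rule are all hypothetical.

```python
# Toy illustration of an engagement-driven recommender and its feedback
# loop. Not any platform's real algorithm: the posts, topics, scoring
# rule and threshold are invented for demonstration purposes only.

from collections import Counter

# Each post carries a topic tag and a fact-checked flag. Note that the
# ranking below never consults the 'fact_checked' field.
POSTS = [
    {"id": 1, "topic": "5g_conspiracy", "fact_checked": False},
    {"id": 2, "topic": "public_health", "fact_checked": True},
    {"id": 3, "topic": "5g_conspiracy", "fact_checked": False},
    {"id": 4, "topic": "local_news", "fact_checked": True},
]

def recommend(engagement_history: Counter, posts: list, k: int = 2) -> list:
    """Rank posts purely by how often the user engaged with each topic.

    Veracity plays no role in the score, so whichever topic the user
    already clicks on comes to dominate the feed: a filter bubble.
    """
    scored = sorted(posts, key=lambda p: engagement_history[p["topic"]],
                    reverse=True)
    return scored[:k]

# Simulate the feedback loop: the user clicks what is shown, the
# recommender learns from the clicks, and the feed narrows.
history = Counter({"5g_conspiracy": 1})  # a single initial click
for round_no in range(3):
    feed = recommend(history, POSTS)
    for post in feed:  # assume the user engages with everything shown
        history[post["topic"]] += 1
    print(f"round {round_no}: feed topics = {[p['topic'] for p in feed]}")
# One click is enough: the feed immediately and persistently consists
# of '5g_conspiracy' content, and each round reinforces the bias.
```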
An Inherent Need for ‘Truth’
To understand the link between CDT and extremism, we must acknowledge that people have an inherent need to find explanations, or ‘truths’, that make sense of the world. This need becomes even more urgent in times of crisis. In searching for their ‘truths’, individuals begin with a narrative based on a sanitized, idealized version of history that feeds their dissonance. Adherents believe that ‘their’ society or ‘their’ world has been hijacked by sinister entities.
In adopting this reality, individuals are left with a binary choice: take the ‘blue pill’ and remain ignorant of the ‘truth’, or take the ‘red pill’ and acquire it. With the latter option comes the need to ‘fight’ the system, which calls for ‘red-pilling the normies’ or for violence aimed at toppling the system, so as to emancipate those ignorant of the ‘truth’.
Edgar Maddison Welch provides a good example. On December 4, 2016, he walked into Comet Ping Pong, a Washington, D.C. pizzeria, with an AR-15 rifle, determined to investigate claims that the restaurant was part of a plot involving child sex trafficking, the occult and Hillary Clinton’s presidential bid. Welch believed he was doing a public service because the authorities had been corrupted by a secret cabal, and someone needed to stop the wrongdoing.
A crucial element in understanding the appeal of the quest for ‘truth’ is the social endorsement heuristic. The concept reflects the fact that the internet serves as a trove of information, but humans cannot critically process all of it. Therefore, users rely on their social connections when determining the credibility, or the ‘truth’, of a piece of information that often lacks empirical veracity. Simply put, with traditional news formats, professional training, fact-checkers and the threat of legal sanctions ensure that what is disseminated has been assessed. With social media, no such checks or gatekeepers exist, allowing the wildest ideas to gather momentum simply because enough people share them. Concomitantly, individuals are already predisposed to accept information that confirms their beliefs (confirmation bias), as opposed to information that challenges them.
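The heuristic can be illustrated with another toy sketch: a credibility judgment driven solely by how many of one’s connections shared a claim, with its actual accuracy never consulted. The data and the endorsement threshold below are hypothetical, chosen only to make the contrast visible.

```python
# Toy illustration of the social endorsement heuristic. The claims,
# share counts and threshold are invented for demonstration purposes.

CLAIMS = [
    {"text": "5G towers spread Covid-19",
     "accurate": False, "shares_by_friends": 42},
    {"text": "Vaccines reduce severe illness",
     "accurate": True, "shares_by_friends": 3},
]

def perceived_credibility(claim: dict, endorsement_threshold: int = 10) -> bool:
    """Judge a claim 'credible' purely by social endorsement.

    The 'accurate' field is deliberately ignored, mirroring how the
    heuristic substitutes popularity for verification.
    """
    return claim["shares_by_friends"] >= endorsement_threshold

for claim in CLAIMS:
    print(f"{claim['text']!r}: believed={perceived_credibility(claim)}, "
          f"actually accurate={claim['accurate']}")
# Result: the false claim is 'believed' (42 shares from friends), while
# the accurate one is dismissed, since only 3 connections endorsed it.
```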
Conclusion
In sum, over the last decade or so, enormous effort has gone into understanding and explaining the push and pull factors that drive radicalization and violent extremism. There has been great progress in challenging misinformation, as seen with prebunking and attitudinal inoculation. There have also been efforts to provide people with the analytical tools to engage critically with any material. These are all invaluable tools to counter misinformation and the radicalization that could lead one towards violent extremism. And yet, we must also recognize that those who embrace violent extremism may also experience cognitive dissonance.
Social media companies have taken some measures: introducing gatekeepers and new terms of service, using AI, posting warnings, and removing or blocking groups and posts. These measures have made some difference, in that some users now recognize that not everything they read is true. But they have also prompted people to do ‘their own research’, which, instead of offering a new, critical assessment, simply sends them further down the rabbit hole.
Resolving or limiting cognitive dissonance calls for more drastic action, because the process involves changing one’s beliefs, adding new beliefs, or reducing the importance of existing beliefs. Such changes are not easy, as these ideas and beliefs form not only the individual’s identity, but also that of their social group and the social world they inhabit.
European Eye on Radicalization aims to publish a diversity of perspectives and as such does not endorse the opinions expressed by contributors. The views expressed in this article represent the author alone.