European Eye on Radicalization
Algorithms and the Rise of Extremist Online Homophilic Communities

24 November 2021
in Contributors’ Opinions, Opinion Pieces

Isaac Kfir, Advisory Board, International Institute for Justice and the Rule of Law and Adjunct Professor, Charles Sturt University

 

Social media and technology have dissolved the age-old constraints of geography and access that once limited people's ability to form transnational homophilic communities. Nowadays, with the click of a button, individuals can find solidarity with people living on the other side of the world as they undertake their own 'research to find truths'. In doing so, however, they may also end up down the rabbit hole.

Several components facilitate the development, growth, and sustainability of online homophilic communities, and algorithms are key to the process. These sets of instructions push content to users; their stated purpose is to make the user's life easier by tracking online engagement and offering suggestions, often through search engines. By contrast, before 2004, when the World Wide Web was in its infancy, content creators were few, content was static, and searching for information was laborious and challenging because results were presented without any ranking.

Understanding the algorithm-violent extremism nexus requires an appreciation of the power of search engines. Professor James Grimmelmann correctly points out that search engines are treated as trusted advisers that perform searches in an expert and objective manner, producing a list of recommended websites the user should visit (and, implicitly, ones the user should ignore).

The Problem with Algorithms

The reality, however, is very different. Search engines are powered by algorithms, fed by web crawlers that trawl the World Wide Web indexing pages against the words the user placed in their search query. PageRank, the algorithm that powered Google's rise to dominance, is a case in point. PageRank scours billions of indexed pages to answer the user's request for information, then determines how to present and rank what it finds chiefly by looking at how many other pages link to a given page, and how important those linking pages are themselves: the more inbound links, the higher the page appears. The algorithm has no interest in the veracity of a webpage, only in whether it meets the general search parameters and how popular it is, because for algorithms it is quantity, as opposed to quality, that matters.
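The link-based ranking logic described above can be sketched in a few lines of Python. This is a minimal, illustrative power-iteration version of PageRank, not Google's production system; the tiny link graph is invented, and the 0.85 damping factor is the value given in the original PageRank paper.

```python
# Toy PageRank: rank pages by the number and quality of inbound links,
# with no regard for the veracity of their content.
DAMPING = 0.85  # damping factor from the original PageRank paper

def pagerank(links, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with uniform rank
    for _ in range(iterations):
        # Every page keeps a small "teleport" share of rank...
        new = {p: (1 - DAMPING) / n for p in pages}
        # ...and passes the rest along its outbound links.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank

# Hypothetical graph: three pages all link to "popular.example".
graph = {
    "popular.example": ["a.example"],
    "a.example": ["popular.example"],
    "b.example": ["popular.example"],
    "c.example": ["popular.example"],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "popular.example" ranks highest
```

Note that nothing in the computation inspects a page's content: a heavily linked page rises to the top whether its information is accurate or not.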

YouTube, which is owned by Google, provides an excellent case study of the power of algorithms and how they can push one down the rabbit hole. What drives YouTube's success is its recommender algorithm: developers discovered that by predicting what users would want to watch next, they could extend time spent on the platform. In 2018, Neal Mohan, YouTube's Chief Product Officer, acknowledged that more than 70% of the videos users watch are served to them by the platform's recommender algorithm.
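A co-occurrence recommender of the general kind described above can be sketched as follows. This is emphatically not YouTube's actual system, whose details are proprietary; the session data and video names are invented to illustrate the "rabbit hole" dynamic.

```python
# Sketch of a co-watch recommender: suggest videos that frequently
# appear in the same viewing sessions as what the user has seen.
from collections import Counter
from itertools import permutations

def build_cooccurrence(sessions):
    """Count, for each video, how often every other video was co-watched."""
    co = {}
    for session in sessions:
        for a, b in permutations(set(session), 2):
            co.setdefault(a, Counter())[b] += 1
    return co

def recommend(history, co, k=1):
    scores = Counter()
    for video in history:
        scores.update(co.get(video, {}))
    for seen in history:
        scores.pop(seen, None)  # never re-recommend watched videos
    return [v for v, _ in scores.most_common(k)]

# Invented sessions: fringe content co-occurs with mainstream commentary.
sessions = [
    ["news_clip", "commentary_A", "commentary_B"],
    ["commentary_A", "commentary_B", "fringe_video"],
    ["commentary_B", "fringe_video"],
]
co = build_cooccurrence(sessions)
recs = recommend(["commentary_A", "commentary_B"], co)
print(recs)  # the fringe video scores highest
```

The recommender optimizes only for what co-occurs with past viewing, so a user who watched mainstream commentary is steered toward the fringe video other viewers paired with it.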

Selective Exposure

Another important factor facilitating the rise of homophilic communities is selective exposure to information. Algorithms are central to this process: through features such as Groups or like/dislike buttons, platforms like Facebook track and record users' interests in order to serve each user more of the content they want to consume.

Secondly, by recording a user's searches and interests, algorithms filter out information they determine the individual would not be interested in. The goal is to feed the user information that is pertinent or useful to them, on the assumption that this will keep the user on the platform. Concomitantly, because the platform provides individuals with the information they want, they continue to use and trust it. Tellingly, an internal Facebook study found that 64 percent of those who joined an extremist group did so only because the company's algorithm recommended it to them.
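The feedback loop described in the two paragraphs above, recording engagement and then filtering the feed to match it, can be sketched as follows; the topic tags and story names are hypothetical.

```python
# Sketch of selective exposure: engagement updates an interest profile,
# and the feed is then filtered to topics the profile scores positively.

def update_profile(profile, item_topics, liked):
    """Raise or lower the score of each topic the user engaged with."""
    for topic in item_topics:
        profile[topic] = profile.get(topic, 0) + (1 if liked else -1)
    return profile

def filter_feed(feed, profile):
    # Only items whose topics the user engaged with positively survive;
    # everything else is silently dropped and the user never sees it.
    return [item for item, topics in feed
            if any(profile.get(t, 0) > 0 for t in topics)]

profile = {}
update_profile(profile, ["politics_X"], liked=True)
update_profile(profile, ["politics_Y"], liked=False)

feed = [("story_1", ["politics_X"]),
        ("story_2", ["politics_Y"]),
        ("story_3", ["sports"])]
filtered = filter_feed(feed, profile)
print(filtered)  # only the story matching prior likes remains
```

Each pass through the loop narrows the feed further, which is precisely how a user ends up seeing only one kind of content.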

Thirdly, it is important to remember that online engagement is designed to be all-encompassing. It looks to nourish one's pursuit of happiness, which social psychology suggests is achievable through deep, sustained concentration. Platforms and webpages aim to fully immerse the user, knowing that by engaging all of one's senses they can produce an absorbing, almost transcendent state of happiness that the user will want to replicate.

Automated Content Moderation

Due to the vast amount of content, whether video, image, text, or audio, platforms have had to turn to automated content moderation. It is worth noting that, in the first six months of 2021, the world generated, on average each day, approximately 500 million tweets, over 290 billion emails, four million gigabytes of Facebook data, and 65 billion WhatsApp messages, while every day more than a billion hours of content are watched on YouTube. Altogether this amounts to around 59 zettabytes of data; by 2025, we are expected to generate 175 zettabytes.

With so much data being created and uploaded, it is simply impossible for human moderators to assess all of this content and determine whether it violates terms of use and community standards, which is why platforms rely on automated content moderation operating through artificial intelligence, machine learning, and the like. The purpose of these tools is to organize, curate, filter, classify and, if necessary, remove or block content.

There are many obvious issues with automated content moderation, ranging from the technical to the political. A greater concern, however, is how violent extremists adapt to algorithmic moderation. One 49-minute Islamic State propaganda video opened with a 30-second introduction taken from the France 24 news channel in the hope of avoiding detection and removal. Supporters of the Islamic State have likewise adapted to Telegram's efforts to detect extremist materials: because Telegram's moderation has focused on specific file formats, that is, on the content itself, rather than on the full range of services the platform offers, content-moderation algorithms inspect one type of file for malicious content while the many other services Telegram provides go unexamined.
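The France 24 example shows why format-level matching is brittle. A common moderation technique is to fingerprint known bad files by cryptographic hash; the sketch below, with placeholder byte strings standing in for video files, shows how prepending a benign segment changes the hash and defeats the filter.

```python
# Hash-based content matching and a trivial evasion of it.
import hashlib

def fingerprint(data: bytes) -> str:
    """Fingerprint a file by its SHA-256 digest."""
    return hashlib.sha256(data).hexdigest()

# Database of fingerprints of known extremist files (placeholder bytes).
known_bad = {fingerprint(b"<propaganda video bytes>")}

def is_blocked(data: bytes) -> bool:
    return fingerprint(data) in known_bad

original = b"<propaganda video bytes>"
# Prepend 30 seconds of benign news footage, as in the example above:
evasive = b"<30s news channel intro>" + original

print(is_blocked(original))  # True: exact hash match
print(is_blocked(evasive))   # False: one prepended segment changes the hash
```

Real systems use perceptual hashes that tolerate some alteration, but the underlying point stands: a filter keyed to known content in one format can be sidestepped by repackaging that content.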

Conclusion

In summary, responses to online violent extremism have been reactive and disjointed: they deal with what has already taken place rather than assessing what is coming. So, when a terrorist live-streams their attack, one response is for Facebook and the London police 'to create technology that could help identify firearm attacks in their early stages and potentially help police across the world in responding.' Another is to formulate and adopt the Christchurch Call, or to change the recommender algorithm, as YouTube did in 2019, in the hope of discouraging users from watching 'borderline' content, meaning content that may be deemed harmful or misleading but is not violent.

However, these initiatives fail to discourage individuals from going down the rabbit hole. What is desperately needed is an understanding of the power of algorithms, the role they play in search engines, and how search results can produce a skewed vision of the world by serving only one specific type of information. Unless we address the power that algorithms have over searches, and the layperson's limited understanding of that power and of the way information is presented, our ability to effectively hinder the radicalization process will be limited at best.

European Eye on Radicalization aims to publish a diversity of perspectives and as such does not endorse the opinions expressed by contributors. The views expressed in this article represent the author alone.


© 2018 EER - Copyright © European Eye on Radicalization.

