European Eye on Radicalization

Algorithms and the Rise of Extremist Online Homophilic Communities

24 November 2021
in Contributors’ Opinions, Opinion Pieces
Article author: Isaac Kfir

Isaac Kfir, Advisory Board, International Institute for Justice and the Rule of Law and Adjunct Professor, Charles Sturt University

 

Social media and technology have dismantled age-old constraints on human connection: geography and access once limited people's ability to form transnational homophilic communities. Nowadays, with the click of a button, individuals can find solidarity with people living on the other side of the world as they undertake their 'own research to find truths'. In doing so, however, they may also end up down the rabbit hole.

Several components facilitate the development, growth, and sustainability of online homophilic communities, and algorithms are key to the process. These sets of instructions push content to users; their stated purpose is to make the user's life easier by tracking online engagement and offering suggestions, most visibly through search engines. Before 2004, when the World Wide Web was still in its infancy, content creators were few, content was static, and searching for information was laborious and challenging because results were presented without any ranking.

Understanding the algorithm-violent extremism nexus requires an appreciation of the power of search engines. Professor James Grimmelmann correctly points out that search engines are treated as trusted advisers: they appear to perform a search in an expert and objective manner, producing a list of recommended websites that the user should visit (and, by omission, ones that the user should ignore).

The Problem with Algorithms

The reality, however, is very different. Search engines rely on web crawlers to index the web and on ranking algorithms to order the results. PageRank, the algorithm that powered Google's rise to dominance, is a case in point. PageRank scores billions of indexed pages to answer the user's request for information. It determines how to rank the pages it finds by treating each hyperlink as a vote: the more pages that link to a given page, and the more highly ranked those linking pages are themselves, the higher the page appears in the results. The program has no interest in the veracity of a webpage, only in whether it meets the search parameters and how popular it is, because for such algorithms it is quantity, as opposed to quality, that matters.
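The core idea can be illustrated with a toy implementation (a simplified sketch for exposition, not Google's actual code; the link graph, damping factor, and iteration count are all illustrative):

```python
# Toy PageRank by power iteration: a page's score is built from the scores
# of the pages linking to it, shared across each linker's outbound links.
# The popularity of linkers, not the truth of the page, drives the ranking.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if not outbound:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each link passes on an equal share of the page's rank.
                for target in outbound:
                    new_rank[target] += damping * rank[page] / len(outbound)
        rank = new_rank
    return rank

# 'hub' is linked to by both other pages, so it accumulates the highest
# score, regardless of what any of the pages actually say.
toy_web = {"a": ["hub"], "b": ["hub"], "hub": ["a"]}
scores = pagerank(toy_web)
```

However many pages are added, the mechanism is the same: links, not accuracy, decide what the user sees first.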

YouTube, which is owned by Google, provides an excellent case study of the power of algorithms and how they can push one down the rabbit hole. What drives YouTube's success is its recommender algorithm: developers discovered that by predicting what users would want to watch, they could extend the time users spend on the platform. In 2018, Neal Mohan, YouTube's Chief Product Officer, admitted that more than 70% of the videos users watch are served to them by the platform's 'recommender' algorithm.

Selective Exposure

Another important factor in facilitating the rise of homophilic communities is selective exposure to information, and algorithms are central to this process in three ways. First, through features such as 'Groups' or 'like/dislike' buttons, platforms like Facebook track and record users' interests so as to serve them more of the content they are likely to want to consume.

Secondly, by recording a user's searches and interests, algorithms filter out information they determine the individual would not be interested in. The goal is to feed the user information that is pertinent or useful to them, on the assumption that this will keep them on the platform; and because the platform provides the information they want, users continue to use and trust it. Tellingly, an internal Facebook study found that 64 percent of those who joined an extremist group did so only because the company's algorithm recommended it to them.
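The feedback loop described above can be sketched in a few lines (a purely hypothetical illustration; the topic labels, profile structure, and scoring rule are invented for exposition and do not represent any platform's real system):

```python
# Hypothetical sketch of engagement-based filtering: every click strengthens
# the recorded interest profile, and the feed is re-ranked by how closely
# each item matches that profile, so each click narrows what comes next.
from collections import Counter

def record_click(profile, item_topics):
    # Clicking an item strengthens the recorded interest in its topics.
    profile.update(item_topics)

def rank_feed(profile, candidates):
    # Score candidates by overlap with recorded interests; items the
    # algorithm judges uninteresting sink to the bottom and go unseen.
    return sorted(candidates,
                  key=lambda item: sum(profile[t] for t in item["topics"]),
                  reverse=True)

profile = Counter()
record_click(profile, ["conspiracy"])
record_click(profile, ["conspiracy"])
record_click(profile, ["news"])

feed = rank_feed(profile, [
    {"id": "mainstream", "topics": ["news"]},
    {"id": "fringe", "topics": ["conspiracy"]},
])
# After two clicks on conspiracy-tagged items, the fringe item ranks first.
```

Nothing in the loop weighs accuracy or harm; engagement alone steers the ranking, which is precisely how selective exposure compounds.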

Thirdly, it is important to remember that online engagement is designed to be all-encompassing. It nourishes one's pursuit of happiness, which social psychology suggests is attainable through hyper-concentration. Platforms and webpages seek to immerse the user fully, knowing that by engaging all of the senses they can induce a feeling of transcendence, a happiness the user will wish to replicate.

Automated Content Moderation

Due to the vast amount of content, whether video, image, text or audio, platforms have had to turn to automated content moderation. It is worth noting that, in the first six months of 2021, the world's population generated, on average, approximately 500 million tweets, over 290 billion emails, 4 million gigabytes of Facebook data, and 65 billion WhatsApp messages each day, while every day we watch more than a billion hours of content on YouTube. This amounts to around 59 zettabytes of data a year; by 2025, we are expected to generate 175 zettabytes.

With so much data being created and uploaded, it is simply impossible for human reviewers to assess all the content and determine whether it violates terms of use and community standards, which is why platforms rely on automated content moderation powered by artificial intelligence, machine learning, and similar technologies. The purpose of these tools is to organize, curate, filter, classify and, if necessary, remove or block content.

There are many obvious issues with automated content moderation, ranging from the technical to the political. A greater concern, however, is the way violent extremists adapt to algorithmic moderation. One 49-minute Islamic State propaganda video, for example, opened with a 30-second introduction taken from the France 24 news channel in the hope of avoiding detection and removal. Supporters of the Islamic State have likewise adapted to Telegram's efforts to detect extremist material: because Telegram focuses on specific file formats, that is, on the content itself, rather than on the range of services the platform offers, its content-moderation algorithms inspect one type of file for malicious material while the platform's many other services go unexamined.
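Why a short prepended clip can defeat this kind of moderation is easy to see from the simplest version of the technique: matching uploads against hashes of known material. The sketch below is an illustrative toy, not any platform's real pipeline; production systems use shared industry hash databases and perceptual hashes that tolerate some edits, but the brittleness is the same in kind:

```python
# Toy hash-matching moderation: an upload is blocked only if its exact
# hash is already in a database of known extremist material. Any change
# to the bytes (e.g. prepending a 30-second news intro) yields a new
# hash, so the altered copy sails past the filter.
import hashlib

# Illustrative placeholder for a database of hashes of known material.
KNOWN_EXTREMIST_HASHES = {
    hashlib.sha256(b"known propaganda video bytes").hexdigest(),
}

def moderate(upload: bytes) -> str:
    digest = hashlib.sha256(upload).hexdigest()
    if digest in KNOWN_EXTREMIST_HASHES:
        return "block"   # exact match with known material
    return "allow"       # unknown bytes pass to other (weaker) checks

verbatim = moderate(b"known propaganda video bytes")
altered = moderate(b"news intro" + b"known propaganda video bytes")
```

The verbatim copy is blocked while the trivially altered one is allowed, which is why extremists invest so little effort to evade so much automated moderation.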

Conclusion

In summary, responses to online violent extremism have been reactive and disjointed: they deal with what has already taken place rather than assessing what is coming. So, when a terrorist live-streams an attack, one response is to get Facebook and the London Police 'to create technology that could help identify firearm attacks in their early stages and potentially help police across the world in responding.' Another is to formulate and adopt the Christchurch Call, or to change the recommender algorithm, as YouTube did in 2019, in the hope of discouraging users from watching 'borderline' content, meaning content that may be deemed harmful or misleading but is not violent.

However, these initiatives fail to discourage individuals from going down the rabbit hole. What is desperately needed is an understanding of the power of algorithms, the role they play in search engines, and how responses to searches can produce a skewed vision of the world because the results supply only one specific type of information. Unless we address the power algorithms have over searches, and the layperson's limited understanding of that power and of the way information is presented, our ability to hinder the radicalization process effectively will be limited at best.

European Eye on Radicalization aims to publish a diversity of perspectives and as such does not endorse the opinions expressed by contributors. The views expressed in this article represent the author alone.



© 2018 European Eye on Radicalization.
