Social media networks like Facebook and Instagram announced in March that they want to ban extreme-right organizations and leaders from spreading hatred against immigrants and ethnic minorities on their platforms. This is a change from a year ago, when Mark Zuckerberg, Facebook’s CEO, took a much more absolutist stance about free expression. “I’m Jewish, and there’s a set of people who deny that the Holocaust happened,” said Zuckerberg. “I find that deeply offensive. But at the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong.”
But the terrorist attack in New Zealand on March 15 has accelerated measures against white supremacist propaganda. The perpetrator of the massacre at the Christchurch mosques live-streamed the atrocities on Facebook, as if it were a computer game.
Some media outlets broadcast portions of this footage, unfiltered, which reignited the debate. The social networks themselves were forced to react quickly to prevent the spread of terror. Facebook removed the video an hour after the attack, which killed 50 people and injured 50 others, and announced at the time that it had removed 1.5 million copies of the video in the following 24 hours. YouTube also took stringent action to suppress the video in the hours after the massacre.
It was impossible to completely prevent the spread of the video, of course, and recently, a French Muslim advocacy group filed a lawsuit against Facebook and YouTube for allegedly not removing footage of the attack quickly enough. “Facebook must take their part of responsibility in this and must do everything to anticipate these live streams, as much as [they do with] hate messages and Islamophobia on their networks,” said Ahmet Ogras, president of the French Council of the Muslim Faith.
A similar line was taken by New Zealand’s Prime Minister, Jacinda Ardern, who declared that social media outlets are “the publisher, not just the postman”.
The spread of discrimination against the Muslim community, and, above all, the fear of further terrorist attacks committed by white supremacists, have led social networks to act quickly.
In the recent past, Facebook and Twitter fought a hard war against jihadist propaganda, particularly that spread by the Islamic State (ISIS). Attention is now turning to white nationalists, neo-Nazis, and white supremacists, who are adopting a number of the same tactics as the jihadists in the cyber realm. The populist discourse that appeals to hatred of those with different skin color and religion — mixed with fake news about political opponents — has become a growing issue in Europe and America.
In May 2018, a journalistic investigation by the online magazine Motherboard revealed that while Facebook banned “white supremacy” from its platform, it still allowed “white nationalism” and “white separatism.” The article caused much controversy among academics and civil rights leaders, who argued that there was no meaningful difference between these ideologies.
Within ten months, Facebook and Instagram had reversed course and decided to ban the three ideologies since there was no significant difference between them. “We didn’t originally apply the same rationale to expressions of white nationalism and white separatism because we were thinking about broader concepts of nationalism and separatism — things like American pride and Basque separatism, which are an important part of people’s identity”, wrote Facebook in a blog post titled “Standing Against Hate.” After three months of consultation with “members of civil society and academics”, the company found that “white nationalism and white separatism cannot be meaningfully separated from white supremacy and organized hate groups”.
People searching for these terms will now be directed to Life After Hate, a non-profit organization “founded by former violent extremists that provides crisis intervention, education, support groups and outreach”.
The results of this more restrictive policy began to be felt at the beginning of the year. Tommy Robinson, founder of the populist right-wing and anti-Islamic street movement the English Defence League (EDL), was banned from both social networks. According to Facebook and Instagram, Robinson was in violation of their hate speech code, a verdict Twitter had already rendered. Nowadays, Robinson can only reach large audiences on the Internet through YouTube, though his presence on Facebook has not been totally eradicated. There remains, for example, footage of Robinson debating a Muslim on the BBC.
The mission against online racial hatred gained new impetus in April with the announcement that a dozen British far-right individuals and organizations had been banned from Facebook. According to the BBC, this blacklist includes the British National Party and its ex-leader Nick Griffin; Britain First, its leader Paul Golding and former deputy leader Jayda Fransen; the EDL and founding member Paul Ray; Knights Templar International and its promoter Jim Dowson; the National Front and its leader Tony Martin; and Jack Renshaw, a neo-Nazi who plotted to murder a Labour MP.
In Portugal, several pages of the neo-Nazi group Nova Ordem Social (NOS) were also removed from Facebook under pressure from so-called “anti-fascist” movements.
The Portuguese site Polígrafo, which specializes in debunking fake news — in traditional media, as well as on the internet and in the blogosphere — revealed that the extreme-right has appealed this decision, complaining of persecution and violation of freedom of expression, a right guaranteed in the Portuguese Constitution. Currently, only the personal page of NOS leader Mário Machado remains on Facebook, and there is no sign of NOS activity, at least officially.
But NOS remains very active on another important social network: Twitter. The group has posted more than 300 tweets attacking immigrants, the democratic system, and Islam. “STOP ISLAM! Before is too late!” and “The Muslims’ happiness with Notre Dame fire. The sick bastards!!”, NOS wrote in April. These tweets are still online.
As the NOS case shows, much work remains to be done against the far-right on social media.
Still, the recent changes from Facebook seem sincere. Never have the CEOs of Facebook, Instagram, or Twitter been so aware of the danger posed by hate speech on their platforms — or of the political price for failing to act against it. There has never been so much public and official pressure on social media outlets to suppress far-right content as there is at present, and none of them wish to be accused of complicity in the next Christchurch massacre.