Social Media Platforms Failed to Take Down 90% Of Reported Islamophobic Content, Report Says

A new report has revealed that social media platforms including Facebook, Twitter, Instagram, TikTok and YouTube failed to act on approximately 90 per cent of reported Islamophobic content.

The report, published by the Centre for Countering Digital Hate (CCDH), found that of 530 posts reported by the non-profit organisation, the platforms acted on just 60: seven of 125 posts reported on Facebook, 32 of 227 on Instagram, 18 of 50 on TikTok, three of 105 on Twitter, and none of the 23 videos reported to YouTube.

The posts featured offensive and dehumanising content including racist caricatures, conspiracies and false claims. According to The Independent, these included Instagram posts that depicted Muslims as pigs and called for their expulsion from Europe, a comparison of Islam to a cancer that should be “treated with radiation” overlaid on a photo of an atomic blast, and tweets claiming Muslim migration was part of a plot to change the politics of other countries.

Some of these posts were also accompanied by vile hashtags such as #deathtoislam, #islamiscancer and #raghead, which the CCDH initially used to identify the collection of posts.


The news comes after social media platforms vowed to crack down on Islamophobic content, yet no significant improvement has been seen across the board, and Facebook’s record appears particularly poor.

The platform hosts multiple racist and Islamophobic communities, including groups such as “ISLAM means Terrorism”, “Stop Islamization of America” and “Boycott Halal Certification in Australia”, which have more than 310,000 members in total.

Researchers also found content on Facebook linked to the Christchurch terrorist, who was known for his 74-page anti-Islam manifesto: of the twenty posts featuring the mass murderer that were reported, only six were removed.


Back in 2019, the shooter used Facebook to livestream the killing of 51 Muslims at two mosques. Despite the platform reportedly removing 1.5 million reposts of the harrowing video, variations of the clip are still circulating. The footage has proved difficult to take down because it is constantly reposted, which is itself a cause for concern: people continue to feel the need to share graphic footage of Muslims being murdered.

Commenting on the report, Kemi Badenoch, the minister for communities and equalities, said: “We welcome this report, which shines an important light on the unacceptable abuse many Muslims receive online every day. Social media companies have to do more to take meaningful action against all forms of hatred and abuse their users experience online.”