This year, Facebook removed 14 million terrorist-related posts: Details here
In 2018, Facebook flagged and removed as many as 14 million terror-related posts from its platform. The content was posted by groups like ISIS and al-Qaeda, and more than half of it consisted of propaganda pieces that had been posted before 2018. Most of these takedowns, Facebook said, were driven by its automated detection and removal tools. Here's more.
Facebook's tools proactively remove terror-related posts
Facebook's tools appear to be working well in terms of taking down propaganda posts. For newer content (posted in 2018), the tools removed 1.2 million posts in Q1, 2.2 million in Q2, and 2.3 million in Q3. Takedowns from community reports also increased (16,000 in Q3), but those were far fewer than the 'proactive' removals from these tools, the company said.
Takedowns of older content
As for older content, which surfaced on the platform well before 2018, the social network removed 640,000 pieces in Q1. The biggest change came in the next quarter, when Facebook's new enforcement systems detected and removed as many as 7.1 million terror-related pieces. With much of that backlog cleared, the number of harmful pieces removed fell to 710,000 in Q3.
Facebook's detection mechanisms use Machine Learning
Facebook uses machine learning-based tools to analyze posts and assign scores indicating how likely they are to violate its counter-terrorism policies. If the tools' confidence in a violation is high, the post is removed automatically; otherwise, Facebook's moderators review the content, prioritized according to the scores.
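The workflow described above can be sketched as a simple threshold-based triage. This is a minimal illustration only: the threshold value, scoring function, and queue ordering are assumptions for the sketch, not Facebook's actual implementation.

```python
# Hypothetical sketch of threshold-based moderation triage.
# AUTO_REMOVE_THRESHOLD and score_fn are illustrative assumptions.
AUTO_REMOVE_THRESHOLD = 0.95  # assumed confidence cutoff for automatic removal

def triage(posts, score_fn):
    """Split posts into auto-removals and a human-review queue ordered by score."""
    auto_removed, review_queue = [], []
    for post in posts:
        score = score_fn(post)  # model's confidence that the post violates policy
        if score >= AUTO_REMOVE_THRESHOLD:
            auto_removed.append(post)
        else:
            review_queue.append((score, post))
    # Moderators see the highest-scoring (most likely violating) posts first.
    review_queue.sort(key=lambda pair: pair[0], reverse=True)
    return auto_removed, [post for _, post in review_queue]

# Example with stand-in scores:
posts = ["post_a", "post_b", "post_c"]
scores = {"post_a": 0.99, "post_b": 0.40, "post_c": 0.80}
removed, queue = triage(posts, scores.get)
# post_a is auto-removed; post_c and post_b wait for review, in that order.
```

The key design point is that automation handles only high-confidence cases, while ambiguous posts are ranked so human reviewers spend their time on the likeliest violations.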
How long did these posts stay on the platform?
In Q1, new posts stayed on the platform for less than a minute before being removed. The rollout of the new systems (which took time to mature) pushed that figure up to 14 hours in Q2, before it came back down to two minutes. Old posts, on the other hand, stayed up for hundreds of days, but Facebook stresses that they may have gone unviewed during that time.