Facebook says it’s gotten a lot better at removing material about ISIS, al-Qaeda and similar groups


Facebook announced it removed more than 12.4 million pieces of content promoting or endorsing terrorists and terrorist organizations such as ISIS and al-Qaeda between April and September.


Facebook took down more than 12 million pieces of terrorist content on its social network between April and September, the company disclosed on Thursday. Facebook defines terrorist content as posts that praise, endorse or represent ISIS, al-Qaeda and their affiliate groups.

The removal of terrorist content is part of an ongoing effort by Facebook to rid its service of harmful content, a category that also includes misinformation, propaganda and spam.

Facebook said, “We measure how many pieces of content (such as posts, images, videos or comments) we took action on because they went against our standards for terrorist propaganda, specifically related to ISIS, al-Qaeda and their affiliates.”

The company said it removed 9.4 million pieces of terrorist content during the second quarter and another 3 million posts during the third quarter. By comparison, the company in May announced that it removed 1.9 million posts during the first quarter of 2018.

“Terrorists are always looking to circumvent our detection and we need to counter such attacks with improvements in technology, training, and process,” the company said in a blog post. “These technologies improve and get better over time, but during their initial implementation such improvements may not function as quickly as they will at maturity.”

Much of the removed material was old. But Facebook said it also took down 2.2 million newly uploaded terrorist posts in the second quarter and 2.3 million in the third quarter, up from 1.2 million in the first quarter.

Facebook explained that it has focused its efforts on removing terrorist content before it is viewed by a wide audience. With that focus in mind, Facebook has reduced the median time between when a user first reports a terrorist post to when Facebook takes it down. That median time was 43 hours in the first quarter, but fell to 22 hours in the second quarter and 18 hours in the third quarter.

The company said it has relied on machine learning technology to detect terrorist content. In most cases, that terrorist content is reviewed and removed by trained humans, but the machine learning technology can remove content on its own if its “confidence level is high enough that its ‘decision’ indicates it will be more accurate than our human reviewers,” the company said.


Posted by Contributor
