Facebook says it’s gotten a lot better at removing material about ISIS, al-Qaeda and similar groups


Facebook announced it removed more than 12.4 million pieces of content promoting or endorsing terrorists and terrorist organizations such as ISIS and al-Qaeda between April and September.


Facebook took down more than 12 million pieces of terrorist content on its social network between April and September, the company disclosed on Thursday. Facebook defines terrorist content as posts that praise, endorse or represent ISIS, al-Qaeda and their affiliate groups.

The removals are part of an ongoing effort by Facebook to rid its service of harmful content, a category that also includes misinformation, propaganda and spam.

Facebook said, “We measure how many pieces of content (such as posts, images, videos or comments) we took action on because they went against our standards for terrorist propaganda, specifically related to ISIS, al-Qaeda and their affiliates.”

The company said it removed 9.4 million pieces of terrorist content during the second quarter and another 3 million posts during the third quarter. By comparison, the company in May announced that it removed 1.9 million posts during the first quarter of 2018.

“Terrorists are always looking to circumvent our detection and we need to counter such attacks with improvements in technology, training, and process,” the company said in a blog post. “These technologies improve and get better over time, but during their initial implementation such improvements may not function as quickly as they will at maturity.”

Much of the removed material was old. But Facebook said it also took down 2.2 million newly posted pieces of terrorist content in the second quarter and 2.3 million in the third, up from 1.2 million in the first quarter.

Facebook explained that it has focused its efforts on removing terrorist content before it is viewed by a wide audience. With that focus in mind, Facebook has reduced the median time between when a user first reports a terrorist post to when Facebook takes it down. That median time was 43 hours in the first quarter, but fell to 22 hours in the second quarter and 18 hours in the third quarter.

The company said it has relied on machine learning technology to detect terrorist content. In most cases, that terrorist content is reviewed and removed by trained humans, but the machine learning technology can remove content on its own if its “confidence level is high enough that its ‘decision’ indicates it will be more accurate than our human reviewers,” the company said.
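The routing rule described above — automatic removal when the model's confidence clears a threshold, human review otherwise — can be sketched roughly as follows. This is a hypothetical illustration, not Facebook's actual system; the threshold values, function name and score scale are all assumptions.

```python
# Hypothetical sketch of a confidence-threshold routing rule for flagged posts.
# Not Facebook's actual system: the thresholds and names are illustrative only.

AUTO_REMOVE_THRESHOLD = 0.99  # assumed cutoff where the model is expected to
                              # outperform human reviewers
REVIEW_THRESHOLD = 0.50       # assumed cutoff for sending a post to reviewers

def route_post(terror_score: float) -> str:
    """Decide the action for a post given a classifier score in [0, 1]."""
    if terror_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # confidence high enough to act without a human
    if terror_score >= REVIEW_THRESHOLD:
        return "human_review"  # flag for trained human reviewers
    return "no_action"         # below suspicion threshold, leave it up

print(route_post(0.995))  # auto_remove
print(route_post(0.70))   # human_review
print(route_post(0.10))   # no_action
```

The key design choice such a system makes is where to set the automatic-removal threshold: too low and the model removes legitimate content without oversight, too high and nearly everything falls back to slower human review.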


Posted by Contributor
