Facebook removed or added warnings to 1.9 million pieces of content related to the terror groups Islamic State and al-Qaeda in the first quarter of 2018, close to double the number it acted on the previous quarter, it said in a statement released overnight.
The figure likely understates the total volume of content removed: Facebook also takes down profiles, pages and groups that violate its community standards, and it does not go back through that content to label it as terrorism-related after removal, it said.
The social media platform has expanded its counterterrorism team to 200 people from 150 last June and built specialized techniques to find and remove old content, according to its statement. In 99 percent of cases, the content taken down was detected by internal reviewers rather than reported by users.
“We're under no illusion that the job is done or that the progress we have made is enough,” the statement said. “Terrorist groups are always trying to circumvent our systems, so we must constantly improve.”
YouTube also announced overnight that it had removed over 8 million “violative” videos between October and December 2017 as part of an effort to better enforce its community guidelines.
Most videos were spam or included adult content, and a majority — 6.7 million — “were first flagged for review by machines” rather than humans, using technology introduced in June 2017, it said in an official blog post.
Among those flagged by machines, 76 percent were removed before they were viewed. More than half the videos removed for violent extremism had “fewer than 10 views,” it said.