Facebook releases its first transparency report on terrorism

Bloomberg via Getty Images

With a user base of nearly 2 billion people, Facebook's global influence is practically unrivaled, making it a potent platform for extremist ideologies. And while the company has long sought to limit the reach of such hateful speech, it hasn't always been forthcoming about how it does so. That changed Thursday, when Facebook released the first transparency report in its Hard Questions series: "How We Counter Terrorism."

"Our stance is simple: There's no place on Facebook for terrorism," Monika Bickert, Director of Global Policy Management, and Brian Fishman, Counterterrorism Policy Manager, wrote. "We believe technology, and Facebook, can be part of the solution."

To that end, the company has leveraged a mix of artificial intelligence systems and human expertise to combat extremist threats posted on its site. AI is a fairly new addition to Facebook's arsenal, but it's already being used for automated image matching, which checks new uploads against known extremist images and blocks them before they're posted. The company is also reportedly training a neural network to recognize and remove written text that praises or supports terrorist organizations like ISIS.
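
In rough outline, that upload-time check works like a blocklist lookup: compute a fingerprint of the incoming image and reject it if that fingerprint is already on file. The sketch below is purely illustrative -- Facebook hasn't published its matching code, and production systems rely on perceptual hashes that survive resizing and re-encoding rather than the exact SHA-256 hash used here:

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Exact-match fingerprint; real systems use perceptual hashes instead."""
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical blocklist seeded from previously removed images.
KNOWN_EXTREMIST_HASHES = {fingerprint(b"previously-removed-image-bytes")}


def allow_upload(image_bytes: bytes) -> bool:
    """Block the upload if it matches a known extremist image."""
    return fingerprint(image_bytes) not in KNOWN_EXTREMIST_HASHES


print(allow_upload(b"previously-removed-image-bytes"))  # False: blocked
print(allow_upload(b"an-unrelated-holiday-photo"))      # True: allowed
```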

Facebook's AI can also fan out through related "clusters" of posts, pages and profiles to find other offending material, and it can recognize when previously banned users attempt to create new accounts. The company hopes to extend these capabilities to its other apps, like Instagram, in the future.
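
The "cluster" idea boils down to graph traversal: start from one confirmed piece of terrorist material and walk outward through related pages and accounts, queuing everything reachable for review. A toy illustration, with a made-up adjacency map standing in for whatever signals (shared admins, likes, friendships) Facebook actually uses:

```python
from collections import deque

# Hypothetical "page -> related pages" links; Facebook's real signals are richer.
RELATED = {
    "page_a": ["page_b", "page_c"],
    "page_b": ["page_d"],
    "page_c": [],
    "page_d": [],
}


def expand_cluster(flagged_page: str) -> set[str]:
    """Return every page reachable from a flagged page, for further review."""
    seen, queue = {flagged_page}, deque([flagged_page])
    while queue:
        for neighbor in RELATED.get(queue.popleft(), []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen


print(expand_cluster("page_a"))  # {'page_a', 'page_b', 'page_c', 'page_d'}
```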

As for the company's human-based moderation, Facebook still depends heavily on its users to police the platform and report one another. However, the company is expanding its Community Operations teams by 3,000 employees over the next year to help address those reports faster. What's more, Facebook now employs a 150-member "strike team" of sorts. These specialists -- academics, former prosecutors and law enforcement officers -- work either primarily or exclusively on counterterrorism.

Of course, Facebook isn't going it alone. The company has partnered with others in the tech industry, such as Microsoft and Twitter, to create a common database of "hashes" -- digital fingerprints identifying terrorist material and propaganda. Facebook is also working with governments, turning over what information it can to law enforcement about end-to-end encrypted messages that pass through its network.
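
Conceptually, that shared database works like a pooled blocklist: each member company contributes fingerprints of material it has already removed, and every member can check new uploads against the combined set. The consortium's real schema and APIs aren't public, so the class and method names below are illustrative only:

```python
class SharedHashDatabase:
    """Toy stand-in for the industry-shared database of content fingerprints."""

    def __init__(self) -> None:
        self._hashes: dict[str, str] = {}  # fingerprint -> contributing company

    def contribute(self, company: str, fingerprint: str) -> None:
        """A member company submits a fingerprint of removed terrorist material."""
        self._hashes.setdefault(fingerprint, company)

    def is_known(self, fingerprint: str) -> bool:
        """Any member can check an upload's fingerprint against all contributions."""
        return fingerprint in self._hashes


db = SharedHashDatabase()
db.contribute("facebook", "hypothetical-fingerprint-1")
print(db.is_known("hypothetical-fingerprint-1"))  # True: flagged for any member
print(db.is_known("never-seen-fingerprint"))      # False: not in the pool
```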
