Facebook defends content policy after guidelines leak

Reuters/Regis Duvignau

Facebook has caught its share of flak from critics of its leaked content rules. Some are worried that it's being too lenient on abusers, while others are convinced it's engaging in heavy-handed censorship. And apparently, Facebook has had enough -- it wants to explain its rationale. Global policy head Monika Bickert has published an article that defends the social network's guidelines while acknowledging the tricky nature of governing a site that serves nearly 2 billion users. The company doesn't always get things right, Bickert explains, but it believes that a middle ground between freedom and safety is ultimately the best answer.

Bickert contends that Facebook has to be "as objective as possible" in order to apply consistent guidelines across every region it serves. Widespread legal standards rarely exist, she says, and Facebook has to balance "tensions" on delicate issues like violent imagery or nudity. What's acceptable in one region can be horribly offensive in another. If anything, the uproar from both pro- and anti-censorship camps suggests Facebook is getting things right -- it's a sign the company isn't "leaning too far in any one direction," according to Bickert.

That isn't to say that the internet giant has found a perfect balance. It's aware that it can be difficult to understand the context behind a post: is that violent video going to inspire copycats, or should it stay up to raise awareness of a problem? And Bickert stresses that Facebook's guidelines aren't set in stone. The company can and will change its standards as it learns new things -- for example, it now leaves a live suicide attempt streaming so that viewers know the person needs help. And Facebook will still "end up making the wrong call" at times, Bickert says.

There are definitely points of contention in the piece. The policy chief maintains that Facebook keeps some of its guidelines secret so that it doesn't prompt people to "find workarounds." That's a real concern, to be sure, but it also means that Facebook's decision-making process sometimes appears arbitrary. Also, Bickert likely won't satisfy everyone when she says that Facebook is fine with "general expressions of anger," but not specific threats. What if that generic rage amounts to a tangible threat based on where it's posted? Just because someone isn't calling you out by name doesn't make it any less scary when they talk about violence in your comments.

Still, the piece at least recognizes that this isn't a simple problem with simple answers. If a harmonious set of content rules is even possible, it may take a long time to create. And Facebook is at least doing something about its imperfections -- the tech firm is hiring 3,000 new moderators who should improve its review process. The concern is that Facebook's concept of neutrality might not hit the mark. While it has a strong incentive to play it safe and respect freedom of expression as much as possible, it could inadvertently put some people at risk in the process.
