Facebook drafts charter for content oversight board, but questions remain

It would be independent and impose firm limits.

Mark Zuckerberg has talked about creating an independent body to oversee Facebook's content decisions, and now that plan is taking shape. The social network has published a draft charter for an oversight board that would handle appeals of content decisions. There are still many unknowns, but it gives you an idea of the focus and scale of the new organization when it's ready.

At present, Facebook is looking at a board of up to 40 people from various "backgrounds and perspectives," initially selected by a Facebook-commissioned team but later chosen by the board itself. Members would serve part-time for fixed three-year terms, with only one chance for automatic renewal. Users and Facebook itself would refer disagreements to the board, with cases heard by a "rotating set" of panels made up of odd numbers of board members. All decisions would be made public, with a formal explanation required within two weeks. As with a Supreme Court ruling, any dissenters could publish their own opinions.

The company also vows to keep the decisions as neutral as possible. The board wouldn't allow Facebook employees (current or former), and members would receive "standardized" compensation. They couldn't accept lobbying money or "other incentives" that might skew a decision, Facebook said. And while Facebook doesn't believe the board can represent absolutely every country or culture, it would have the power to consult outside experts to make an informed decision.

There's still a long way to go before the board is ready. Facebook hopes to refine the charter through a series of international workshops with experts in democracy, fairness, free speech and human rights. It's also opening a proposal process so other experts can weigh in. After that, Facebook will review all the input and formally establish the board.

Whenever it does come into being, the oversight board could be crucial to Facebook's future. The company has acknowledged that it has a tough time filtering content, and that it sometimes gets decisions wrong (just ask some breast cancer awareness campaigns about its nudity policies). In theory, the board could correct those mistakes, and potentially shape policy, without requiring special effort on Facebook's part each and every time. Whether or not that works is another story. There's a chance the board could face a deluge of appeals, and an independent board doesn't guarantee ideal results. This is just a draft, though, so it won't be surprising if the finished board handles disputes in a different way.