Facebook wants to solve its censorship problems and build more transparency with its Content Oversight Board. But the attempt itself could create more problems than Facebook already has.
The company released a report June 27 detailing suggested approaches to constructing the board. While it admitted that a project of this size and scope would take years to develop, the board’s mission seemed predetermined.
Facebook solicited the opinions of more than 1,200 people from 88 different countries, including “academics at the top of their field, grassroots organizers committed to change, and everyday people.” However, 5.41 percent of those polled anonymously told Facebook that they were attending on behalf of their countries’ governments, and the report made it impossible to tell which governments were represented.
The 88 countries included the United Arab Emirates and Singapore, both of which have troubling laws against freedom of speech. In countries like New Zealand, users could go, and have gone, to jail for content they have posted. Facebook made it clear that content “legally prohibited by countries” could not be discussed by the board.
Free speech standards vary wildly from country to country and might clash with one another. The United States Constitution has uniquely strong protections for speech and press that other nations do not.
While Facebook wanted to build the board keeping in mind “geographic and cultural balance as well as a diversity of backgrounds and perspectives,” the standard for freedom of speech stems from those U.S. principles. Participants expressed concern that Facebook might be biased in favor of “The Global North” (countries in North America and Europe) in rulings on human rights. Facebook’s only answer was a promise to “provide clarity” to ensure there was no bias, which is even more concerning.
Despite the obvious differences in how each country views human rights and freedom of speech, the board is apparently intended to rule on the community as a whole, not “specific constituencies.” But building a governing body for a platform whose population exceeds that of the two most populated countries combined is hardly feasible, given the diversity and cultural differences that even the platform is forced to acknowledge. The language barriers alone make the attempt difficult.
The board was not “designed to hear decisions about News Feed ranking or artificial intelligence,” according to the report. Any shadowbanning, content suppression, or bias in ranking would not be discussed. Likewise, algorithmic flaws and design choices would be left up to Facebook’s engineers. The board would only look at “content governance.”
However, Facebook made it clear that even then, the board could not overrule Facebook. “We actually cannot confer on the board greater authority than Facebook itself has,” the report stated. This concerned the workshop participants, who wondered if Facebook would use the board as a kind of scapegoat to avoid any third party oversight or regulation.
Those polled were also concerned about Facebook selecting the members of the first board. Facebook’s resolution was simply to say that it would select the first 40 members with a careful eye to “diversity.”
Despite its claim of enhancing transparency, Facebook suggested that the members of the board be kept anonymous to the public. The report stated, “Facebook indicated that panels will issue their decisions without attribution. Trade-offs would be required to balance transparency, security, and privacy; feedback in this regard was split.”
But if the company were truly striving to be transparent, it would make those members known to the public.