A muddle of decisions indicates that Facebook and its ‘Supreme Court’ are making little progress towards a consistent policy on human rights

Facebook's 'Trump ban' received breathless coverage earlier this year, when the Facebook Oversight Board (FOB) - the supposedly independent entity established by the tech giant to adjudicate its content decisions - 'ruled' that the former president's ban from the social media site should be upheld.

Experts from WIRED's Gilad Edelman to scholar Kate Klonick have used the Trump decision to argue that the FOB is working.

But buried amidst that coverage - and that surrounding Facebook's botched handling of COVID disinformation - is a muddle of decisions indicating that Facebook and its Oversight Board are making little progress towards a consistent policy on human rights. In fact, they may be headed towards a showdown.

To understand the frailty of the FOB's decision-making, we need to look beyond the US. Since its inception in October 2020, the FOB has sought to apply international human rights standards to its decisions on Facebook's content policies, from the Trump decision to its more recent ruling on freedom of expression in Russia.

While the head of the FOB has promised to hold Facebook to such standards, the company itself often favors local laws over international norms. In a recent ruling on a bullying case in Russia, the FOB found that "while the removal was in line with the Bullying and Harassment Community Standard", Facebook's rules "are an unnecessary and disproportionate restriction on free expression under international human rights standards".

The debate is playing out in Myanmar, where Facebook's algorithms promoted pro-military propaganda even after it banned accounts linked to the military. It has played out in India, and in Israel and Palestine, where Facebook is constantly recalibrating whether it will abide by the laws of a nation-state or follow, as the Oversight Board says it will, international rights law. In these cases, Facebook has lurched uncomfortably from one extreme to the other, censoring user posts at the behest of governments, then sometimes reversing course.

These are consequential decisions that demand answers to pressing questions: does Facebook follow international rights law, or the whims of angry rulers? Which national laws does Facebook follow? And will Facebook listen to its own Oversight Board on human rights?

Toothless watchdog?

openDemocracy understands that the Oversight Board is not consulted ahead of any initial decisions regarding Facebook content or policies - instead, the social media giant treats it as a user grievance mechanism. What's more, the board's recommendations to Facebook are non-binding, leaving the company itself to decide whether to implement them.

The FOB's charter grants the board the power to request that Facebook provide information for its inquiries, but places no corresponding obligation on the company to comply with such requests. This carefully crafted constitution helps Facebook avoid serious scrutiny of its methods, as evidenced in the Trump decision, where Facebook declined to answer seven of the board's 46 questions because, in the company's words:

The information was not reasonably required for decision-making in accordance with the intent of the charter; was not technically feasible to provide; was covered by attorney/client privilege; and/or could not or should not be provided because of legal privacy, safety or data protection concerns.

The seven questions concern how Facebook's features, such as its news feed, control the visibility of Trump's content, and whether Facebook has researched or plans design changes, such as reversing decisions relating to the insurrection of 6 January 2021. Because Facebook did not respond, those answers remain unknown - but it is clear that whether to comply with the board's information requests should not be left to the company alone.

Facebook, for its part, notes that it has been a member of the Global Network Initiative since 2013, and has clear and documented procedures for handling government takedown and data requests. It has committed to being independently assessed on its implementation every two years.

Limits of oversight

Alarmingly, when faced with some of its most consequential decisions of the past six months - whether to accede to government demands in Myanmar, India and Palestine - Facebook did not consult its 'Supreme Court'. These monumental content questions aren't on the docket, illustrating just how limited the Oversight Board's mandate is.

The FOB did have an impact on Facebook's recent decision to remove its political leader exemption - one of the board's recommendations in the Trump ruling - which means content posted by world leaders will now be subject to the same rules as that posted by anybody else. This policy idea was not new, however, and hardly required an oversight board: smart academics have been suggesting the shift for years. And implementation is already lacking: Facebook has pledged only to label dangerous content posted by political leaders, not to remove it.

Trump may be banned for now, but if a leader today posted content encouraging insurrection, would it be removed, or just flagged? Would, for example, an inflammatory post by Brazil's Jair Bolsonaro or the Philippines' Rodrigo Duterte be removed? Nobody knows.

Ultimately that's the question: if a government asked Facebook to remove a political activist or dissident, would it do so? Under international human rights law, the answer would obviously be no. But Facebook's track record indicates its decisions are not always so clear cut. And in the most recent case in Myanmar, Facebook's human deciders were overruled by its algorithms, which took the side of the brutal Tatmadaw, the country's armed forces.

Conflict of interests

The FOB has been compared by some to an international tribunal or quasi-judicial institution, but it's worth noting that the Oversight Board is funded (via an independent trust) by Facebook - surely a conflict of interest.

Protecting democracy globally from the impacts of social media would seem to demand an unwavering body, not one that is compromised or built on half-measures.

With the Oversight Board's co-chair on record arguing that human rights should drive decision-making, this could be the issue that defines the board - not the Trump ruling. Whether it comes down to one significant case or an ongoing body of decisions, international human rights is an area that continues to plague Facebook, and one where oversight is not driving any clear policy. The case is not yet before the board, but the day will come.

An ironclad policy choosing rights over rulers would be a real sign of progress.

From openDemocracy
