
Op-Ed: Facebook Oversight Board’s ruling on Trump misses the big picture

President Trump speaks during a rally on Jan. 6, 2021, prior to the assault on the Capitol by his supporters. Facebook suspended Trump’s account indefinitely on Jan. 7.
(Evan Vucci / Associated Press)

The much-anticipated decision by the Facebook Oversight Board on whether to reinstate President Trump’s Facebook account didn’t provide much satisfaction when it came out Wednesday.

The board allowed the ban to stand temporarily and gave the company six months to review and justify the indefinite suspension of Trump’s account it imposed on Jan. 7. But even if the ruling ends up kicking the can down the road, there are lessons to be learned.

First, the ruling confirms the intentional limits of the setup Facebook created to try to manage controversial content and bad actors. The Oversight Board, a handful of experts selected and paid by Facebook, has been given a mandate to decide on a limited set of cases, many of which are referred to the board directly by Facebook.


This is accountability theater and hardly a structural or scalable solution for a platform where Trump is certainly not the only spreader of disinformation and lies about issues ranging from the 2020 election to COVID-19. The board can review individual cases but lacks the authority to change broader policy at Facebook.

With so many obviously harmful posts on record, we must wonder why the decision about Trump’s account took this much time. By seeding disinformation, sowing violence, inciting hatred and spreading lies, he violated Facebook’s terms of service repeatedly for years and faced no consequences.

If dealing with one decision takes months, how will the company and this board ever deal with policing the actual volume of posts shared? Notorious users like Steve Bannon were left on Facebook, spreading the “big lie” about the 2020 election unhindered. We’ve learned since the Trump ban that Mark Zuckerberg had a different set of rules for right-wing extremists, which allowed posts praising conspiracy theorists like Alex Jones to stay on the platform.

Second, we need to pay attention to the context. Facebook’s decision to ban Trump indefinitely was informed by concern about the real-world consequences of his speech. But why didn’t the company apply the same threshold to similar activity elsewhere on its platform?

Many terrible consequences of speech on the platform have played out around the world. In Myanmar, for instance, the company’s failure to understand ethnic tensions and the risk of violence in 2014 and 2018 allowed Facebook to become the go-to outlet for calls for ethnic cleansing. In Cambodia, the lack of Facebook staffers in the country may explain why a governmental hate campaign against dissidents and human-rights defenders continued and sent an activist cleric into hiding.

In India, on the other hand, Facebook employees had a close working relationship with the ruling party of Narendra Modi. But now there is tension in that association as Facebook complies with the Modi regime’s efforts to limit free speech about its handling of the COVID pandemic.


Will the company actually choose to invest the needed resources to ensure coups and ethnic violence are not organized on its platform and executed in the streets? Will it take responsibility for security and public safety as a priority not only in the United States but globally?

Third, let’s look at what was not investigated by the Oversight Board — and, in fact, cannot be investigated under the board’s own mandate: algorithmic amplification, groups and advertisements. The real engines behind Facebook’s societal harms are entirely outside the board’s scope. It is not just public posts that may contain harmful or illegal content. In fact, the insurrection at the Capitol was planned in large part in closed Facebook groups, and to this day they are a place where white supremacists meet and plot.

It is a good thing that Donald Trump will not be back on Facebook to spread “the big lie” for the next six months. However, the ruling by the board does not address the fundamental need for accountability. In fact, that is not something any self-regulatory mechanism can achieve.

It is critical that rigorous oversight with truly independent assessment of compliance — built around data protection and transparency and designed with the public interest in mind — be put into place. If the Oversight Board’s ruling has done anything at all, it has reminded us of this urgent need.

Marietje Schaake is the international director of policy at Stanford’s Cyber Policy Center and a member of the Real Facebook Oversight Board, a group of independent experts focused on Facebook’s policies.
