Oversight Board Upholds Trump Facebook 'Suspension,' But...

Facebook sign
(Image credit: Facebook)

Facebook's outside oversight board has upheld the suspension of former president Donald Trump's Facebook and Instagram accounts, but it told the company that imposing a permanent ban was not appropriate, calling it an "indeterminate and standardless penalty."

Facebook took down the account Jan. 7, the day after the Capitol insurrection, which Trump had characterized on social media as the natural result of a stolen election.

The day before Wednesday's (May 5) decision by the board, Trump launched a new Web site, where he repeated his election fraud claims.

Also Read: Trump Launches Web Site in Advance of Facebook Decision

Following the decision, the former president took to his new Web site to hammer social media: "What Facebook, Twitter, and Google have done is a total disgrace and an embarrassment to our Country. Free Speech has been taken away from the President of the United States because the Radical Left Lunatics are afraid of the truth, but the truth will come out anyway, bigger and stronger than ever before," he wrote. "The People of our Country will not stand for it! These corrupt social media companies must pay a political price, and must never again be allowed to destroy and decimate our Electoral Process."

The board said what Facebook has to do now is review the decision "to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform," suggesting that it might be tough to uphold a permanent ban.

The board has given Facebook six months to reexamine the penalty and decide on an appropriate one. In the meantime, the ban (or suspension) remains in force.

The board also provided Facebook guidance going forward on balancing the safety of its users with freedom of expression, recommending that it develop "clear, necessary, and proportionate policies that promote public safety and respect freedom of expression."

Facebook's takedown came after Trump posted a video on the site saying: "I know your pain. I know you’re hurt. We had an election that was stolen from us. It was a landslide election, and everyone knows it, especially the other side, but you have to go home now. We have to have peace. We have to have law and order. We have to respect our great people in law and order. We don’t want anybody hurt. It’s a very tough period of time. There’s never been a time like this where such a thing happened, where they could take it away from all of us, from me, from you, from our country. This was a fraudulent election, but we can't play into the hands of these people. We have to have peace. So go home. We love you. You're very special. You've seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace."

He then followed that with a statement: "These are the things and events that happen when a sacred landslide election victory is so unceremoniously viciously stripped away from great patriots who have been badly unfairly treated for so long. Go home with love in peace. Remember this day forever!"

Facebook referred the takedown to the board Jan. 7. The company has pledged to abide by the decisions of the outside arbiter, which was created in the wake of criticism of Facebook's content moderation policies.

The board concluded the above posts had violated Facebook's community standards and rules against praising violence, saying that "in maintaining an unfounded narrative of electoral fraud and persistent calls to action, Mr. Trump created an environment where a serious risk of violence was possible."

"Given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in suspending Mr. Trump’s accounts on January 6 and extending that suspension on January 7," it said.

But the board said it was not appropriate to impose an indefinite suspension, saying that "it is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored."

Facebook's content policy provides for time-limited bans but offers no guidance for indefinite suspensions.

In a separate advisory, the board responded to Facebook's request for guidance going forward when it comes to the speech of "political leaders and influential figures."

The board said it was not necessarily useful to draw distinctions between such users and the rest of the Facebook community, "recognizing that other users with large audiences can also contribute to serious risks of harm."

The board did say Facebook needs to clear up "widespread confusion" about how decisions relating to influential users, political or otherwise, are made, and that the newsworthiness of content should not necessarily be foremost when "urgent action" is needed to prevent "significant harm."

"Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users. These rules should ensure that when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension ends," the board said. "If Facebook identifies that the user poses a serious risk of inciting imminent violence, discrimination or other lawless action at that time, another time-bound suspension should be imposed when such measures are necessary to protect public safety and proportionate to the risk.... If a head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm. Suspension periods should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion."

Its other recommendations included 1) rapidly escalating decisions about political speech from influential voices to specialized staff, 2) investing sufficiently in assets and expertise to assess that risk, 3) helping users better understand the criteria for newsworthiness exceptions, 4) reviewing Facebook's "contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6," and 5) publishing a policy for responding to novel situations and crises when regular processes might not prevent imminent harm.

John Eggerton

Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.