Facebook has extended its ban on Donald Trump until January 2023 following deliberation by its Oversight Board.
Facebook announced today that former president Donald Trump will be suspended for two years after posts from his Facebook and Instagram accounts praised the violent insurrection at the US Capitol that sought to overturn his loss in the presidential election.
“Given the gravity of the circumstances that led to Mr. Trump’s suspension, we believe his actions constituted a severe violation of our rules which merit the highest penalty available under the new enforcement protocols,” the company said in the announcement.
Nick Clegg, Facebook's vice president of global affairs, said when the ban expires on Jan. 7, 2023, the company will “look to experts to assess whether the risk to public safety has receded,” which means that civil unrest or violence would be considered in allowing Trump back on Facebook.
The former president called the two-year ban an “insult” in a statement, adding, “They shouldn’t be allowed to get away with this censoring and silencing.”
After two years, if Facebook determines there is still a “serious risk to public safety” in letting Trump back on, the restriction will be extended for a set period, after which it will be reevaluated again.
When — or if — Trump is allowed back on Facebook and Instagram, any infractions against the platform’s community standards would result in “a set of rapidly escalating sanctions” that could include the “permanent removal of his pages and accounts,” according to the post.
Facebook also admitted fault in how it handled Trump’s posts in January 2021, acknowledging that it “did not have enforcement protocols in place adequate to respond to such unusual events,” but said those protocols now exist.
On May 5, the Oversight Board, a Facebook-administered advisory committee, upheld the suspension but asked the company to revisit the penalty. The ruling included 19 recommendations, and Facebook said that it would fully implement 15 of them. These included a mandate to “act quickly on posts made by influential users that pose a high probability of imminent harm” and to consider context when making assessments about harm.