The Facebook Oversight Board (the board) announced in May 2021 that it upheld Facebook’s decision to temporarily suspend Trump’s account following content he posted during the US Capitol riots on 6 January 2021, on the grounds that doing so was ‘necessary and proportionate to protect the rights of others’. However, the board held it was improper to impose an ‘indefinite suspension’ as a penalty. In its 35-page decision, the board set out a comprehensive review and recommendations as to next steps, with a focus on the importance of clarity and context. Facebook was given six months to review and modify its sanction against Trump, as ‘it is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.’
Update 8 June 2021: On 4 June 2021, Facebook released an official statement entitled In Response to Oversight Board, Trump Suspended for Two Years; Will Only Be Reinstated if Conditions Permit. In the statement, Facebook confirmed new enforcement protocols to be applied ‘in exceptional cases’ such as Mr Trump’s. The time-bound penalty of two years, backdated to 7 January, is consistent with those protocols given the ‘gravity of the circumstances that led to Mr Trump’s suspension’.
This article was originally published on LexisNexis
Case decision 2021-001-FB-FBR.
What are the practical implications of this decision?
For many of its 2.7 billion users, Facebook is a key part of daily life, and content posted there can shape relationships, consumer behaviour, and political decision-making. However, the extent to which Facebook should actively moderate and regulate such content remains an open question, with considerable implications for free expression, civil discourse, and public health and safety.
First proposed in 2018, the board is an independent, quasi-judicial body intended to ensure that Facebook’s rules and processes are consistent with its policies and legal obligations. By referring its decision to suspend Trump to the board, Facebook has given us a practical insight into how content moderation policies are (or perhaps should be) enforced in practice.
As a starting point, the board carried out its analysis by reference to:
- internal policies: eg Facebook’s Community Standards and Instagram’s Community Guidelines and other terms of use
- Facebook’s corporate values: namely ‘voice’, ‘safety’, and ‘dignity’, and
- international legal standards: such as the UN Guiding Principles on Business and Human Rights and the International Covenant on Civil and Political Rights
It also took into consideration a statement from Trump’s lawyers, as well as nearly 10,000 submissions from the general public, although these do not appear to have been a substantial focus.
The key question before the board was whether the company was correct to restrict Trump’s ability to post content on Facebook, on the basis that he had violated Facebook’s internal policies such as those ‘prohibiting praise or support of people engaged in violence’.
As part of its analysis, the board set out six factors which guided its assessment as to whether Trump’s speech did in fact ‘create a serious risk of inciting discrimination, violence or other lawless action’:
- context
- status of the speaker
- intent
- content and form
- extent and reach
- imminence of harm
This list, together with the board’s analysis against each point, may in future provide a useful framework for those involved in freedom of expression matters.
What was the background?
On 6 January 2021, during the formal process of certifying the results of the 2020 US presidential election, a violent mob of thousands stormed the US Capitol Building in Washington, DC. The apparent goals of many involved were to disrupt and delay the certification process, and to pressure Congress to overturn the election of Joe Biden in favour of Trump. Five people died and hundreds more were injured.
During the course of events, Trump posted twice:
- a video at 4.21 pm EST (video post), stating (inter alia):
‘I know your pain. I know you’re hurt. We had an election that was stolen from us. […] We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace.’
Within two hours, at 5.41 pm EST, Facebook removed the video post for violating its internal policies on praising dangerous individuals and organisations.
- a written statement at 6.07 pm EST (word post), stating:
‘These are the things and events that happen when a sacred landslide election victory is so unceremoniously and viciously stripped away from great patriots who have been badly and unfairly treated for so long. Go home with love in peace. Remember this day forever!’
Facebook removed the word post eight minutes later, at 6.15 pm EST, and imposed a 24-hour restriction on Trump’s posting privileges on Facebook and Instagram.
The next day, and in light of Trump’s public communications beyond Facebook and additional information with respect to the Capitol riots, Facebook extended the restriction ‘indefinitely and for at least the next two weeks until the peaceful transition of power is complete’.
Joe Biden was safely inaugurated as US president on 20 January 2021. The following day, Facebook referred the matter of Trump’s account suspension to the board. Following the board’s decision on 5 May 2021, Facebook has until 5 November 2021 to re-evaluate and implement a compliant policy with regard to Trump’s suspension.
What did the Oversight Board decide?
The matter came before the board for two reasons. Firstly, Facebook sought confirmation as to whether its decision of 7 January to suspend Trump’s posting access indefinitely had been correct. Secondly, the company requested recommendations about suspensions where the user is a political leader.
With respect to the first question, the board upheld Facebook’s decision to suspend Trump’s access in principle, but not ‘indefinitely’. Such a penalty is neither clear nor consistent with Facebook’s rules, under which the standard penalties for severe violations are content removal, a time-bound suspension, or permanent deletion of the account.
With respect to the second matter, the board made a number of recommendations to guide Facebook’s policies with regard to serious risks of harm posed by political leaders and other influential figures. However, the board explained that it is not always useful to distinguish between political leaders and other, non-elected, influential users.
By way of summary, it may be useful to highlight two themes that run throughout the board’s decision: clarity and context.
Clarity
In accordance with international norms and standards, the board insists that any Facebook rules which seek to curtail freedom of expression must adhere to the principle of legality, and such rules must therefore be ‘clear and accessible’. In addition, Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users.
Context
Although the same rules should apply to all users, context invariably matters when assessing the probability and imminence of harm. In particular, the board recommends that posts be assessed with regard to social, economic and political context, and ‘according to the way they are likely to be understood, even if their incendiary message is couched in language designed to avoid responsibility.’
Edited 8 June 2021 following Facebook’s decision: The outcome of Trump’s suspension is now clear. On 4 June, Facebook’s Vice-President of Global Affairs, Nick Clegg, released a response to the board’s decision, confirming that Mr Trump would be banned from the platform until 7 January 2023, two years from the date of the initial suspension. The two-year ban is confirmed as the highest penalty available under the new enforcement protocols. On behalf of Facebook, Mr Clegg acknowledged that the company had heeded the board’s instruction ‘to review the decision and respond in a way that is clear and proportionate’, and had taken its recommendations seriously.
Throughout the statement, Mr Clegg emphasised Facebook’s commitment to accountability and transparency, and explained that account holders will now be able to see if and when any of their content has been removed. Importantly, users will also be able to see why such content was moderated and what penalty was applied.
Notwithstanding the above, it is arguable that Facebook remains reluctant to take such an active role in moderating user-uploaded content. Also worth noting in passing is the ongoing debate as to whether social media platforms and search engines are ‘publishers’ in the traditional sense, which should exercise some editorial discretion over the information made available on their websites. It therefore comes as no surprise that, in its statement, Facebook called for ‘thoughtful regulation’ from legislators, and explained that its internal policies are not a replacement for such legislation.
In the meantime, the board’s decision and Facebook’s response are likely to prove useful for academics, practitioners and Facebook users alike when considering the regulation of user-uploaded content on social media.
Case details:
- Reference: Case decision 2021-001-FB-FBR
- Body: Facebook Oversight Board
- Date of decision: 5 May 2021