Oversight Board
Purpose: "… promot[ing] free expression by making principled, independent decisions … issuing recommendations on the relevant Facebook company content policy."[1]
Meta (then Facebook) CEO Mark Zuckerberg approved the creation of the board in November 2018, shortly after a meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary on Facebook.[4] Zuckerberg originally described it as a kind of "Supreme Court", given its role in settlement, negotiation, and mediation, including the power to override the company's decisions.[5]
Zuckerberg first announced the idea in November 2018, and, after a period of public consultation, the board's 20 founding members were announced in May 2020. The board officially began its work on October 22, 2020,[6] and issued its first five decisions on January 28, 2021, with four out of the five overturning Facebook's actions with respect to the matters appealed.[7] It has been subject to substantial media speculation and coverage since its announcement, and has remained so following the referral of Facebook's decision to suspend Donald Trump after the 2021 United States Capitol attack.[8]
History
Founding
In November 2018, after meeting with Harvard Law School professor Noah Feldman, who had proposed the creation of a quasi-judiciary to oversee content moderation on Facebook, CEO Mark Zuckerberg approved the creation of the board.[9][7][10] Among the board's goals were improving the fairness of the appeals process, providing oversight and accountability from an outside source, and increasing transparency.[10] The board was modeled after the United States federal judicial system, in that it gives precedential value to its previous decisions.[11]
Between late 2017 and early 2018, Facebook hired Brent C. Harris, who had previously worked on the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling and as an advisor to non-profits, as its Director of Global Affairs.[12][4][13] Harris led the effort to create the board, reporting to Nick Clegg, who reported directly to Zuckerberg.[14] Harris also credited Clegg's involvement, saying that efforts to establish the board "wouldn't have moved absent Nick's sponsorship" and that the project was "stalled within the company until Nick really took it on".[15]
In January 2019, Facebook released a draft charter for the board[16] and began a period of public consultations and workshops with experts, institutions, and people around the world.[17][18] In June 2019, Facebook released a 250-page report summarizing its findings and announced that it was looking for people to serve on a 40-person board (the board ultimately launched with 20 members).[19]
In January 2020, it appointed British human rights expert and former Article 19 Executive Director Thomas Hughes as Director of Oversight Board Administration.[20] It also said that board members would be named "in the coming months".[21]
Activity
On May 6, 2020, Facebook announced the 20 members that would make up the Oversight Board.[22] Facebook's VP of Global Affairs and Communications Nick Clegg described the group as having a "wide range of views and experiences", collectively living in "over 27 countries" and speaking "at least 29 languages",[23] but a quarter of the group, including two of the four co-chairs, are from the United States, a concentration that drew concern from some free speech and internet governance experts.[22] In July 2020, it was announced that the board would not start work until "later in the year".[24] It started accepting cases on October 22, 2020.[6] Members of the board have noted that it will take several years for the full impact of the board and its decisions to be understood.[7][25] The board officially began to cover cases related to Threads in May 2024.[26]
Earliest Decisions and Actions
On January 28, 2021, the board ruled on five moderation decisions made by Facebook, overturning four of them and upholding one.[27][7][28] All but one were unanimous.[8] Each ruling was decided by a majority vote of a panel of five members of the board, including at least one member from the region where the moderated post originated.[7]
Myanmar Syrian Toddler Photographs Decision
In October 2020, a Facebook user in Myanmar posted images of photographs taken by Turkish photojournalist Nilüfer Demir of the corpse of Kurdish Syrian toddler Alan Kurdi, accompanied by text in Burmese to the effect that there was "something wrong" with the psychology or the mindset of Muslims or Muslim men.[29] The text further contrasted terrorist attacks in France in response to depictions of Muhammad with an asserted relative silence by Muslims in response to the persecution of Uyghurs in China,[7][29] and asserted that this conduct had led to a loss of sympathy for those like the child in the photograph.[29]
In reviewing Facebook's decision to remove the post, the board sought a re-translation of the post[7] and noted that while it could be read as an insult directed towards Muslims, it could also be read as commentary on a perceived inconsistency between Muslim reactions to the events in France and to those in China.[7][29] Concluding that the post, read as a whole, did not violate Facebook's hate speech rules, the board overturned the removal.[7]
Azerbaijani Churches Photograph Decision
A post showing churches in Baku, Azerbaijan, was captioned with a statement in Russian that "asserted that Armenians had historical ties with Baku that Azerbaijanis didn't", and referred to Azerbaijanis with the ethnic slur taziks. The board found that the post was harmful to the safety and dignity of Azerbaijanis and therefore upheld its removal.[7]
Breast Cancer Photographs Decision
In October 2020, a Brazilian woman posted a series of images on Facebook subsidiary Instagram, including photographs of uncovered breasts with visible nipples, as part of an international campaign to raise breast cancer awareness.[30][29] The accompanying Portuguese text asserted that the photographs showed breast cancer symptoms, context that the website's automated review system failed to recognize.[7]
The images were removed and then later restored.[7][29] Facebook asked that the review be dropped as moot, but the board chose to review the action nonetheless, finding that the importance of the issue made it more beneficial for the board to render a judgment on the underlying question.[7] The board further held that removal of the post was improper, as it impacted the human rights of women, and recommended improvements to the decision-making process for the removal of such posts.[7] In particular, the board recommended that users be informed of the use of automated content review mechanisms, that Instagram community standards be revised to expressly permit images with female nipples in breast cancer awareness posts, and that Facebook should clarify that its community standards take precedence over those of Instagram.[30]
Goebbels Misattribution Decision
In October 2020, a Facebook user posted a quote incorrectly attributed to Nazi propagandist Joseph Goebbels, stating that appeals to emotion and instinct are more important than appeals to truth.[7] The post contained no images or symbols. Facebook took down the post under its policy prohibiting the promotion of dangerous individuals and organizations, a category that includes Goebbels. The user appealed, asserting that the post was intended as commentary on Donald Trump. The board found that the evidence supported this assertion, held that the post did not indicate support for Goebbels, and ordered that it be restored, recommending that Facebook indicate to users posting about such persons that "the user must make clear that they are not praising or supporting them".[7]
French Hydroxychloroquine and Azithromycin Post Decision
In October 2020, a French user posted a French-language video in a Facebook group criticizing the Agence nationale de sécurité du médicament for its refusal to authorize hydroxychloroquine and azithromycin to treat COVID-19.[28] Facebook removed the post for spreading COVID-19 misinformation, a decision the board reversed, in part because the drugs are prescription-only in France, meaning individuals seeking them would have to interact with a physician. The board recommended that Facebook correct such misinformation rather than remove it.[7]
Macron Boycott Post Decision
On February 12, 2021, the board overturned the removal of a post made in a Facebook group in October 2020, containing an image of a TV character holding a sheathed sword, accompanied by Hindi text translated as stating "if the tongue of the kafir starts against the Prophet, then the sword should be taken out of the sheath", along with hashtags equating French President Emmanuel Macron to the devil and calling for a boycott of French products. The board found that the post was not likely to cause harm.[31]
"Zwarte Piet" Blackface Decision
On April 13, 2021, the board upheld the removal of a Facebook post by a Dutch Facebook user containing a 17-second video of a child and three adults wearing traditional Dutch "Sinterklaas" costumes, including two white adults dressed as Zwarte Piet (Black Pete), with faces painted black and wearing Afro wigs. The board found that although the cultural tradition is not intentionally racist, the use of blackface is a common racist trope.[32]
Trump Suspension Decision
On January 6, 2021, amid the attack on the United States Capitol while Congress was counting the electoral votes, Donald Trump posted a short video to social media in which he urged the rioters to end the violence yet praised them, and reiterated his baseless claim that the 2020 presidential election was fraudulent.[35] Several platforms, including Facebook, removed it, with Facebook's vice president of integrity, Guy Rosen, explaining that the video "contributes to rather than diminishes the risk of ongoing violence".[36] That day, Facebook also blocked Trump's ability to post new content; the next day, Facebook said the block would remain in place at least until the end of Trump's term on January 20.[37]
On April 16, 2021, the board announced that it was delaying the decision on whether to overturn Trump's suspensions on Facebook and Instagram until sometime "in the coming weeks" in order to review the more than 9,000 public comments it had received.[38] Notably, incoming board member Suzanne Nossel had published an op-ed in the Los Angeles Times on January 27, 2021, titled "Banning Trump from Facebook may feel good. Here's why it might be wrong",[39] but a spokesperson announced that she would not participate in the deliberations over the Trump case and would spend the coming weeks in training.[40] On the same day Nossel's appointment was announced, the board also announced a new case.
On May 5, 2021, the board announced its decision to uphold Trump's account suspension, but instructed Facebook to reassess its decision to indefinitely ban Trump within six months.[41] The board noted that Facebook's standard procedures involve either a time-limited ban or the complete removal of the offending account, and stated that Facebook must follow a "clear, published procedure" in the matter.[42]
On June 4, 2021, Facebook announced that it had changed the indefinite ban to a two-year suspension, ending on January 7, 2023.[43] Trump's Facebook account was later reinstated in March 2023, with Meta saying the public should be allowed to hear from politicians, but that Trump would be subject to "heightened penalties" for repeated violations of its rules.[44]
XCheck Program
In September 2021, the board announced it would review Facebook's internal XCheck ("cross-check") system, which exempted some high-profile users from the platform's rules entirely and routed other prominent users' posts into a separate review queue rather than the standard enforcement process; the program covered around 5.8 million users.[45] The board's quarterly report, issued on October 21, 2021, stated that the company had not been transparent about the XCheck program and had not provided the board with complete information upon which to conduct a review.[46] The board also noted that the company's lack of transparency with users about the reasons for content deletion was unfair.[47] In response, the company stated that it would aim for greater clarity in the future.[47]
Meeting with Frances Haugen
In October 2021, the board announced that it would be meeting with former Facebook employee and whistleblower Frances Haugen to discuss the statements about the company that she had previously shared with The Wall Street Journal and the United States Senate Commerce Committee's Subcommittee on Consumer Protection, Product Safety, and Data Security.[48][49]
Structure
In order to ensure the board's independence, Facebook established an irrevocable trust with $130 million in initial funding, expected to cover operational costs for over half a decade.[51][52] The board is able to hear appeals submitted by both Facebook and its users, and Facebook "will be required to respond publicly to any recommendations".[51] Notably, while the initial remit of the board gave it broad scope to hear anything that can be appealed on Facebook, the company stated that extending the board's work beyond appeals of content removals would require building additional technical infrastructure.[53][54] The entire Oversight Board is overseen by the Oversight Board Trust, which has the power to confirm or remove board appointees and to ensure that the board operates in accordance with its stated purpose.[51][52]
Legally, the Oversight Board is organized as a Delaware-based limited liability company, with the Oversight Board Trust as its sole member.[50]
Board members indicated that the board would begin its work slowly and deliberately, with a focus on producing meaningful opinions in cases carefully selected to be representative of substantial issues.[55] Facebook also developed software to enable it to transfer cases to the board without compromising user privacy.[55] On April 13, 2021, the Oversight Board announced that it would begin accepting appeals from users seeking the removal of other people's content that Facebook had declined to take down after an objection.[56]
Members
The charter provides for future candidates to be nominated for board membership, through a recommendations portal operated by the U.S. law firm Baker McKenzie.[57]
On April 20, 2021, PEN America CEO Suzanne Nossel was appointed to the board, replacing Pamela S. Karlan, who had resigned in February 2021 to join the Biden administration.[40] As of 2021, the United States has the largest representation, with five members, including two of the four co-chairs. Two board members come from South America, six from countries across Asia, three from Africa, three from Europe (one member with both African and European ties counts toward both the African and European totals), and one from Australia.
Reception
Facebook's introduction of the Oversight Board elicited a variety of responses, with St. John's University law professor Kate Klonick describing its creation as a historic endeavor,[64] and technology news website The Verge deeming it "a wild new experiment in platform governance".[55] Politico described it as "an unapologetically globalist mix of academic experts, journalists and political figures".[15]
Even before the board made its first decisions, critics speculated that the board would be too strict, too lenient, or otherwise ineffective. In May 2020, Republican Senator Josh Hawley described the board as a "special censorship committee".[65] Other critics expressed doubts that it would be effective, leading to the creation of an unrelated and unaffiliated group of "vocal Facebook critics" calling itself the "Real Facebook Oversight Board".[55] Facebook issued no official comment on the effort, while Slate described it as "a citizen campaign against the board".[7]
Legal affairs blogger Evelyn Douek noted that the board's initial decisions "strike at matters fundamental to the way Facebook designs its content moderation system and clearly signal that the FOB does not intend to play mere occasional pitstop on Facebook's journey to connect the world".[65]