Facebook’s Oversight Board’s first six cases: Here’s what they are reviewing

Facebook’s Oversight Board has announced the first six cases it will review to decide whether the content removal in each was fair. The Oversight Board is an independent body, first proposed by Mark Zuckerberg in 2018; its first members were announced in May this year.

The Oversight Board “uses its independent judgment to support people’s right to free expression and ensure that those rights are being adequately respected,” notes the description.

Further, all of the board’s decisions, whether upholding or reversing Facebook’s content decisions, are binding on the social network. This means that if content was taken down by Facebook, an appeal is admitted to the board, and the board decides to reverse the decision, Facebook will have to abide by it. However, Facebook can choose not to implement these decisions if “doing so could violate the law”.

According to the official website, “more than 20,000 cases were referred to the Oversight Board following the opening of user appeals in October 2020.” The Board is “prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”

The board is currently looking at six cases, most of which involve content from non-English-speaking countries, and has called for public comments on them. Four of the cases were submitted by users, while two were submitted by Facebook. One case that was originally submitted by a user was withdrawn.

Here’s a look at the six cases that the Oversight Board is reviewing

The first case, number 2020-002-FB-UA, was submitted by a user and deals with “two well-known photos of a deceased child lying fully clothed on a beach at the water’s edge.” The text accompanying the photo is written in Burmese and asks why “there is no retaliation against China for its treatment of Uyghur Muslims, in contrast to the recent killings in France relating to cartoons.”

The content was removed under Facebook’s hate speech policy. The user has defended the post, saying it “was meant to disagree with people who think that the killer is right and to emphasise that human lives matter more than religious ideologies.”

The second case, number 2020-003-FB-UA, deals with historical photos of churches in Baku, Azerbaijan. The accompanying text claims that Baku was built by Armenians, asks where the churches have gone, and adds that Armenians are destroying churches. The user’s post was also removed under Facebook’s hate speech policy. The user has stated they were just trying to highlight the destruction of “cultural and religious monuments.”

The third case, number 2020-004-IG-UA, is from a user in Brazil who posted pictures on Instagram with titles in Portuguese to raise awareness of the signs of breast cancer. Some of the photos included pictures of female breasts with the nipple visible. The post was removed under Facebook’s policies on adult nudity, which have been criticised in the past; Instagram typically removes photos where the female nipple is visible. The user said the post was shared as part of the national “Pink October” campaign for the prevention of breast cancer.

The fourth case, number 2020-005-FB-UA, is from a user based in the US. The user reshared a memory post, which included an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany. The quote talks about appealing to “emotions and instincts, instead of intellect, and on the unimportance of truth.” It was removed under Facebook’s policy on “dangerous individuals and organisations.”

The fifth case, number 2020-006-FB-FBR, was referred by Facebook. It deals with a user video posted in a group related to COVID-19. The video alleges a scandal at the Agence Nationale de Sécurité du Médicament (the French agency responsible for regulating health products). Facebook removed the content because it was seen as inciting violence, and referred the case to illustrate the challenges it faces when addressing the risk of offline harm caused by misinformation about the COVID-19 pandemic.

The final case, also referred by Facebook, deals with a photo showing a “man in leather armour holding a sheathed sword in his right hand.” The photo carries text written in Hindi which calls for using the sword against infidels who criticise the prophet. The post also refers to French President Emmanuel Macron as “the devil”. Facebook removed the content for violating its policy on Violence and Incitement, viewing it as a “veiled threat” against a specific individual, namely President Macron.

Each case is assigned to a five-member panel, which includes at least one member from the region implicated in the content. The Board expects to decide on each case, and for Facebook to have acted on its decision, within 90 days. The public comment window for the cases is open for seven days.
