In its initial decisions, the Facebook Oversight Board overturned most of the company's actions in cases involving posts that had been removed from the platform for violating its community standards. In four of the five cases announced Thursday, which involved ethnic hate speech, nudity and misinformation, the board decided the posts should be restored.
The board's decisions are final, and Facebook has seven days to restore content in accordance with its judgments. The company has 30 days to respond to the board's policy recommendations.
Facebook formed the independent body, sometimes referred to as the company's "Supreme Court," in 2019 to make its content moderation practices more transparent. In addition to issuing its judgments, the Oversight Board made nine policy recommendations to Facebook and noted that users have appealed more than 150,000 cases to it.
The board's initial decisions did not include a highly anticipated verdict on Facebook's move to suspend President Donald Trump's account after Jan. 6, which Facebook referred to the board last week. The board said it would open Trump's case for public comment tomorrow.
The first judgments offered insight into the decision-making process of the 20-member panel, which is composed of legal experts, journalists and human rights lawyers. The published decisions contained frequent references to international human rights standards on free speech and suggested that board members favor freedom of expression except in cases that could cause harm.
The five cases, which Facebook will use as precedents when deciding similar cases, included the removal of a post that pejoratively implied Muslims are inferior, a breast cancer education post depicting female nipples, a post quoting a Nazi German leader, and a post that falsely claimed a cure for COVID-19.
Facebook's vice president for content policy, Monika Bickert, said the company would "take the board's suggestions to heart." "Its recommendations will have a lasting impact on how we structure our policies," she said.
The board upheld only one of Facebook's decisions: the removal of a Russian-language post that used an ethnic slur against Azerbaijanis.
The board's first decision involved a user in Myanmar who posted in Burmese, questioning the lack of response by Muslims to the treatment of Uighur Muslims in China. The post implied that something was wrong with Muslim men. After Facebook removed the post, the Oversight Board found that Facebook's original translation may have been inaccurate and ruled that while the statement was derogatory toward Muslims, it did not rise to the level of hate speech.
In its second decision, the board upheld Facebook's removal of a post containing a Russian-language pun that Facebook said could be interpreted as an ethnic slur. After commissioning an independent linguistic analysis, the Oversight Board found that the word was indeed a dehumanizing label for Azerbaijanis.
The board wrote: "Given the dehumanizing nature of the slur and the risk that such slurs could escalate into physical violence, Facebook was permitted in this case to prioritize people's 'safety' and 'dignity' over the user's 'voice.'"
A third decision restored an Instagram post that had been removed by an automated system for violating the company's nudity standards. A user in Brazil had posted photos of female breasts and nipples showing signs of cancer in order to raise awareness of breast cancer symptoms. The Oversight Board wrote that after it selected this case, Facebook restored the post and attributed the removal to a technical error. Facebook then asked the board not to hear the case.
The board disagreed and wrote that the case was important. "The incorrect removal of this post indicates a lack of adequate human oversight, which raises human rights concerns," the board wrote. It asked Facebook to change its policies to notify users when their content is moderated by automated systems and to allow users to appeal certain moderation decisions to a human reviewer.
In its fourth decision, the board reinstated a post that incorrectly attributed a quote to the Nazi German leader Joseph Goebbels. The user shared the quote without context, but later told the board that the intent was to condemn Goebbels and draw a comparison between the sentiment in the quote and Trump's presidency.
Facebook's policy is to treat quotes attributed to dangerous individuals as expressions of support unless the user adds context indicating that they are condemning that person. However, the board noted that this rule was not clearly communicated to the public, and the user was not told which rule his post violated. The board advised Facebook to better educate users about its rules and to clarify which organizations and individuals it classifies as "dangerous."
The board's final decision restored a post that criticized France's health strategy and falsely claimed that a cure for COVID-19 exists. The post criticized a French regulator for refusing to approve hydroxychloroquine, a drug not proven to treat or prevent COVID-19, for use against the disease. The board ruled that while the post made false claims about a COVID cure, it should be restored because it posed no imminent harm, a crucial element of Facebook's misinformation policy.
The board also said Facebook’s misinformation rules were inconsistent and unclear. “A patchwork of guidelines on various parts of the Facebook website makes it difficult for users to understand what content is banned,” the board wrote. “Changes to Facebook’s COVID-19 guidelines announced in the company’s newsroom have not always been reflected in the company’s community standards, although some of those changes appear to contradict them.”
Bickert said the content in three of the cases had already been restored, and the content in the fourth case was restored last year.