Facebook owner to ‘assess feasibility’ of hate speech study in Ethiopia
Written by ABC AUDIO on January 14, 2022
The owner of Facebook and Instagram has said it will “assess the feasibility” of conducting an independent human rights study related to its work in Ethiopia, after the company’s oversight board urged it to investigate how its platforms have been used to spread hate speech and unverified rumours in the country.
Meta’s oversight board, which reviews the company’s content moderation decisions and policies, asked it to conduct the study after upholding the removal of a Facebook post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia’s Amhara region. Facebook had reinstated the post after an appeal by the user who posted it, so the ruling required the company to take it down again.
As part of the ruling, the board recommended that Facebook’s parent conduct an independent human rights due diligence assessment of how Facebook and Instagram “have been used to spread hate speech and unverified rumours that heighten the risk of violence in Ethiopia”. Writing in December last year, the board said the study should cover the period from June 2020 “to the present” and take six months to complete. While the board’s content moderation rulings are binding, its policy recommendations are not.
Responding to the board’s recommendation on Thursday, Meta said human rights due diligence projects can be “highly time intensive” and run for a year or more. It added that it would consider the feasibility of such a move.
“We will continue existing human rights due diligence and dynamic risk management processes and assess the feasibility of a related due diligence project. We anticipate providing an update within the next few months,” Meta said.
Meta’s moderation efforts in non-English-speaking countries have been under scrutiny since it was found to have facilitated violence against the Rohingya, the Muslim minority in Myanmar. Facebook admitted in 2018 that it had not done enough to prevent the incitement of violence and hate speech against the Rohingya. An independent report commissioned by the company found that “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence”.
Frances Haugen, a former Meta employee who blew the whistle on moderation practices at the company, has warned that the social media company is fanning ethnic violence in Ethiopia. In testimony to US lawmakers, Haugen said that although only 9% of Facebook users spoke English, 87% of the platform’s misinformation spending was devoted to English speakers.
Thousands have died and millions have been displaced during a year-long conflict between the Ethiopian government and rebel forces from the northern Tigray region.
Meta added in its statement on Thursday that it had “invested significant resources in Ethiopia to identify and remove potentially harmful content” as part of its response to the board’s recommendations in December.
An oversight board spokesperson said in a statement: “Meta’s existing policies prohibit rumours that contribute to imminent violence that cannot be debunked in a meaningful timeframe, and the board made recommendations to ensure these policies are effectively applied in conflict situations.
“Rumours alleging an ethnic group is complicit in atrocities, as found in this case, have the potential to lead to grave harm to people.”
Source: www.theguardian.com