Facebook and its parent company Meta failed again in a test of how well they could detect obviously violent hate speech, in advertisements submitted to the platform by the nonprofit groups Global Witness and Foxglove.
The hateful messages focused on Ethiopia, where internal documents obtained by whistleblower Frances Haugen showed that Facebook's ineffective moderation "literally fuels ethnic violence," as she said in her 2021 congressional testimony.
In March, Global Witness conducted a similar test involving hate speech in Myanmar, which Facebook also failed to detect.
The group created 12 text-based ads that used dehumanizing hate speech to incite the killing of people belonging to each of Ethiopia’s three main ethnic groups – the Amhara, the Oromo and the Tigrayans.
Facebook’s systems approved the ads for publication, just like the ads in Myanmar. The ads weren’t actually published on Facebook.
This time, however, the group informed Meta about the undetected violations. The company said the ads should not have been approved, noting the work it has done to "build our ability to intercept hateful and inflammatory content in the most commonly spoken languages, including Amharic."
A week after hearing from Meta, Global Witness submitted two more ads for approval, again with blatant hate speech. The two ads, also written in Amharic, the most widely spoken language in Ethiopia, were approved.
Meta did not respond to multiple messages seeking comment this week.
"We picked the worst cases we could think of," said Rosie Sharpe, an activist at Global Witness.
"The ones that should be easiest for Facebook to recognize. They were not coded language. They weren't dog whistles. They were explicit statements saying that this type of person is not human, or that this type of person should be starved to death."
Meta has consistently refused to say how many content moderators it has in countries where English is not the primary language. This includes moderators in Ethiopia, Myanmar and other regions where material posted on the company’s platforms has been linked to real-life violence.
In November, Meta said it had removed a post by Ethiopia's Prime Minister Abiy Ahmed urging citizens to rise up and "bury" rival Tigray forces threatening the country's capital.
In the now-deleted post, Abiy said the “duty to die for Ethiopia belongs to all of us.” He urged citizens to mobilize “by holding any weapon or capacity.”
However, Abiy has continued to post on the platform, where he has 4.1 million followers. The US and others have warned Ethiopia against "dehumanizing rhetoric" after the prime minister called the Tigray forces a "cancer" and "weed" in comments made in July 2021.
"When ads inciting genocide in Ethiopia continue to get through Facebook's network – even after the issue has been reported to Facebook – there is only one possible conclusion: nobody is home," said Rosa Curling, director of Foxglove, a London-based legal nonprofit that collaborated with Global Witness on the investigation. "Years after the Myanmar genocide, it is clear that Facebook has not learned its lesson."