Meta, the parent company of Facebook, is under scrutiny for its handling of threats made against content moderators by Ethiopian rebel forces. According to court documents filed in Kenya, a contractor hired by Meta disregarded serious safety concerns raised by moderators who had been threatened by the Oromo Liberation Army (OLA) after removing the group's graphic posts from Facebook.
The issue surfaced as part of a larger case involving 185 content moderators who are suing Meta and two contracting firms, Sama and Majorel. These moderators allege they were dismissed from their roles with Sama, a Kenya-based company tasked with moderating Facebook content, after attempting to unionize. Furthermore, they claim they were blacklisted from similar positions at Majorel, the contractor that replaced Sama.
Among the moderators, those reviewing content from Ethiopia reported receiving threats directly from the OLA. The group reportedly warned them to stop taking down its posts or face "dire consequences." One moderator stated in an affidavit that he had received messages listing his and his colleagues' names and addresses. Fearing for their lives, some moderators stopped visiting family in Ethiopia. Despite the alarming nature of these threats, Sama initially dismissed them as fabricated. Only after persistent complaints did the firm investigate, eventually relocating one moderator to a safe house.
When approached, Sama declined to comment on the allegations, while Meta and the OLA did not respond to inquiries.
The Oromo Liberation Army is a rebel faction with deep-seated grievances over the perceived marginalization of Ethiopia's Oromo community. The group has been accused of attacks on civilians following the 2023 collapse of peace talks aimed at ending the decades-long conflict in Ethiopia.
Adding to Meta's challenges, the court documents revealed that the company ignored expert recommendations on combating hate speech in Ethiopia. A former supervisor of moderators testified to the relentless cycle of reviewing hateful content, much of which could not be removed under Meta's policies. This inaction, the supervisor noted, contributed to an "endless loop of hateful content."
The lawsuit, which began as an effort to secure better working conditions and protections for moderators, now highlights broader problems with Meta's approach to content moderation. An earlier case, filed in 2022, accused Meta of allowing violent and inflammatory posts from Ethiopia to remain on Facebook, exacerbating tensions during the civil war between the federal government and Tigrayan forces.
Efforts to settle the current case out of court failed in October 2023. The outcome could set a precedent for how tech giants like Meta work with content moderators worldwide.