A new lawsuit has accused Facebook’s parent company Meta of amplifying hate speech in Ethiopia, one of many non-English-speaking markets where the social media giant’s failure or unwillingness to curb online hate has repeatedly resulted in episodes of widespread violence.
The class-action lawsuit, filed in Kenya on Tuesday and seeking about $2 billion in restitution, accuses Meta of discriminating against African Facebook users. It has been brought by two Ethiopian researchers, Abrham Meareg and Fisseha Tekle, and the Kenyan human rights group Katiba Institute. UK-based nonprofit Foxglove, which has run several tests this year exposing Meta’s inadequate moderation resources, is supporting the lawsuit.
“However bad you and I think content moderation is in the US, it is an order of magnitude worse anywhere outside of the US, and particularly bad in places facing crisis or conflict,” said Cori Crider, director at Foxglove. “When people make posts calling for genocide or targeting people in certain areas, posts will go viral and it will not come down. What happened to Abrham’s father is horrific and also systemic.”
Meta opened a content moderation office in Nairobi in 2019, but hate speech and incitement to violence across its platforms have seen no significant improvement in the conflict-ravaged country.
The case alleges that Facebook’s algorithms amplified violent posts in Ethiopia, several of which surfaced prior to the murder of Meareg’s father. The researcher claims incitement on Facebook played a major role in his father’s killing last year. Meta is accused of failing to invest adequate resources in content moderation, leading to the violation of human rights in Africa, and of monetising inflammatory posts, as negative content attracts higher user engagement.
Additionally, the lawsuit highlights the company’s discriminatory content moderation mechanisms for the US and Africa, echoing widely reported revelations that 75 per cent of Meta’s total budget for content moderation is focused on the US alone.
According to the lawsuit, Meareg, who is currently in the United States, reported Facebook posts that called for violence against his father, but the platform failed to remove the harmful material. Meareg’s father was targeted with online hate because his family belongs to the Tigrayan ethnic group, which faced an onslaught of severe violence following the civil war that flared up in 2020, the lawsuit adds.
“Feedback from local civil society organizations and international institutions guides our safety and integrity work in Ethiopia,” Meta said in a statement. “We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya.”
Despite calling Ethiopia one of its highest priorities, Meta has failed to curb hate and incitement to violence on its platforms in the country. The company’s failure is evident in developing markets, especially in countries that are already grappling with political and ethnic volatility. Meta’s harmful business practices have been scrutinised time and again by human and digital rights experts, who question the company’s failure or unwillingness to invest in adequate content moderation despite lofty revenues and every available resource at its disposal.