Meta is facing a $2.4 billion (£1.8 billion) lawsuit over allegations that the Facebook parent company helped fuel violence in Ethiopia. This comes after a Kenyan high court ruled that the case, brought by two Ethiopian nationals, could proceed.
The lawsuit demands that Facebook modify its algorithm to prevent the spread of hate speech and incitement to violence. It also calls for more content moderators in Africa and the establishment of a $2.4 billion restitution fund for victims harmed by such content on the platform.
One of the plaintiffs, Abrham Meareg, lost his father, Prof Meareg Amare Abrha, who was killed in 2021 after Facebook posts during Ethiopia's civil war shared his personal information alongside threatening messages about him. The other, Fisseha Tekle, a former Amnesty International researcher, received death threats after publishing reports on atrocities committed during the Tigray conflict.
Meta contended that the Kenyan courts lacked jurisdiction over the case; the plaintiffs brought it in Kenya because Facebook's content moderation for Ethiopia was handled from Nairobi. On Thursday, however, the Kenyan high court in Nairobi confirmed it has the authority to hear the case.
Reacting to the ruling, Abrham Meareg said, “I am thankful for the court’s decision. It’s shameful that Meta believes it shouldn’t be held accountable under Kenyan law. African lives matter.”
Tekle said he cannot return to Ethiopia because of Meta's failure to control harmful content on its platform. "Meta can't undo the damage, but it can change how it manages dangerous content so others don't suffer like I have," he said. "I now look forward to the full court hearing."
The case is backed by advocacy groups such as Foxglove and Amnesty International and also includes Kenya’s Katiba Institute as a claimant. The plaintiffs are demanding a public apology from Meta for Meareg’s death.
A 2022 investigation by the Bureau of Investigative Journalism and the Observer revealed that Facebook allowed hate-filled and violent content to spread during the Tigray conflict, even though it knew the risks.
Meta denied the allegations at the time, stating that it had invested in safety tools and aggressively tried to curb hate speech and misinformation in Ethiopia.
In January, Meta announced it would significantly scale back its content moderation efforts, including ending its third-party fact-checking programme, but said it would continue to target illegal and high-severity violations.