The case, filed in Kenya’s High Court on Wednesday by two individuals and a rights group, alleges that Meta responded inadequately to hateful content on its platform, particularly posts related to the war in Ethiopia’s northern Tigray region.

A petitioner said his father, an ethnic Tigrayan, had been targeted with racist messages on Facebook before his murder in November 2021, and that the social media giant had failed to move quickly to remove the posts.

“If Facebook had just stopped the spread of hate and moderated posts properly, my father would still be alive,” said Abrham Meareg, who is also Tigrayan and an academic like his father.

“I am taking Facebook to court so that no one suffers like my family again. I’m demanding justice for the millions of other Africans hurt by Facebook’s profiteering, and an apology for my father’s killing.”

His lawyer, Mercy Mutemi, said Facebook took a month to respond to Abrham’s appeals to have the content removed.

“Why did it take over a month to remove a post calling for someone to be killed?” she said.

Mutemi said Facebook acknowledged the content violated community standards, but a year later one of the violent posts was still online.

Another petitioner is Fisseha Tekle, an Ethiopian researcher for Amnesty International and a Tigrayan who has written about the war and faced a barrage of online abuse.

The international community has condemned hate speech and dehumanizing rhetoric during the two-year conflict, which has seen all sides accused of atrocities amid warnings of ethnic cleansing.

‘Inhumane’ working conditions

Meta spokesman Ben Walters said the company had not yet been served with the lawsuit, but had “strict rules outlining what is and isn’t allowed on Facebook and Instagram.”

“We have removed misinformation when there is a risk it could contribute to long-term physical harm,” he told AFP in a statement.

“In Ethiopia, we have identified and are removing a number of harmful pre-vetted claims and out-of-context images that make false claims about the perpetrators, severity or targets of the violence in Ethiopia.”

The Katiba Institute, a Kenyan rights group and another petitioner in the lawsuit, is seeking changes to Facebook’s algorithm.

Inciting, hateful and dangerous posts “escalate conversations, attract reactions and sharing, and motivate ongoing discussions in the comments section,” according to the petition seen by AFP.

The petition also accuses Meta of “inhumane” working conditions for its understaffed content moderators in Nairobi, who are tasked with overseeing eastern and southern Africa, a vast region covering 500 million people.

The petitioners claim this has resulted in “systematic discrimination” against African Facebook users, citing the platform’s swifter response to the January 6, 2021, attack on the US Capitol by supporters of former US President Donald Trump.

The petitioners are asking the court to establish a compensation fund of 200 billion Kenyan shillings ($1.6 billion) for victims of hate and violence incited on Facebook.

In late 2021, Rohingya refugees sued Facebook for $150 billion, claiming the social network failed to curb hate speech directed against them.

The Rohingya, a predominantly Muslim minority, were driven out of Myanmar into neighboring Bangladesh in 2017 by security forces in a crackdown that is now the subject of a UN genocide investigation.

AFP is involved in a partnership with Meta providing fact-checking services in Asia-Pacific, Europe, the Middle East, Latin America and Africa.
