Facebook content moderators in Kenya are suing the social media site’s parent company Meta and two outsourcing companies for unlawful redundancy.
The 43 applicants say they lost their jobs with Sama, a Kenya-based firm contracted to moderate Facebook content, for organising a union.
They also say they were blacklisted from applying for the same roles at another outsourcing firm, Majorel, after Facebook switched contractors.
Meta, which also owns Instagram and WhatsApp, is seeking to cut its workforce by around a quarter within six months as the tech sector endures a turbulent period; the company, which rebranded from Facebook in late 2021, is notably battling a slowdown in online advertising.
“In January, 260 content moderators working at Facebook’s moderation hub in Nairobi, Kenya, were told that they would be made redundant by Sama, the outsourcing firm that has run the office since 2019,” read a statement announcing the case.
“Overnight, these moderators doing critical safety work for East and South Africa lost their jobs.”
“In the biggest legal challenge yet to Meta’s African operations, 43 moderators at Facebook’s Nairobi moderation hub are suing the social media firm and its outsourcers for sacking the entire workforce — and for blacklisting all the laid-off workers.”
In the lawsuit, filed through their lawyer, Ms Mercy Mutemi, the content moderators engaged through third-party company Sama accuse Meta and its partners of unlawful termination, discrimination and violation of their rights.
The court petition also alleges that Sama began letting workers go for unjustifiable reasons and hired no replacements, despite the acknowledged need for more content moderators.
Last December, a Kenyan NGO and two Ethiopian citizens filed a lawsuit in Kenya against Meta, accusing the platform of not doing enough to combat online hate speech, and urged the creation of a $1.6 billion fund to compensate victims.
The action claimed that Meta promoted speech that led to ethnic violence and killings in Ethiopia by using an algorithm liable to prioritise or recommend hateful and violent content on Facebook.
Content moderation often means working in the darkest corners of the internet, sifting through a relentless barrage of disturbing material such as extreme violence, child pornography and terrorist propaganda.
Examples of the posts that they view on a daily basis include pictures and videos of people being raped, children being molested and people being slaughtered or burnt alive. Some of them even saw their own relatives die on the platform.
According to the court petition, the constant exposure to toxic content has left many moderators struggling with mental health conditions such as post-traumatic stress disorder (PTSD), depression and anxiety.
Others report insomnia, graphic nightmares, hallucinations and suicidal thoughts.
Meta, one of the most profitable tech giants in the world, is known for paying its direct employees generous six-figure salaries.
However, moderators engaged through Sama earn approximately $2.20 (about Sh300) per hour, significantly less than their counterparts in other countries who earn between $18 (Sh2,200) and $20 (Sh2,500) per hour.
Meta terminated its contract with Sama in January this year and moved the work to Majorel, which also provides content moderation services to companies such as TikTok in Kenya and other countries.