Facebook rebranded to Meta in 2021. / Reuters
NAIROBI, Kenya, May 11 - A former content moderator working for American social media giant Meta, formerly Facebook, on Tuesday filed a lawsuit accusing the firm of violating a number of employee rights at its Kenyan hub.
The petition was also filed against Meta’s local subcontractor Sama, alleging that the firm’s content moderators in Kenya were being subjected to unreasonable working conditions including irregular pay, inadequate mental health support, union-busting, and violations of their privacy and dignity.
The petition comes weeks after Nzili and Sumbi Advocates threatened to take legal action against the two firms on behalf of former Facebook content moderator Daniel Motaung.
The lawsuit, filed by Motaung on behalf of a group, seeks financial compensation, an order that outsourced moderators have the same health care and pay scale as Meta employees, that unionization rights be protected, and an independent human rights audit of the office.
“Facebook subcontracts most of this work to companies like Sama – a practice that keeps Facebook’s profit margins high but at the cost of thousands of moderators’ health – and the safety of Facebook worldwide. Sama moderators report ongoing violations, including conditions which are unsafe, degrading, and pose a risk of post-traumatic stress disorder (PTSD),” the law firm stated.
A scathing article by Time magazine titled "Inside Facebook's African Sweatshop", published in February, revealed that Sama allegedly fired its former employee Daniel Motaung for leading a strike back in 2019 over poor pay and working conditions.
The article detailed how Sama recruited its content moderators under the false pretext that they were taking up call centre jobs.
The story further stated that the content moderators, hired from across Africa, only learned about the nature of their jobs after signing employment contracts and relocating to Sama’s hub based in Kenya’s capital, Nairobi.
For many social media platforms, content moderators perform the brutal task of viewing and removing illegal or banned content from these sites before any of it is seen by the average user.
Globally, thousands of moderators review posts that could depict violence, nudity, racism, or other offensive content. Many work for third-party contractors rather than the tech companies themselves.
“If in Dublin, people can’t look at harmful content for two hours, that should be the rule everywhere,” Motaung’s lawyer Mercy Mutemi said. “If they need to have a psychologist on call that should apply everywhere.”
Meta has, however, distanced itself from claims that Sama, its main subcontractor for content moderation in Africa, violated employee rights at its Kenyan hub.
Meta also said it was not privy to the arrangement its subcontractor had with Motaung. "There was, therefore, no employer/employee relationship between Meta and the Claimant (Motaung), upon which a cause of action may be premised. No action can therefore be brought against Meta for any rights and /or obligations allegedly due to owing to the Claimant with respect to his employment with Sama, as Meta is not and has never been his employer," said Anjaarwalla & Khanna LLP, the law firm representing Meta.
Following the exposé, Sama denied any wrongdoing, stating that it is transparent during its hiring process and has a culture that "prioritizes employee health and wellness."
“We understand that content moderation is a difficult but essential job to ensure the safety of the internet for everyone, and it’s why we invest heavily in training, personal development, and wellness programs,” said Sama.
Sama has yet to issue a statement on the lawsuit.
Meta has already faced scrutiny over content moderators’ working conditions.
Last year, a California judge approved an $85 million settlement between Facebook and more than 10,000 content moderators who had accused the company of failing to protect them from psychological injuries resulting from their exposure to graphic and violent imagery.
Facebook did not admit wrongdoing in the California case but agreed to take measures to provide its content moderators, who are employed by third-party vendors, with safer work environments.