A court ruling in Kenya last month declared Meta, the parent company of Facebook and Instagram, the “actual employer” of the hundreds of content moderators in Nairobi who filter violence, hate speech, and other disturbing content from its platforms. The landmark ruling opens the door for Meta to face lawsuits in Kenya over labor rights violations, even though the moderators are technically hired through a third-party contractor.
Watching the case closely was social media giant TikTok, which also uses outsourced moderators in Kenya and other developing countries through a partnership with Luxembourg-based Majorel. Leaked documents obtained by the NGO Foxglove Legal, and reviewed by WIRED, show that TikTok is worried it could face similar lawsuits.
“TikTok will likely face reputational and regulatory risks for its contractual arrangement with Majorel in Kenya,” the internal memo says. It warns that if Kenyan courts rule in favor of the moderators, “TikTok and its competitors could face scrutiny for real or perceived labor rights violations.”
The ruling against Meta came after the tech giant sought to dismiss a case brought by South African moderator Daniel Motaung against the company and its outsourcing partner, Sama. Motaung was fired after attempting to organize a union in 2019.
Motaung alleges that daily exposure to hours of violent and graphic content left him with post-traumatic stress disorder, and that he was never fully informed about the nature of the work before relocating from South Africa to Kenya. He accuses Meta and Sama of several violations of Kenyan labor law, including human trafficking and union busting. If Motaung’s case succeeds, it could allow other large tech companies to be held accountable in Kenya for the treatment of their outsourced staff, and potentially provide a blueprint for similar cases elsewhere.
“[TikTok] perceives it as a reputational threat,” says Cori Crider, director of Foxglove Legal. “The real reputational threat is that they are exploiting people.”
TikTok did not respond to a request for comment.
In January, amid Motaung’s ongoing lawsuit, Meta attempted to sever ties with Sama and transfer its outsourcing operations to Majorel—TikTok’s collaborator.
As part of that transition, some 260 Sama moderators were expected to lose their jobs. In March, a judge issued an injunction barring Meta from ending its contract with Sama and moving it to Majorel until the court could determine whether the layoffs violated Kenyan labor laws. In a separate lawsuit, Sama moderators alleged that Majorel had blacklisted them from applying for the new Meta moderator positions in retaliation for their push for better working conditions at Sama. In May, 150 outsourced moderators working for TikTok, ChatGPT, and Meta through third-party firms voted to establish and register the African Content Moderators Union.
Majorel declined to comment.
The leaked TikTok documents indicate that the company is contemplating an independent audit of Majorel’s site in Kenya. Majorel operates sites globally, including in Morocco, where its moderators work for both Meta and TikTok. The memo suggests that this kind of exercise, typically involving an external law firm or consultancy conducting interviews and providing a formal evaluation against criteria like local labor laws or international human rights standards, “may offset additional scrutiny from union representatives and news media.”
Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, says such audits can give companies the appearance of acting to improve conditions in their supply chains without requiring them to make any meaningful changes.
The leaked TikTok memo gives no indication of how the company might use such an assessment to actually improve conditions for its outsourced workers. “It’s about the performance of doing something,” says Crider.
Barrett suggests that TikTok has an opportunity to confront the issue more proactively than its competitors have. “I think it would be very unfortunate if TikTok said, ‘We’re going to try to minimize liability, minimize our responsibility, and not only outsource this work, but outsource our responsibility for ensuring the work being done on behalf of our platform is done in a suitable and humane way.’”