TikTok acknowledges labor problem in leaked memo.

The Dark Side of Content Moderation: TikTok Faces Potential Labor Rights Litigation

Last month, a groundbreaking ruling in Kenya sent shockwaves through the tech industry when a court declared that Meta, the parent company of Facebook and Instagram, was the “true employer” of its Kenyan content moderators. The decision means Meta can be held accountable for labor rights violations even though the moderators are technically employed by a third-party contractor. It has also caught the attention of TikTok, which likewise outsources moderation work to Kenya and is worried about potential litigation.

Leaked documents obtained by the NGO Foxglove Legal and seen by WIRED reveal TikTok’s unease about its contractual arrangement with Majorel, an outsourcing firm headquartered in Luxembourg. TikTok fears reputational and regulatory fallout if the Kenyan courts rule in favor of the moderators; the memo warns that such a ruling could invite scrutiny of TikTok and its competitors over alleged labor rights violations.

The landmark ruling against Meta came after South African moderator Daniel Motaung sued Meta and its outsourcing partner, Sama, for unfair dismissal. Motaung, who was fired after attempting to form a union in 2019, says that his work as a content moderator, which constantly exposed him to violent and traumatizing content, led him to develop post-traumatic stress disorder. He also alleges that he was not fully informed about the nature of the work when he relocated from South Africa to Kenya. Motaung accuses Meta and Sama of multiple abuses of Kenyan labor law, including human trafficking and union busting. A win for Motaung could open the door for other tech companies outsourcing to Kenya to be held accountable for the treatment of their staff, and set a precedent for similar cases in other countries.

Cori Crider, director of Foxglove Legal, points to how TikTok frames the issue: “[TikTok] reads it as a reputational threat… The fact that they are exploiting people is the reputational threat.” TikTok declined to comment on the matter.

In January, as Motaung’s case was progressing, Meta attempted to sever ties with Sama and shift its outsourcing operations to Majorel, the same partner TikTok uses. The move would have resulted in the dismissal of 260 Sama moderators, but in March a judge issued an injunction preventing Meta from terminating its contract with Sama until the court could determine whether the layoffs violated Kenyan labor laws. In a separate lawsuit, Sama moderators alleged that Majorel had blacklisted them from applying for new Meta moderation roles in retaliation for their efforts to improve working conditions at Sama. In May, 150 outsourced moderators working for TikTok, ChatGPT, and Meta formed and registered the African Content Moderators Union.

Majorel, the company at the heart of the controversy, declined to comment on the situation.

The leaked TikTok documents show the company considering an independent audit of Majorel’s site in Kenya, as well as its sites in Morocco, where moderators work for both Meta and TikTok. Such an audit typically involves hiring an external law firm or consultancy to assess adherence to local labor laws and international human rights standards, and the memo suggests it could help blunt scrutiny from union representatives and the media. However, Paul Barrett, deputy director of the Center for Business and Human Rights at New York University, warns that audits are often mere gestures that bring about little substantive change.

Meta has conducted multiple audits in the past, including a 2018 assessment by the consultancy Business for Social Responsibility, commissioned in response to allegations that hate speech on its platform had contributed to genocide in Myanmar. Yet Meta has still not released the full, unredacted copy of its human rights impact report on India, commissioned in 2019 following accusations from rights groups about the erosion of civil liberties in the country.

Despite considering an audit, TikTok’s memo does not spell out how such an assessment would lead to tangible improvements in the working conditions of its outsourced moderators. Crider points out that the recommendations do not address fundamental issues such as giving moderators access to psychiatrists, allowing them to opt out of exposure to disturbing content, paying them more fairly for the hazards of the job, or implementing more stringent pre-screening procedures. Instead, she believes the focus is on the performance and appearance of doing something.

Barrett suggests that TikTok has an opportunity to take a more proactive approach than the companies that came before it, saying, “It would be very unfortunate if TikTok said, ‘We’re going to try to minimize liability, minimize our responsibility, and not only outsource this work, but outsource our responsibility for making sure the work that’s being done on behalf of our platform is done in an appropriate and humane way.’”

The outcome of the ongoing legal battles, and of any labor rights lawsuits brought against TikTok, will shape the future of content moderation and the responsibilities tech giants bear toward their outsourced workers. As the world becomes increasingly interconnected through social media, ensuring the well-being and fair treatment of the people who moderate user-generated content only grows in importance for the industry.