

Article

27 Apr 2025

Author:
Claire Wilmot & Rachel Hall, The Bureau of Investigative Journalism

Ghana: East African content moderators at Meta's newly outsourced Teleperformance hub report extreme mental health issues, suicide attempts & dismissals; incl. co. comments


"Suicide attempts, sackings and a vow of silence: Meta's new moderators face worst conditions yet,"

A suicide attempt, depression, substance abuse, insomnia, surveillance, threats. These are just some of the experiences reported by the low-paid moderators tasked with sifting through Facebook and Instagram’s most disturbing images.

The tech giant Meta, which owns both platforms, has kept the whereabouts of this operation a closely guarded secret since moving it from Kenya, where the company is facing lawsuits over working conditions and human rights. For months, it has also refused to name the company that won the lucrative contract...

Now, the Bureau of Investigative Journalism (TBIJ) and the Guardian can reveal that Meta has moved this business to a new operation in the Ghanaian capital of Accra – where working conditions are said to be worse in almost every way. The company employing the moderators is Teleperformance, a French multinational with a history of controversy around workers’ rights.

Based in an anonymous office building, the 150 or so moderators spend their days reviewing content flagged on Facebook, Instagram and Messenger, including videos of extreme violence and child abuse. They say they are forced to work at a gruelling pace in order to meet a series of opaque targets that dictate whether or not they are able to scrape by in Accra.

Employees say they have developed mental illnesses and at least one was fired after advocating for better conditions. Many have come from abroad and some told lawyers that they had not spoken out for fear of being forced to return to conflict zones. They say they are instructed to tell no one, not even their families, that they are moderating Meta’s content.

...

Teleperformance told TBIJ: “We have robust people management systems and workplace practices, including a robust wellbeing program staffed by fully licensed psychologists to support our content moderators throughout their content moderation journey”. It said that “during the interview process, within the employee contract and through employee training and resilience testing, [it is] fully transparent with its prospective moderators regarding the content they might see during their work to keep the internet safe”.

Meta told TBIJ that it takes the support of content reviewers seriously and that its contracts with outsourcing companies contain expectations including counselling, training and other support.

...