CNN —
Facebook’s parent company, Meta, has been accused of inflicting “potentially lifelong trauma” on hundreds of content moderators in Kenya, after more than 140 of them were diagnosed with PTSD and other mental health conditions, activists say.
The diagnoses were made by Dr. Ian Kanyanya, head of mental health services at Kenyatta National Hospital in Nairobi, Kenya’s capital, and were filed with the city’s Employment and Labour Relations Court on December 4.
The medical reports were filed with the court by the law firm Nzili & Sumbi Advocates as part of an ongoing lawsuit against Meta and Samasource Kenya, an outsourcing company contracted to review content for the tech giant.
Content moderators help technology companies weed out disturbing content on their platforms, and are routinely managed by third-party firms, often in developing countries. Critics have long raised concerns about the work’s toll on moderators’ mental health.
Meta declined to comment on the medical reports because of the ongoing litigation, but said it takes the support of moderators seriously and that its contracts with third-party firms spell out expectations for counseling, training and fair pay.
A Meta spokesperson added that moderators can customize the company’s “content review tools” so that, for example, graphic content appears blurred or in black and white.
Samasource, now known as Sama, did not respond to requests for comment.
Kanyanya said the moderators he assessed “encountered extremely graphic content on a daily basis, which included videos of gruesome murders, self-harm, suicides, attempted suicides, sexual violence, sexually explicit content, child physical and sexual abuse, and horrific violent actions, just to name a few.”
Of the 144 content moderators who volunteered to undergo psychiatric assessments (out of the 185 involved in the legal claim), 81% were classed as suffering from “severe” PTSD, Kanyanya said.
The class action is an outgrowth of an earlier lawsuit, filed in 2022 by a former Facebook moderator who claims he was unlawfully fired by Samasource Kenya after organizing protests against unfair working conditions, according to Foxglove, a British non-profit organization supporting the case.
Last year, all 260 content moderators working at Samasource Kenya’s moderation hub in Nairobi were fired as “punishment” for raising concerns about pay and working conditions, Foxglove said.
According to court documents, the moderators involved in the case worked for Samasource Kenya from 2019 to 2023.
In one medical record reviewed by CNN, a content moderator said he frequently woke in a cold sweat from nightmares tied to the graphic content he had reviewed at work, and suffered frequent breakdowns, vivid flashbacks and paranoia as a result.
Another former content moderator said she developed a fear of seeing dotted patterns, known as trypophobia, after viewing an image of maggots crawling out of a decomposing human hand.
“Moderating Facebook is dangerous, even deadly, work that can inflict lifelong PTSD on almost everyone who moderates it,” said Martha Dark, co-executive director of Foxglove.
“In Kenya, 100% of the hundreds of former moderators tested for PTSD were traumatized… Facebook is responsible for leaving hundreds of people, usually young people who have just finished their education, with potentially lifelong trauma,” she said in a statement provided to CNN on Friday.
Dark argued that if these diagnoses had been made in any other industry, those responsible would be “forced to resign and face legal liability for massive violations of people’s rights.”
This isn’t the first time content moderators have taken legal action against social media giants, claiming the work traumatized them.
In 2021, a TikTok content moderator sued the platform, claiming she suffered psychological trauma as a result of her work.
The following year, TikTok was hit with another lawsuit from former content moderators.