More than 140 Kenya Facebook moderators diagnosed with severe PTSD | Digital media

by Universalwellnesssystems

More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder due to exposure to graphic social media content including murders, suicides, child sexual abuse, and terrorism.

Dr. Ian Kananya, head of mental health services at Kenyatta National Hospital in Nairobi, found that the moderators, who worked eight to 10 hours a day at a Kenyan facility run by a company contracted by the social media firm, were suffering from PTSD, generalized anxiety disorder (GAD) and major depressive disorder (MDD).

The mass diagnoses were made as part of a lawsuit filed against Facebook’s parent company, Meta, and Samasource Kenya, an outsourcing company that used workers from across Africa to moderate Meta’s content.

Images and videos depicting necrophilia, bestiality and self-harm caused some moderators to faint, vomit, scream and flee their desks, according to the filing.

The case highlights the human cost of the boom in social media, which relies on moderators, often in some of the world’s poorest regions, to protect users from the worst of what people post.

At least 40 of the moderators in the case were misusing drugs including alcohol, cannabis, cocaine, amphetamines and sleeping pills. Some reported that their marriages had broken down, that their desire for sexual intimacy had collapsed, and that they had lost connection with their families. Some, whose job it was to remove videos uploaded by terrorist and rebel groups, feared they were being watched and targeted, and that they would be hunted down and killed if they returned home.

Facebook and other major social media and artificial intelligence companies rely on armies of content moderators to remove posts that violate community standards and train their AI systems to do the same.

According to the court filings, from 2019 to 2023 moderators from Kenya and other African countries were tasked with checking posts originating from Africa, in their own languages, but were paid eight times less than content moderators in the US.

A medical report submitted to the Employment and Labour Relations Court in Nairobi and seen by the Guardian paints a harrowing picture of working life inside the Meta-contracted facility, where workers were fed a constant stream of images to check in a cold, warehouse-like space, under bright lights, with their activity monitored to the minute.

Approximately 190 moderators are bringing a variety of claims, including allegations of intentional infliction of psychological harm, unfair employment practices, human trafficking, modern slavery and unlawful redundancy. All 144 people examined by Kananya were found to be suffering from PTSD, GAD or MDD, with 81% of cases exhibiting severe or extremely severe PTSD symptoms, even though most had left the job at least a year earlier.

Meta and Samasource declined to comment on the claims, citing the litigation.

Martha Dark, founder and co-executive director of Foxglove, the UK-based nonprofit supporting the lawsuit, said: “The evidence is indisputable: moderating Facebook is dangerous work that inflicts lifelong PTSD on almost everyone who does it.”

“In Kenya, it traumatised 100% of the hundreds of former moderators tested for PTSD… In any other industry, if it were discovered that 100% of safety workers were being diagnosed with an illness caused by their work, those responsible would be forced to resign and face the legal consequences for mass violations of people’s rights. That is why Foxglove is supporting these brave workers in seeking justice from the courts.”

According to submissions in the Nairobi case, Kananya concluded that the mental health conditions of the 144 people were primarily due to their work as Facebook content moderators, which exposed them to “gruesome murders, self-harm, suicides, attempted suicides, sexual violence, sexually explicit content, child physical and sexual abuse, and horrific violent actions, to name a few.”

Four of the moderators suffered from trypophobia, an aversion to or fear of repetitive patterns of small holes or bumps that can trigger intense anxiety, with some reportedly developing the condition after seeing holes in decomposing bodies while working on Facebook content.

Moderation, and the related task of tagging content, are often hidden parts of the tech boom. Similar, though less traumatic, arrangements see outsourced workers tag masses of images of everyday things such as street scenes, living rooms and roads, so that AI systems designed in California know what they are looking at.

Meta said it takes the support of content reviewers seriously. Its contracts with third-party moderators of content on Facebook and Instagram set out detailed expectations for counselling, training, 24-hour on-site support and access to private healthcare. Meta said that pay was above industry standards in the markets where its contractors operate, and that it uses techniques such as blurring, muting sound and rendering in black and white to limit exposure to graphic material for people reviewing content on the two platforms.
