Leaked documents reveal Meta knew Instagram was pushing girls towards content that harmed their mental health

by Universalwellnesssystems

If Elon Musk’s Twitter saga and the Twitter Files have shown us anything, it is that content moderation on social media platforms is not easy. Platforms like Instagram and Facebook need to strike a balance between making users’ feeds as engaging as possible and keeping users, especially susceptible ones, away from harmful content. This is where most social media platforms fail miserably.

Whether on Twitter, Instagram or Facebook, content moderation is profit-driven and, as the Twitter Files show, often guided more by ideology than by set policy. Instagram, in particular, is frequently accused of putting profit first when moderating content. Image Credit: AFP

Previously unpublished documents leaked from Meta reveal that people who headed the company, back in the days when it was still called Facebook, knew Instagram was steering young teenage girls towards dangerous and harmful content, and did nothing to stop it.

The documents reveal how an Instagram employee, researching the platform’s algorithm and recommendations, posed as a 13-year-old girl looking for diet tips. Instead of surfacing content from medical or qualified fitness professionals, the algorithm chose to show viral “adjacent” topics that garnered more engagement. Those adjacent topics turned out to be content about anorexia. The account was graphically guided by content and recommendations to follow accounts with names like “skinny binge” and “apple core anorexic.”

Meta knew that about 33 per cent of all teens felt worse about their bodies because of the app’s recommended content and the algorithms Instagram uses to curate users’ feeds. Instagram was also aware that teens who used the app experienced higher rates of anxiety and depression.

This isn’t the first time Instagram’s algorithm and the content it pushes to users have been the subject of controversy among mental health professionals and advocates. In the case of Molly Russell, a 14-year-old British girl who took her own life in 2017, a British coroner officially cited Instagram in her cause of death.

In Russell’s case, one of the key areas the inquest focused on was the thousands of posts encouraging self-harm that Molly had seen on platforms like Instagram and Pinterest before her death. Coroner Andrew Walker concluded that Russell’s death could not be recorded as a suicide. Instead, he described her cause of death as “self-harm while struggling with depression and the negative effects of online content.” He said he was deeply concerned by the content she had saved, some of which he found almost impossible to watch.

The platforms, Walker said, used algorithms that, in some circumstances, resulted in binge periods of images, video clips and text that “romanticize self-harm” and isolated her from people who could have helped.

Cases like this have sparked debate about the content moderation policies of social media platforms and how they are enforced in practice. Attorney Matt Bergman launched the Social Media Victims Law Center after reading the Facebook Papers published last year by whistleblower Frances Haugen. He is currently working with more than 1,200 families pursuing lawsuits against social media companies.

“Time and again, when these companies are given the opportunity to choose between children’s safety and their own interests, they always choose their interests,” Bergman said in an interview with a US news agency. He argues that the very design of social media platforms ultimately hurts children.

“They designed the product to be addictive on purpose,” Bergman said. “They understand that the longer kids stay online, the more money they can make. It doesn’t matter how harmful the material is.” He contends the platforms are deliberately designed this way and calls for better age and identity verification protocols.

Antigone Davis, Meta’s global head of safety, said, “We want teens to be safe online,” adding that Instagram does not allow content that “encourages self-harm and eating disorders.” Davis also said Meta has improved Instagram’s “age verification technology.”

Several activists and advocates are of the opinion that content moderation across platforms needs an overhaul. The broader consensus is that social media platforms need independent moderation councils and that content itself should be regulated. Some go further and argue that larger, global bodies should set policies for content moderation.

Taking content moderation away from the platforms and assigning an independent council to oversee the moderation policies of all social media platforms opens a whole new can of worms. For example, it would make it much easier for a regime to suppress political dissidents and unfavourable news, which is exactly what the Twitter Files set out to show. However, the fact remains that content moderation as we know it is broken and needs to be fixed.
