Reuben Das | Dec 12, 2022 14:59:03 IST
If Elon Musk’s Twitter saga and the Twitter Files have shown us anything, it is that content moderation on social media platforms is not easy. Platforms like Instagram and Facebook need to strike a balance between making users’ feeds as engaging as possible and keeping users, especially vulnerable ones, away from harmful content. This is where most social media platforms fail miserably.
According to previously unpublished documents leaked from Meta, the people who ran the company back when it was still called Facebook knew that Instagram was steering young teenage girls toward dangerous and harmful content, and did nothing to stop it.
The documents reveal how an Instagram employee posed as a 13-year-old girl looking for diet tips in order to research Instagram’s algorithm and its recommendations. Instead of showing the user content from medical or qualified fitness professionals, the algorithm chose to surface adjacent viral topics that garnered more engagement. These “adjacent” viral topics turned out to be content about anorexia. Through graphic content and recommendations, the user was guided to follow accounts with names like “skinny binge” and “apple core anorexic.”
It is a known fact that about 33 per cent of all teens feel bad about their bodies because of the content the app recommends and the algorithms Instagram uses to curate its users’ feeds. Instagram was also aware that teens who used the app experienced higher rates of anxiety and depression.
This isn’t the first time Instagram’s algorithm and the content it pushes to users have been the subject of controversy among mental health professionals and advocates. Instagram was officially listed as a cause of death by a British coroner in the case of Molly Russell, a 14-year-old girl who died in 2017.
In the Molly Russell inquest, one of the key areas the proceedings focused on was the fact that Molly had viewed thousands of posts encouraging self-harm on platforms like Instagram and Pinterest before her death. Coroner Andrew Walker concluded that Russell’s death could not be ruled a suicide. Instead, he described her cause of death as “self-harm while struggling with depression and the negative effects of online content.” He added that he was deeply troubled by the content she had viewed and saved, some of which he found almost impossible to watch.
The platforms used algorithms that, in some circumstances, resulted in “periods of bingeing of images, video clips, and text,” content that “romanticised self-harm,” and isolation from others who “could have helped,” Walker said.
Cases like this have sparked debate about the content moderation policies of social media platforms and how they are enforced in practice. Attorney Matt Bergman launched the Social Media Victims Law Center after reading the Facebook Papers published last year by whistleblower Frances Haugen. He is currently working with more than 1,200 families pursuing lawsuits against social media companies.
“Time and again, when these companies are given the opportunity to choose between children’s safety and their own interests, they always choose their interests,” Bergman said in an interview with a US news agency. He argues that the design of social media platforms ultimately hurts children.
Money & Control
Facebook and Instagram allow celebrities and politicians to break their content moderation rules. #1A daily mail online https://t.co/b0h1xlDmsp
— Cyberchick (@warriors_mom) December 7, 2022
“They designed the product to be addictive on purpose,” Bergman said. “They understand that the longer kids stay online, the more money they can make. It doesn’t matter how harmful the material is.” He argues that the platforms are deliberately designed this way, and calls for better age and identity verification protocols.
Antigone Davis, Meta’s global head of safety, said, “We want teens to be safe online,” adding that Instagram does not allow content that “encourages self-harm and eating disorders.” Davis also said Meta has improved Instagram’s “age verification technology.”
Several activists and advocates are of the opinion that content moderation across platforms needs an overhaul. The broader consensus is that social media platforms need independent moderation councils and that content itself should be regulated; some argue that larger, global bodies that set content moderation policies are necessary.
Hopefully this dispels the notion that objectivity and standards apply to content moderation at Twitter, or anywhere else. They just make it up and call you a fool for questioning it. https://t.co/Vy3jexpiTe
— Rachel Bovard (@rachelbovard) December 11, 2022
Today I filed my first amended complaint against @twitter, @Instagram and @meta for their blatant failure to enforce their own content moderation policies against the Ayatollah and his terrorist regime. https://t.co/WhSrGZyWtO
— Rumi Parsa (@rumiparsa) December 5, 2022
Taking content moderation away from the platforms and assigning an independent council to oversee the moderation policies of all social media platforms opens a whole new can of worms. For example, it would make it much easier for a regime to suppress political dissidents and news unfavourable to it, which is exactly what the Twitter Files are trying to show. However, the fact remains that content moderation as we know it is broken and needs to be fixed.