FILE – This photo shows the Facebook and Instagram mobile app logos on October 5, 2021, in New York. (AP Photo/Richard Drew, File)
SAN FRANCISCO (AP) — Meta said Tuesday it will begin hiding inappropriate content from teens' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.
The social media giant, based in Menlo Park, California, said in a blog post that it already aims to avoid recommending such "age-inappropriate" material to teens, but that going forward this content will no longer appear in teens' feeds, even when it is shared by an account they follow.
"We want teens to have safe, age-appropriate experiences on our apps," Meta said.
Teen users — provided they did not lie about their age when signing up for Instagram or Facebook — will have their accounts placed on the platforms' most restrictive settings, and they will be blocked from searching for potentially harmful terms.
"Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story, and can help destigmatize these issues, but it's a complex topic and isn't necessarily suitable for all young people," Meta said. "Now, we'll start to remove this type of content from teens' experiences on Instagram and Facebook, as well as other types of age-inappropriate content."
Meta's announcement comes as the company faces lawsuits from dozens of U.S. states accusing it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to its platforms.
Critics said Meta's move does not go far enough.
"Meta's announcement today is yet another desperate attempt to avoid regulation, and it does little to protect children from online harm on Instagram," said Josh Golin, executive director of Fairplay, a children's online advocacy group. "It is also an incredible slap in the face to parents who have lost their children. If the company is capable of hiding content that promotes suicide and eating disorders, why did it wait until 2024 to announce these changes?"