A polarized Congress is forming a rare bipartisan consensus to protect the mental health of young people with a slew of bills aimed at regulating tech companies.
The growing effort by lawmakers to target social media companies comes alongside warnings from the U.S. Surgeon General about the impact of platforms on the mental health of minors. In May, Dr. Vivek Murthy called on tech companies and lawmakers to take “immediate action” to protect the mental health of children, issuing a public health advisory highlighting the impact of social media use on young people in this country.
In a report released in February, the Centers for Disease Control and Prevention found that teenage girls were experiencing record-high levels of violence, sadness and suicide risk, with nearly three in five reporting persistent sadness and hopelessness in 2021. That figure represents a nearly 60% increase and the highest level in the past decade.
While addressing the growing mental health crisis in the United States, especially among young people, is a priority in Washington, there is still no consensus on how to regulate social media companies. Bills aimed at holding tech companies accountable through content moderation requirements and fines have received more support than bills banning minors from using the platforms.
Lawmakers seek to hold tech companies accountable
Lawmakers seeking to ban children from certain social media platforms entirely have faced some criticism.
“We don’t believe that legislation that keeps children off social media is necessarily realistic, and more than that, we believe those bills fundamentally put the problem back on the shoulders of parents and young people,” Josh Golin, executive director of the children’s safety organization Fairplay, told USA TODAY.
“What we need is legislation that really changes how these platforms engage with young people,” he added.
One of those bills is the Kids Online Safety Act, also known as KOSA, a bipartisan effort reintroduced earlier this year by Tennessee Republican Sen. Marsha Blackburn and Connecticut Democratic Sen. Richard Blumenthal.
According to a summary of the bill, it would “provide families with the tools, safeguards and transparency they need to protect against threats to their children’s health and well-being online,” and would require platforms to “put the interests of children first.”
According to the bill’s text:
- Social media platforms would be required to give minors options to protect their information, disable addictive product features such as rewards for time spent on the platform and media autoplay, and opt out of algorithmic recommendations.
- Parents would gain new controls to help identify harmful behavior, including a dedicated channel for reporting harmful content to the platform.
- Platforms would be required to conduct annual independent audits assessing risks to minors, compliance with the law, and whether the platform has taken steps to prevent harmful effects, including sexual exploitation and abuse.
- Social media platforms would be responsible for preventing and mitigating harm to minors, including content promoting violence, drugs and alcohol, and would face penalties for violations.
Advocates, including Golin, have rallied in support of the Children and Teens’ Online Privacy Protection Act, also known as COPPA 2.0.
The bill is intended to update current online data privacy rules to help combat the youth mental health crisis by outlawing the targeting of minors through algorithms and harmful content, according to a May press release.
According to data from the 2021 Youth Risk Behavior Survey, the CDC found the most significant increases in mental health problems among teenage girls, but teens across the board reported increases, including in experiences of violence and suicidal thoughts and behaviors.
According to the bill, COPPA 2.0 would prohibit technology companies from collecting information on users between the ages of 13 and 16 without their consent, building on the existing Children’s Online Privacy Protection Act (COPPA).
The bill would also:
- Prohibit advertising targeted at minors.
- Require companies to allow users to delete minors’ personal information where possible.
- Establish a Digital Marketing Bill of Rights for Youth that limits the collection of personal information from young people.
Problems that Congress cannot solve on its own
Despite widespread support for congressional action, some parents and educators, including Chicago high school teacher Max Bean, are not convinced it will be sufficient.
For Bean, exposure to harmful content and data privacy are not the primary concerns. Rather, he worries about social media replacing face-to-face interaction.
“The problem here is a complete shift in how humans interact, and removing harmful content doesn’t solve that problem. I think humans need to interact face to face,” Bean, 41, told USA TODAY, adding that alongside legislation, there needs to be a social shift in how social media is used.
The high school math and physics teacher is skeptical that Congress’ efforts can make a difference on their own, but he sees Washington’s actions as a first step in addressing the harmful effects of social media.
“I don’t think Congress can solve this problem alone. But if Congress takes action, it will encourage others to act, and I think that’s a step,” Bean said.
Bean’s perspective is shared by Kylan Carr, a 39-year-old mother of two from Bakersfield, California.
“I think Congress has a role in this conundrum,” Carr told USA TODAY. “I feel Congress needs to step in and hold these tech giants accountable, but I also feel we as a society need to de-emphasize social media.”