To protect teens, NC lawmakers propose new social media regulations

A bipartisan group of lawmakers proposed new regulations for how social media platforms such as Facebook and Instagram interact with minors.
Posted 2023-04-19T22:08:23+00:00 - Updated 2023-04-19T22:08:23+00:00
The logo of Meta Platforms' business group is seen in Brussels, Belgium December 6, 2022. Meta is testing a subscription service which will allow Instagram and Facebook users to pay to get verified, Mark Zuckerberg announced on Instagram Sunday. (Yves Herman/Reuters)

Children in North Carolina may soon receive more social media protections.

A bipartisan group of lawmakers on Wednesday proposed new regulations for how social media companies such as Facebook and Instagram interact with minors on their platforms. State Reps. Jeff McNeely, R-Iredell, and Mujtaba Mohammed, D-Mecklenburg, stood with youth advocates during a press conference and said their bill is aimed at curbing social media addiction and the negative side effects that come with it.

“Unhealthy use of social media can be found to lead to depression, anxiety, eating disorders and even suicidal thoughts and actions. So this has become almost an epidemic,” McNeely said.

The effort is supported by the NC Young People’s Alliance, a student-led group advocating for youth issues across NC, as well as the NC Association of Educators and other groups.

House Bill 644 would prohibit social media platforms from targeting minors with advertisements that aren’t based on the user’s internet searches. It would allow a young person’s data to be used or shared only if the person consents. It would also require the companies to introduce an age verification system to their platforms.

A spokesperson for Meta, which owns Facebook and Instagram, provided a statement saying the company is already taking steps to protect teens on its website.

“We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks,” the Meta statement said. “We don’t allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us.”

The company added that it plans to continue working closely with lawmakers on the issue.
