
'Meta knew it was harming our kids': Stein explains why he and other AGs sued Instagram parent company

North Carolina Attorney General Josh Stein filed a lawsuit against Meta, the company that owns Instagram, alleging that it purposefully designed the social media platform to hook children and teenagers.
Posted October 24, 2023, 18:52 UTC. Updated October 24, 2023, 23:30 UTC.
NC AG Josh Stein speaks about how he's trying to help protect kids online

North Carolina and 32 other states sued Meta Platforms Inc., alleging that the social media company has contributed to a youth mental health crisis by knowingly designing features on Instagram and Facebook that addict children to its platforms.

The lawsuit, filed in federal court in California, also claims that Meta routinely collects data on children younger than 13 without their parents' consent, in violation of federal law.

“The more time kids spend on the app, the more addicted our kids become and the more money Meta makes,” North Carolina Attorney General Josh Stein said Tuesday during a news conference at the Charlotte-Mecklenburg Government Center, which was streamed on the Facebook page of the state Department of Justice and attorney general’s office.

“Meta knew it was harming our kids,” Stein added. “And Meta lied to parents and the public about the risk its social media platforms posed against our young people.”

In a statement, Meta said it shares “the attorneys general’s commitment to providing teens with safe, positive experiences online, and have already introduced over 30 tools to support teens and their families.”

“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company added.

Stein discussed his reasons for filing the lawsuit in an interview with WRAL News. He said his office has been investigating Meta for two years.

"Meta, through Instagram, didn't look as children as young people to be protected," Stein said. "They looked at them as dollar signs to be exploited."

Stein said young people are experiencing depression, anxiety, drug misuse, addiction, self-injury and suicide at alarming rates, and he called social media a root cause of what the U.S. Surgeon General describes as a youth mental health crisis.

“Instagram and other social media platforms are keeping kids on the apps for hours at a time,” Stein said. “It is not an accident.

“It is the goal of Meta … because the longer the kids are on the apps, the more data Meta can take from them and the more ads that Meta can serve them,” he said. “In other words, the more money that Meta can make from them.”

The broad-ranging lawsuit is the result of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont.

"The best tool that we have, as attorney generals, is to identify companies that break the law, take them to court and hold them accountable," Stein said.

The lawsuit follows damning newspaper reports, first by The Wall Street Journal in the fall of 2021, based on Meta's own internal research showing the company knew about the harms Instagram can cause teenagers — especially teen girls — when it comes to mental health and body image issues. One internal study cited 13.5% of teen girls saying Instagram makes thoughts of suicide worse and 17% of teen girls saying it makes eating disorders worse.

Stein said Meta designed Instagram to “induce its users to spend more and more of their lives clicking and scrolling.”

“Sadly, it’s working,” Stein said. “About 22 million teens log on to Instagram in the United States every day, some for extreme lengths of time.”

The use of social media among teens is nearly universal in the U.S. and many other parts of the world. Up to 95% of youth ages 13 to 17 in the U.S. report using a social media platform, with more than one-third saying they use social media "almost constantly," according to the Pew Research Center.

To comply with federal regulation, social media companies ban kids under 13 from signing up to their platforms — but children have been shown to easily get around the bans, both with and without their parents' consent, and many younger kids have social media accounts.

Other measures social platforms have taken to address concerns about children's mental health are also easily circumvented. For instance, TikTok recently introduced a default 60-minute time limit for users younger than 18. But once the limit is reached, minors can simply enter a passcode to keep watching.

“They knew the content they were exposing kids to was dangerous,” Stein said of Meta. “They could’ve chosen to stop, to revisit their policies and the platform designs, to do better by children and families, but they didn’t because they wanted to make another dollar.”

Stein, a Democratic candidate for governor, said the concern is not just the time kids are spending on social media, but the content as well, including sexually explicit material, bullying, eating disorders and drug misuse.

On Tuesday, parent Sherry Johnson spoke alongside Stein.

“Even with families that are aware and super vigilant, children can be exposed to so much more than their families are actually aware of,” Johnson said. “And, kids are not in a place to know what that sort of exposure is doing to them.”

The Associated Press contributed to this report.
