
California sues Facebook parent Meta over alleged harm to young people

Meta owns Facebook and Instagram.
(Richard Drew / Associated Press)

California and 32 other states on Tuesday sued Facebook parent company Meta over allegations that it “designed and deployed harmful features” on its flagship social network and on Instagram despite knowing about the mental health risks to young people.

“Meta has harnessed its extraordinary innovation and technology to lure youth and teens to maximize use of its products,” state Atty. Gen. Rob Bonta said at a news conference in San Francisco. “In seeking to bolster profits, Meta has repeatedly misled the public about the substantial dangers of its products.”

The 233-page lawsuit, filed in a federal court in Northern California, alleges the social media giant violated state consumer protection laws and a federal law aimed at safeguarding the privacy of children younger than 13. Other states, including Florida, Utah and Vermont, filed separate lawsuits. In all, 41 states and Washington, D.C., took legal action against Meta.


The legal actions highlight how states are trying to address potential mental health dangers exacerbated by social media platforms, including body image issues, anxiety and depression. At a separate news conference, a bipartisan group of state attorneys general, including those from Colorado, Tennessee, New Hampshire and Massachusetts, compared Meta to the tobacco industry.

“It seems to be part of a corporate playbook where there is knowledge about harms to the public and it is hidden and lied about,” Bonta said.

In 2021, state attorneys general from across the nation began investigating Meta’s promotion of its photo- and video-sharing platform Instagram to children and young people. Advocacy groups, lawmakers and parents have criticized Meta, alleging the multibillion-dollar company hasn’t done enough to combat content about eating disorders, suicide and other potential harms to users.


As part of the investigation, the state attorneys general examined Meta’s strategies for compelling young people to spend more time on its platform. Those tactics include letting users scroll infinitely through posts, luring teens to log on with constant notifications and enticing them to return to view content before it vanishes in 24 hours. The lawsuit also alleges that Meta failed to address the platform’s harms even though its internal research showed the platform was potentially dangerous to teens. Features such as the “like” button can lead teens to compare the popularity of their posts with others’, and beauty filters can promote body dysmorphia, the lawsuit alleges.

Meta said it’s committed to keeping teens safe, noting it has rolled out more than 30 tools to support young people and families.

“We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” a Meta spokesperson said in a statement.


Scrutiny over Meta’s potential damage to the mental health of young people intensified in 2021 after Frances Haugen, a former Facebook product manager, disclosed tens of thousands of internal company documents. Some of those documents included research that showed Facebook is “toxic for teen girls,” worsening body image issues and suicidal thoughts, the Wall Street Journal reported in 2021. Meta said that its research was “mischaracterized,” and that teens also reported Instagram made them feel better about other issues such as loneliness and sadness.

That year, executives from the social media company including Instagram head Adam Mosseri testified before Congress. Instagram then paused its development of a kids’ version of the app and rolled out more controls so parents could limit the amount of time teens spend on it. Social media apps like Instagram require users to be at least 13 years old, but children have lied about their age to access the platform.

Families in various states have sued Meta, blaming Instagram for worsening eating disorders and increasing suicidal thoughts among teenage girls. However, those legal actions have been impeded because Section 230 of the 1996 Communications Decency Act shields online platforms from being held liable for content posted by users. In California, tech companies and industry groups have also sued to block new state laws aimed at protecting child safety and promoting transparency about content moderation. While those other lawsuits are ongoing, Bonta said it’s possible the latest legal actions could help families receive monetary relief.

From the sale of deadly drugs to child sexual abuse images, social media can pose dangers. Lawmakers are targeting the platforms’ algorithms, designs and features amid calls to hold tech companies accountable for safety risks.

Through the lawsuit, California and other states are hoping to change the practices of social media companies. Platforms such as Meta could change default settings and limit how much time young people spend on the apps, Bonta said. They could also tweak how they’re recommending content to teens, which can pull young people down a rabbit hole of harmful videos and images.

The lawsuit accuses Meta of violating a federal children’s privacy law. According to the complaint, Meta collects personal data from children without parental consent even though it promotes children’s content and knows some users are younger than 13. For example, the lawsuit states, Meta launched an ad campaign to direct teens to Instagram, which also hosted “child-oriented” content about Sesame Street, Lego and Hello Kitty.

While more young people have defected from Facebook, Instagram remains popular among U.S. teens, according to a Pew Research Center survey released this year. About 62% of teens reported using Instagram in 2022. TikTok and Snapchat are also commonly used by teens.


About 22 million U.S. teens log on to Instagram every day, according to the lawsuit.

The amount of time teens spend on social media has been a growing concern, especially as platforms use algorithms to recommend content they think users want to view. In 2022, attorneys general across the country started investigating TikTok’s potential harm to young people as well.

As social media platforms face more lawsuits that can drag on for years, technology continues to evolve rapidly. Meta has been doubling down on virtual reality and artificial intelligence that can generate content.

Bonta said amending the complaint in the future is a step state attorneys general might consider.

“We’re focused on the well-documented practices that have caused the harm that leads us to today,” he said.

