
Meta will limit reach of harmful content to teens on Facebook and Instagram amid scrutiny

Meta said it will restrict inappropriate content for teenagers’ accounts on Instagram and Facebook, such as posts about suicide, self-harm and eating disorders.
(Thibault Camus / Associated Press)

Facing increased scrutiny over its social networks’ effects on teenage users, Meta announced Tuesday that teens on Facebook and Instagram will see less content related to self-harm and eating disorders.

Meta already filters such content out of the feeds it recommends to users, such as Instagram’s Reels and Explore. But under a set of changes rolling out over the next few months, harmful posts and stories won’t be shown to teens “even if [they’re] shared by someone they follow,” the company said in a statement.

The harmful topics include suicide, self-harm, eating disorders, restricted goods — including firearms, drugs and sex toys — and nudity.


Another change will automatically set users under 18 to the most restrictive content recommendation settings, with the goal of making it less likely that harmful content will be recommended to them by Meta’s algorithms. It’s not clear, however, whether teens could simply change their settings to remove the restrictions.


The company says the apps’ search functionality will also be limited for queries related to harmful topics. Instead of surfacing the requested content, the apps will direct users to resources for getting help when they search for material related to suicide, self-harm and eating disorders.

Teen users will also be prompted to update their privacy settings, the statement said.

The changes are necessary to help make “social media platforms [into] spaces where teens can connect and be creative in age-appropriate ways,” said Rachel Rodgers, an associate professor in the Department of Applied Psychology at Northeastern University.


Facebook and Instagram have been tremendously popular with teenagers for years. The platforms have drawn concern from parents, experts and elected officials over their effects on younger users, in part because of what those users see and in part because of the amount of time they spend on the networks.

Experts convened by the National Academies say they didn’t find evidence to support broad restrictions on young people’s access to social media.

U.S. Surgeon General Vivek Murthy warned in May that because the effects of social media on kids and teens were largely unknown, the companies needed to take “immediate action to protect kids now.”

In October, California joined dozens of other states in a lawsuit against Meta claiming that the company used “psychologically manipulative product features” to attract young users and keep them on the platforms for as long as possible.


“Meta has harnessed its extraordinary innovation and technology to lure youth and teens to maximize use of its products,” state Atty. Gen. Rob Bonta said in a news conference announcing the suit.

In November, an unredacted version of the lawsuit revealed an allegation that Mark Zuckerberg vetoed a proposal to ban camera filters that simulate the effects of plastic surgery from the apps, despite concerns that the filters could be harmful to users’ mental health.

After the unredacted complaint was released, Bonta was more emphatic. “Meta knows that what it is doing is bad for kids — period,” he said in a statement, adding that the evidence is “there in black and white, and it is damning.”

The Associated Press contributed to this report.
