
California lawmakers want to make social media safer for young people. Can they finally succeed?

Samuel Chapman of Los Angeles, whose teenage son died after taking a fentanyl-laced drug he got through Snapchat, is urging California lawmakers to crack down on social media platforms.
(Christina House/Los Angeles Times)

Samuel Chapman had no idea that drug dealers targeted teens on Snapchat until his 16-year-old son died from a fentanyl overdose.

“We thought it was like a playground for kids and didn’t think of it, as I do now, as the dark web for kids,” the Los Angeles resident said.

In 2021, a drug dealer reached out to his son, Sammy, on the disappearing messaging app and showed the teen a “colorful drug menu” that offered cocaine, Chapman said. After Chapman and his wife fell asleep, the dealer delivered drugs to their house “like a pizza.” Sammy unknowingly took fentanyl and died in his bedroom.


For parents like Chapman, the horrific ordeal underscored social media’s dangerous side. Tech platforms help people keep in touch with family and friends, but they also attract drug dealers, pedophiles and other predators. Plus, social media algorithms can steer young people to posts that could trigger eating disorders or self-harm.

Sammy Chapman with his mom, Laura Berman. Sammy died of a drug overdose in 2021 at age 16.
(Family photo)

Efforts by California lawmakers to crack down on social media’s potential harms stalled in the past amid fierce opposition from multibillion-dollar tech giants, and renewed attempts could suffer the same fate. Last year, the tech industry used its lobbying power in Sacramento to kill one social media bill, and its deep pockets to block another by filing a lawsuit after it was signed into law.

This year, new social media bills face critical votes in the coming weeks as lawmakers race toward the end of the legislative session next month. Senate Bill 680 would allow the government to prosecute platforms for promoting harmful content about eating disorders, self-harm, illegal firearms and drugs like fentanyl. Lawmakers are trying to combat online child sex abuse material too, but they’re bracing for the possibility that tech companies will attempt to block new online safety laws like they have in the past.

“I want social media companies to take responsibility. I want to hold them accountable for the harm that they’re causing for our youth,” said Sen. Nancy Skinner (D-Berkeley), who wrote SB 680.


Social media platforms such as Meta-owned Instagram, Twitter, Snapchat and TikTok have removed some harmful content, but parents, teens and lawmakers say they need to do more to make their services safer for young people.

Chapman and his wife, TV relationship therapist Laura Berman, are among parents supporting legislation that would prohibit social media companies from using a design, algorithm or feature that the companies know or should have known causes a user under the age of 16 to harm themselves or others, develop an eating disorder or experience addiction. Chapman, who sued Snap Inc., the parent company of Snapchat, along with other families, said drug dealers have used the app’s features — location sharing and friends’ recommendations — to find teens.

Samuel Chapman sued the parent company of Snapchat, which he says shares blame for his son’s overdose death.
(Christina House/Los Angeles Times)

The legislation is already facing pushback from tech industry groups and privacy advocates. Opponents say some of the bills would run afoul of the 1st Amendment and federal law, and argue that tech platforms already are taking safety issues seriously and offering parents the tools needed to protect their children.

NetChoice, a trade association whose members include Facebook parent Meta, Snap, Twitter, TikTok, Google and Pinterest, said legislators should focus on giving law enforcement the resources needed to fight crimes while tech companies should better educate parents and teens about their tools. Social media apps have parental controls and ways to limit screen time.

“The last thing that we want is the government mandating that technology be responsible for educating and raising our children, because that’s the role of parents and teachers,” said Carl Szabo, NetChoice’s vice president and general counsel.


Snap said it’s taken several steps to combat drug sales on its platform, including using technology to detect drug activity before it’s reported, blocking search results for drug-related terms and making product changes aimed at making it tougher for strangers to connect with minors.

“Fentanyl is finding its way into every major city across America, and our hearts go out to the thousands of families impacted by this growing national crisis. At Snap, we are working hard to stop dealers from abusing our platform,” Rachel Racusen, a spokeswoman for the company, said in a statement.

But with concerns about social media’s impact on mental health piling up, lawmakers are feeling a sense of urgency to act. In May, the U.S. surgeon general said there isn’t enough evidence to determine if social media is “sufficiently safe” for children and adolescents. While there may be benefits to social media, the platforms can also pose mental health risks, a 2021 surgeon general’s advisory said.

Sophie Szew, a Stanford University student and activist, downloaded Instagram when she was 10 years old, unaware of how it would affect her mental health. Instagram requires users to be at least 13, but that hasn’t stopped underage users from signing up.

On Instagram, Szew said she was bombarded with content that promoted and taught eating disorder behaviors. Szew, who was diagnosed with anorexia, said that while social media has allowed her to promote her advocacy work, she still comes across this harmful content.

Social media platforms use algorithms to determine what content users see first, drawing on a variety of signals such as how long they watch a video and whether they comment on, share or “like” a post.


“The algorithms are designed to trigger because triggering content is addictive,” said Szew, who supports Skinner’s bill. “We are likely to engage with content that shocks us, and we see this a lot with young people.”

Instagram removes content it finds that promotes or encourages eating disorders, but the app allows people to share their own experiences with self-image and body acceptance.

Meanwhile, social media use among young people continues to grow. Google-owned YouTube is the most popular social media platform among U.S. teens, followed by TikTok, Instagram, Snapchat and Facebook, according to a survey from the Pew Research Center. Almost 35% of U.S. teens reported being on at least one of the five apps “almost constantly.”

Skinner's initial bill faced legal hurdles before the Senate voted on it, so she decided to make more changes and introduce another bill, SB 680. One of those changes was how the bill defines harm.

Suicide prevention and crisis counseling resources

If you or someone you know is struggling with suicidal thoughts, seek help from a professional and call 9-8-8, the United States’ first nationwide three-digit mental health crisis hotline, which connects callers with trained mental health counselors. Text “HOME” to 741741 in the U.S. and Canada to reach the Crisis Text Line.

Under SB 680, a social media company would have caused a child harm if, as a result of the platform’s design, algorithm or feature, the app sends the child information about how to get a firearm, obtain a controlled substance or die by suicide. It would be considered harm if the child then gets or takes the controlled substance or kills themselves.

To avoid liability, social media companies would have 60 days to correct a design, algorithm or feature that they discovered through a quarterly audit could pose a certain level of risk.


Assemblymember Buffy Wicks (D-Oakland) said the onus should be on the companies, not parents, to make tech platforms safer, and regulation could force them to take responsibility. Wicks, a mother of two young children, said she doesn’t want her kids on social media before they’re adults, but she could still face pressure to allow them on the platforms.

“My hope is that when my child is banging down my door every day to get on Snapchat, TikTok or whatever the latest social media craze is, that they’re going to be safer places for kids,” she said.

Wicks introduced Assembly Bill 1394, which would require social media companies to give California users a way to report child sexual abuse material they’re depicted in. The platform would then be required to permanently block the material from being viewed. The bill also would bar a social media company from “knowingly facilitating, aiding, or abetting commercial sexual exploitation.” A court would be required to award damages between $1 million and $4 million for each act of exploitation that the social media platform “facilitated, aided, or abetted.” The bill overwhelmingly cleared the Assembly and is now in the Senate.


From January to March, Facebook removed 8.9 million pieces of content that violated its rules against child sexual exploitation, most of which was taken down before people reported it, according to the company. The social network says users who come across photos or videos of children being physically abused or sexually exploited should contact law enforcement, report the content to Facebook, notify the National Center for Missing and Exploited Children and avoid sharing, downloading or commenting on the posts.

Because Wicks’ bill involves child sexual abuse materials, supporters say the content falls outside of the 1st Amendment’s free speech protections. But business and technology groups say the bill still comes with constitutional concerns because companies afraid of getting sued will end up removing more lawful content, including news coverage and historical items.

Federal law also shields platforms from lawsuits over certain posts created by users. Section 230 of the 1996 Communications Decency Act has some exemptions, however, and doesn’t apply to federal sex trafficking laws.


Even if new social media restrictions are signed into law this year, they may still face hurdles. Last year, NetChoice sued the state to prevent the California Age-Appropriate Design Code Act from taking effect in 2024, arguing that the law would pressure companies to “over-moderate content” and restrict resources that could help teens. Wicks and former Assemblymember Jordan Cunningham (R-Paso Robles) introduced the measure, which includes protections for users under 18 such as high privacy settings by default.

Efforts to protect children online also are being considered in other states and Congress. Utah Republican Gov. Spencer Cox signed a pair of bills this year that require kids to get consent from their parents before they sign up for platforms like TikTok and Instagram.

Chapman, the parent who lost his son to a fentanyl overdose, wants large social media companies to allow parents to monitor their child’s online activity via third-party safety apps like Bark. The ability to do so can vary depending on how much data access a platform gives a third-party app. Drug dealers and teens often use coded emojis to buy and sell drugs, making it tougher for parents to understand what’s happening online.

“My motivation is to save lives,” said Chapman. “I wake up every day since my son died and I’ve devoted myself to warning other parents about this.”
