Social Media’s Dark Reign Over CA Kids’ Mental Health May Be Ending

Many parents of California pre-teens can likely recall the immediate and thoroughly modern change that overtakes their child the moment they are handed a smartphone of their very own. A once vocal, perhaps even polite, child soon devolves into a TikTok addict or begins trying (again and again) for the perfect selfie for the ‘gram—or maybe an inner Twitter troll emerges.

On Monday, a bill looking to address this all-too-common and deadly serious phenomenon found support as it passed a key hurdle in the California state legislature. Assembly Bill 35, authored by San Luis Obispo Republican Assemblyman Jordan Cunningham, seeks to hold social media companies like Facebook and Instagram parent company Meta Platforms responsible for what his bill says is harming children.

According to Cunningham’s proposal, kids have unknowingly become mentally dependent on the products these companies have built into their platforms, and the bill asserts that the leaders of these companies knew full well how highly addictive those products are.

If signed into law, Assembly Bill 35 would cost the social media giants—which, of course, are mostly based in California—up to $25,000 per violation. The parameters here include any physical, mental, emotional, developmental, or material harm a California minor experiences as a result of using a social platform. To bring a case, the minor must also want to stop or reduce their time spent on the platforms but be unable to because of the platform’s addictive nature.

“The era of unfettered social experimentation on children is over and we will protect kids,” Cunningham said of his bill as it passed in the Assembly this week.

While the bill aims to encourage social platforms to curb the growing issue of social media addiction among California’s kids and teens, the opposition to its hefty fines is strong within the business and tech community. 

The outcry over the idea of the bill passing the state Senate, its next hurdle, and becoming California law has centered on the fact that it would effectively force the platforms to prohibit minors across the Golden State from joining. Members of TechNet, a national network of tech world CEOs and their underlings, responded fiercely to the news of the bill clearing its first obstacle in Sacramento.

“Social media companies and online web services would have no choice but to cease operations for kids under 18 and would implement stringent age-verification in order to ensure that adolescents did not use their sites,” the members explained in a letter to the state Assembly. “There is no social media company let alone any business that could tolerate that legal risk.”

This bill may seem wildly specific and targeted, as if it goes to lengths to corner social platforms in a precarious position. But the fact is, the addictive qualities of these platforms and products are baked in; addiction is a feature, not a bug, and the companies and teams behind these products actively work to create it. The goal, of course, is to hook kids while they’re young into hours and days of scrolling through videos and ad-heavy feeds.

Despite the harsh parameters introduced by Assembly Bill 35, no one here needs to weep for Mark Zuckerberg or Evan Spiegel or, um, Elon Musk, maybe—because the bill points to a clear path of compliance that allows these companies to stay alive and, of course, maintain profit streams off California’s youth.

If Assembly Bill 35 survives the coming weeks of state Senate debate, it will likely become the law of the land on January 1. Social platforms will then have until April 1 to clean up their acts and remove the addictive features if they want to avoid damages. Or, if their leadership has the platforms conduct regular audits of these addictive features, they’ll be safe from the legal hellscape currently flashing before their eyes. Presumably, this task can’t be so difficult, as these massive firms have armies of engineers and product leaders to ensure compliance. All will be well until the earnings report arrives.

This potential legislative and legal blow to Big Social comes just as TikTok, Snap and Meta are preparing to head to court over some very dark lawsuits on the matter in question. In January, the mother of Selena Rodriguez, an 11-year-old from Enfield, Conn., filed suit against Snap, Inc., and Meta after the child’s death by suicide in 2021; the tragedy followed a long struggle with an apparent social media addiction. Her mother, Tammy Rodriguez, says that features on Snapchat and Instagram led to an addiction so severe that Selena needed mental health treatment, according to a BBC News report.

Selena was also coerced on a social platform into sharing explicit images of herself, Tammy Rodriguez said, an event that occurred while the girl was sleep-deprived and in a deep depression. The images then leaked to her classmates. Soon came Selena’s downward spiral.

The lawsuit, filed in California, alleges that Snap and Meta “knowingly and purposefully” created harmful products geared toward a significant number of underage users, stating that the “defendants intentionally created an attractive nuisance to young children but failed to provide adequate safeguards from the harmful effects they knew were occurring on their wholly-owned and controlled digital premises.”

That sentiment echoes what Meta whistleblower Frances Haugen told Congress in October. Haugen, a former product manager, went public to reveal that the social giant knew its products are harmful—citing instances of kids who use Instagram developing negative body images—but she told lawmakers it was clear that Meta leadership just likes money too much to do anything about it. Quotes including, “We make body image issues worse for 1 in 3 teen girls,” and, “Teens blame Instagram for increases in the rate of anxiety and depression,” were found in Meta’s internal research.

Zuckerberg, Meta’s founder and CEO, was quick to push back on this notion, saying that Haugen’s version is a mischaracterization and that his company definitely cares “deeply about issues like safety, well-being and mental health.” He then mentioned that Meta is working on this “kind of age-appropriate experience with parental controls for Instagram.” As of now, “Instagram Kids” has yet to surface.

But this dire problem is not just Meta’s to tackle. The parent company of its rising rival, TikTok, is now facing a lawsuit in Pennsylvania after a young girl’s untimely and tragic death. Tawainna Anderson said her daughter, Nylah, died in December after taking part in the “Blackout Challenge.” The viral TikTok challenge encouraged users to choke themselves with household objects until they passed out, for an asphyxia rush. Anderson said she found her daughter hanging in her bedroom closet; the girl died in a hospital five days later.

Like Anderson, social media users with common sense know that these viral dances and often dangerous challenges are core to the TikTok experience. And it’s clear after all these years that the top-secret and ever-shifting algorithms and product experiences are what keep users coming back to Facebook, Instagram, Snap, Twitter, etc. Whatever comes next will iterate on today’s problematic features—for better or worse.

Whether tech companies are legally responsible for what happens on their platforms remains an open question. Soon, lawmakers, judges and juries will have to decide if announcing one’s deep concern about the public’s safety and mental health is as far as any company really has to go.

