How Companies Can Make Social Media Less Addictive for Teens

Research points to how companies could make social media less addictive for teens

Image Credit: NPR News



By a Senior Financial Correspondent

A pair of landmark court verdicts against Meta and Google have thrust the technology industry into a new and perilous legal frontier, shifting the focus from the content on social media platforms to their fundamental design. The rulings, which found the companies liable for creating deliberately addictive products harmful to children, signal a mounting financial and regulatory risk that challenges the core engagement-based business models of the world's largest tech firms.

The legal blows landed in rapid succession. In California, a jury held Google and Meta responsible for the depression and anxiety of a user, concluding that platforms like Instagram and YouTube were engineered to be addictive. A day earlier, a New Mexico jury determined that Meta's platforms violate state laws and endanger children's mental health. While both companies have stated their disagreement and intent to appeal, the verdicts lend legal weight to a growing body of scientific research that identifies the specific mechanisms driving compulsive use in adolescents and offers a blueprint for how those mechanisms could be dismantled.

These developments suggest that the era of self-regulation may be drawing to a close, with courts and lawmakers increasingly prepared to mandate changes to the very architecture of social media. For investors and company leaders, the question is no longer just about content moderation, but about the potential for forced redesign of products that generate billions in revenue.

The Science Behind the Verdicts

For years, the debate around teen mental health and technology centered on screen time. However, recent scientific inquiry has evolved to focus on a more nuanced and alarming metric: compulsive use that mirrors the clinical symptoms of addiction. This research provides the evidentiary backbone for the legal arguments now succeeding in court.

Dr. Jason Nagata, a leading pediatrician and researcher at the University of California, San Francisco, has conducted studies that quantify these behaviors. His work reveals that even among the youngest users, patterns of engagement are cause for significant concern.

  • Symptoms of Compulsion: In a study of 11- and 12-year-olds, Dr. Nagata found that 16% reported trying and failing to reduce their social media use, a classic indicator of lost control. Nearly a quarter (23%) admitted to spending a significant amount of time just thinking about their social media apps.

  • Direct Mental Health Impact: The research established a direct correlation between these addictive usage patterns and negative mental health outcomes. After accounting for their initial mental state, children exhibiting compulsive use were found to be more likely to develop depression, attention problems, and other behavioral issues one year later.

  • Escalating Downstream Risks: The consequences extend beyond mood disorders. Dr. Nagata’s work also linked addictive social media use to a higher risk of sleep disturbances, suicidal behaviors, and even experimentation with tobacco, alcohol, and marijuana.

These findings reframe the issue from one of simple distraction to one of measurable, predictable harm, strengthening the position of plaintiffs and regulators who argue that the platforms are not merely passive conduits for content but active agents in shaping user behavior.

A Blueprint for Safer Design

While tech companies have introduced "friction" features—such as time-limit reminders and the option to disable notifications—researchers argue these are insufficient for minors, whose developing brains are uniquely susceptible to manipulative design. The adolescent brain, characterized by a "hypersensitive, social brain and a very weak prefrontal cortex," as described by Mitch Prinstein of the University of North Carolina at Chapel Hill, struggles to resist the constant digital rewards.

Scientists have pinpointed specific, actionable changes that could fundamentally reduce the addictive potential of these platforms for young users. These proposals form a potential roadmap for both voluntary corporate reform and future regulation.

  • Overhaul Core Engagement Mechanics: Researchers advocate for restricting or disabling features like infinite scroll and hyper-personalized algorithmic feeds for minor accounts. These tools are designed for maximum engagement and are seen as primary drivers of compulsive use.

  • De-weaponize Notifications: The constant stream of "likes" and other notifications should be severely limited or turned off by default for teens. This feature is seen as particularly harmful, as it directly taps into the adolescent need for social validation, creating a powerful and irresistible feedback loop.

  • Implement Time-Based Guardrails: A critical recommendation is to automatically disable notifications during school hours and, most importantly, at bedtime. Research overwhelmingly shows that platform usage interferes with sleep, which in turn exacerbates mental health symptoms.

  • Default to Maximum Privacy: For all users under 18, privacy settings should be set to the highest level by default. This would prevent their data from being used to build the personalized profiles that allow algorithms to serve them an endless stream of engaging, and potentially harmful, content.

The Regulatory and Business Horizon

The pressure to implement such changes is no longer purely academic. The Kids Online Safety Act (KOSA), a bipartisan bill that passed the U.S. Senate in 2024, incorporates many of these design-centric proposals. Though its progress has stalled in the House, the recent court verdicts could provide new momentum for federal legislative action.

If enacted, KOSA or similar legislation would represent a seismic shift for the industry. It would force companies to create fundamentally different versions of their products for minors, potentially impacting key performance indicators like daily active users and time spent on the platform—metrics closely watched by Wall Street.

The industry's long-standing reliance on parental controls as the primary safety mechanism is also being challenged. As Prinstein notes, "Most adults would be shocked if they looked through a children's feed," which is often populated with sexualized content or material promoting self-harm. This underscores the argument that the burden of safety cannot rest on parents alone and that platforms have a duty of care to build a safer environment by design.

For Google, Meta, and their peers, the path forward is narrowing. The recent legal defeats, combined with a robust body of scientific evidence and looming regulatory threats, have created a powerful convergence of forces. The industry now faces a stark choice: proactively re-engineer its products to mitigate harm to its youngest users, or face a future of escalating legal battles and government-mandated redesigns that could prove far more costly to its business and reputation.

Source: NPR News