Instagram CEO Adam Mosseri took the witness stand in a Los Angeles court to defend his platform against claims that it deliberately fosters addiction-like behaviors in children and young adults, arguing that clinical addiction to social media is fundamentally impossible.
The testimony comes as part of a high-profile trial addressing what plaintiffs characterize as "social media addiction" among minors, placing Mosseri at the center of an increasingly intense global debate over Big Tech's responsibility for youth mental health outcomes. The case represents one of the most significant legal challenges to date against social media platforms' design practices and their impact on developing minds.
The Case Against Social Media Platforms
The Los Angeles proceedings form part of a broader legal movement targeting social media companies' business models and design features. Plaintiffs argue that Instagram and other platforms deliberately incorporate "addictive design" elements specifically intended to maximize user engagement, particularly among vulnerable young users.
This legal challenge occurs amid mounting scientific evidence about social media's impact on youth mental health. Dr. Ran Barzilay's research at the University of Pennsylvania has demonstrated clear links between early smartphone exposure and sleep disorders, weight problems, and diminished cognitive abilities. Children exposed to devices before age 5 show significantly higher rates of sleep disruption and decreased physical activity levels.
Recent comprehensive studies report that 96% of children aged 10-15 use social media platforms, with 70% exposed to harmful content and more than half experiencing cyberbullying. These statistics have provided compelling evidence for policymakers worldwide considering age-based restrictions on platform access.
Mosseri's Defense Strategy
During his testimony, Mosseri distinguished between clinical addiction—a medically recognized condition with specific diagnostic criteria—and what he characterized as "problematic use" of social media platforms. This semantic distinction forms the cornerstone of Instagram's legal defense strategy.
"There is a fundamental difference between clinical addiction and problematic use patterns. Our platform does not meet the medical criteria for addiction, which requires specific physiological and psychological dependencies."
— Adam Mosseri, Instagram CEO
The Instagram chief's testimony comes at a critical moment for the tech industry, as regulatory pressure intensifies globally. The European Commission recently found TikTok in violation of the Digital Services Act through "addictive design" features including unlimited scrolling, automatic video playback, and personalized recommendations designed to maximize user dependency over wellbeing.
Global Regulatory Momentum
Mosseri's court appearance coincides with unprecedented international coordination targeting social media platforms' youth impact. Spain has implemented the world's most aggressive regulatory framework, including complete under-16 social media prohibitions and criminal executive liability for platform leadership—a model now spreading across Europe.
Australia's implementation of under-16 social media restrictions has removed 4.7 million teen accounts since December 2025, demonstrating that age verification is technically achievable when governments commit to enforcement. The model is being closely studied by European nations including Greece, Slovenia, Germany, France, Denmark, and Austria, all of which are considering similar measures.
The coordinated European response represents a fundamental shift from industry self-regulation to government enforcement with meaningful legal consequences. Criminal executive liability frameworks create personal imprisonment risks for tech leadership, representing the most significant challenge to platform impunity in internet history.
Industry Resistance and Counter-Arguments
The tech industry has responded with escalating resistance to regulatory pressure. Elon Musk has characterized Spanish regulations as "fascist totalitarian" measures, while Telegram's Pavel Durov has sent mass alerts warning users about "surveillance state" applications. Governments, in turn, have cited this coordinated opposition as evidence that regulation is necessary.
Platform companies argue that their features represent standard industry practices designed to enhance user experience rather than create harmful dependencies. They contend that age verification requirements raise significant privacy concerns, particularly when biometric authentication becomes necessary for effective enforcement.
Scientific Evidence and Youth Mental Health
The legal proceedings occur against a backdrop of mounting scientific evidence linking excessive screen time to mental health deterioration among young people. Large-scale U.S. research has found that children spending four or more hours daily on screens face a 61% increased risk of depression, a strong association between heavy digital device use and psychological disorders, though observational studies of this kind cannot by themselves establish causation.
The research identifies two primary mechanisms: disruption of sleep patterns through blue-light suppression of melatonin, and displacement of physical activity. These findings provide a scientific foundation for policy interventions targeting platform design and access restrictions.
Mental health professionals worldwide report significant increases in anxiety, depression, and attention difficulties among children and teenagers. The "therapeutic revolution of 2026" has emphasized prevention-first approaches addressing social and cultural factors alongside individual symptoms, recognizing that platform design choices significantly influence psychological development.
Implementation Challenges and Privacy Concerns
The testimony highlights complex implementation challenges facing both platforms and regulators. Effective age verification requires sophisticated technical solutions, potentially including biometric authentication systems that raise surveillance concerns among privacy advocates.
Cross-border enforcement presents additional complications, requiring unprecedented international cooperation among national authorities. The ongoing global memory-chip crisis, with sixfold semiconductor price increases affecting the hardware needed for verification infrastructure, creates technical bottlenecks that may persist until 2027.
Economic and Social Implications
The trial's outcome could establish precedents affecting platform design practices globally, potentially requiring fundamental changes to infinite scroll, autoplay, and algorithmic curation across all major social media companies. Compliance costs may advantage large platforms over smaller competitors, raising competition concerns.
Success of regulatory initiatives could trigger worldwide adoption of criminal liability frameworks and age restrictions, while failure might strengthen industry anti-regulation arguments. The stakes include fundamental questions about democratic governance versus multinational platform power, childhood development in the digital age, and corporate accountability for societal outcomes.
Looking Forward
The Los Angeles trial represents a critical test case for 21st-century technology governance, with implications extending far beyond Instagram's specific practices. The proceedings will help determine whether democratic societies can effectively regulate multinational technology platforms while balancing child protection with digital rights and innovation.
As Mosseri's testimony continues, the global community watches closely for precedents that may reshape the relationship between technology companies and society. The outcome could influence regulatory approaches affecting millions of users worldwide and establish new standards for corporate responsibility in the digital age.
The case underscores the urgent need for evidence-based approaches to platform regulation that acknowledge both the benefits of digital connectivity and the documented risks to developing minds. Whatever the trial's outcome, it marks a pivotal moment in the ongoing debate over technology's role in shaping human behavior and social development.