Recent court documents from a major California lawsuit provide documented evidence of Instagram’s internal strategies for young users. Lawyers for the plaintiffs argue that Meta made tracking and boosting teen usage time a top priority, even in the face of clear warnings about mental health risks. These details come from emails, memos, and data produced in the Social Media Youth Harm Litigation, which includes cases tied to JCCP 5255.
The bellwether trial centers on a plaintiff, now 20 years old and identified as Kaley, whose symptoms began while she was a minor. She claims Instagram fueled her severe depression, eating disorders, and suicidal thoughts during her teen years. Kaley faced constant bullying and financial sextortion from malicious actors who obtained private photos and then demanded money or more images.
Her family urgently petitioned Meta to intervene, but the company took two weeks to act, and her psychological distress deepened each day. She testified that the app took over her life, consuming her attention with endless scrolling that felt impossible to stop. This bellwether trial tests key evidence for hundreds of similar cases in Los Angeles County Superior Court.
Company records show daily use grew from around 40 minutes per user in 2023 toward a projected 46-minute milestone by 2026, with teen figures flagged internally as major wins. A 2017 email from a top product manager read, “Our overall company goal is total teen time spent, and Mark has decided that the top priority for the company in the first half of 2017 is teens.” Notes from 2018 pointed out that tweens, children aged 10 to 12, showed the highest retention rates.
Nick Clegg, Meta’s President of Global Affairs, wrote that age checks were “basically unenforceable” since kids often lied about their ages. Together, these records form a detailed chronology of a company chasing youth growth while its own studies raised red flags about harm.
Meta’s 2015 figures put the number of under-13 U.S. children on Instagram at about 4 million, or 30% of all 10-to-12-year-olds. That is inconsistent with Mark Zuckerberg’s 2024 congressional testimony about strict bans on users under 13, in which he said the company enforced its rules whenever it spotted violators. Age verification prompts arrived slowly: for new users in 2019, and for all users only by August 2021, driven by regulations such as the UK’s Age Appropriate Design Code. Even so, 2026 planning documents still target leading teen market share in the U.S. and worldwide.
In cross-examination, Zuckerberg addressed kids faking their ages, saying, “I always wish we could have gotten there sooner” about safety fixes. He faced questions about the gap between public promises and private goals. Meta’s Stephanie Otway, in defense filings, argued that Kaley’s problems stemmed from family issues, not the app. Snap and TikTok settled their respective litigations, but Meta and YouTube are fighting on, pointing to outside causes for teen mental health problems. Design features such as endless feeds and push alerts function like digital casinos, using random rewards to keep users hooked, according to plaintiffs’ expert witnesses.
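The “digital casino” claim refers to what behavioral psychologists call a variable-ratio reinforcement schedule: rewards arrive at unpredictable intervals, the pattern associated with the most persistent habit formation. A minimal illustrative sketch of that dynamic follows; the function names and parameters are hypothetical and do not represent Meta’s actual systems:

```python
import random

def variable_ratio_feed(num_scrolls: int, reward_prob: float, seed: int = 0) -> list[bool]:
    """Simulate a variable-ratio schedule: each scroll independently has a
    fixed probability of surfacing a 'rewarding' post, so the number of
    scrolls between rewards is unpredictable."""
    rng = random.Random(seed)
    return [rng.random() < reward_prob for _ in range(num_scrolls)]

def gaps_between_rewards(outcomes: list[bool]) -> list[int]:
    """Lengths of the reward-free runs between rewarding posts."""
    gaps, current = [], 0
    for hit in outcomes:
        if hit:
            gaps.append(current)
            current = 0
        else:
            current += 1
    return gaps

outcomes = variable_ratio_feed(num_scrolls=1000, reward_prob=0.1)
gaps = gaps_between_rewards(outcomes)
# A fixed-ratio schedule would make every gap identical; here the gaps
# vary widely, which is the unpredictability the expert testimony describes.
```

The point of the sketch is only that unpredictability is a design property, not an accident: the same feed with a constant gap between rewarding posts would, on the plaintiffs’ theory, be far less compulsive.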
Duty of Care Argument
Plaintiffs say Meta breached its duty of care by building tools it knew could hurt kids, much like a manufacturer selling a defective product. They blame algorithms that push harmful content to boost time spent rather than safety. On this view, Instagram is a risky product requiring built-in safeguards, such as automatic time limits for youth accounts, that could have reduced harm from the start.
Algorithmic Design vs. Section 230
Meta leans on Section 230, which shields platforms from suits over user posts. Plaintiffs instead target “product design” choices, such as engagement hooks in feeds and alerts, as fair game. They argue these are not mere tweaks but deliberate traps that fall outside the statute’s old shields. Judges across the country are now probing this line in child-harm suits.
Insights from 2025 Reports
Internal 2025 reviews, including Meta’s “Safety vs. Profit” assessments, showed that growth targets usually won out over harm mitigation. One key report concluded that deliberate choices made Instagram unsafe for teens, contradicting the company’s safety vows. Whistleblowers have used these documents to support claims now before the court. The files underscore how, year after year, profit pulled ahead of young users’ well-being.
Technological Safeguards: Too Little, Too Late?
Meta’s Family Center tools let parents monitor teen app time and content, but court audits from 2025 found few teens turned them on. Internal notes said under 5% of young users had supervision active, and minors found simple workarounds. Rollouts lagged harms by years, with full checks arriving only after heavy pushback. Critics say these fixes treat symptoms rather than the app’s core pull on young minds; real change, they argue, requires safety settings that default to on from day one.
Broader Industry Impact
This trial shapes debate over the U.S. Kids Online Safety Act (KOSA), which proposes strict age checks and harm mitigation for all apps, and lawmakers are citing it to speed KOSA votes in Congress. YouTube is watching the proceedings closely; an adverse jury verdict against Zuckerberg could push it toward a settlement of its own. Other platforms, such as TikTok, have already settled, hinting at an industry shift. A plaintiffs’ win here could force all tech firms to put kids first in their designs.
The verdict, expected in late March 2026, will serve as a legal ‘north star’ for the remaining 1,500 cases in the JCCP 5255 consolidation. It marks the first time a jury will decide if Section 230 immunity extends to the underlying architecture of engagement-based algorithms.
This sprawling JCCP 5255 fight, involving hundreds of families, will likely define the legal landscape for tech accountability for years to come. Parents want better controls as teen screen time climbs, and early 2026 rulings may bring new federal rules or major changes in app design. The coming weeks will show whether courts put youth safety over endless growth.