Things are set to change for social media giants Meta and Google after two separate juries found them liable for harm to children, verdicts expected to trigger thousands of similar cases and compensation claims.
In the United States this morning, a California jury found Meta and Google responsible for the depression and anxiety of a woman, now 20, who used social media compulsively as a child.
The case served as a test of claims about the addictive design of these platforms and the content they consequently serve up to kids. The jury found the companies negligent in causing harm to the plaintiff, and found they had failed to adequately warn users about the dangers their platforms posed to children.
Separately in New Mexico on Tuesday, a jury found Meta had failed to protect children from child predators across Facebook and Instagram, and ordered the company to pay $375 million in damages. Meta was found to have misled consumers about safety on its platforms, including by flouting New Mexico’s own state consumer protection laws.
These two verdicts mark the first time that juries have found tech companies liable for the dangers kids experience after using social media extensively.
While the compensation figure from the California case is small — US$3 million will be awarded to the woman, the majority (70 per cent) to be paid by Meta — the case is set to trigger thousands of similar actions and compensation claims.
Known only as Kaley, the plaintiff in the California case shared details during the trial of using YouTube from the age of 6 and Instagram from 9. She said she was quickly using the platforms “all day long”. Her lawyers argued that the platforms were intentionally designed to be addictive.
Meta and Google responded by delving into Kaley’s experiences growing up, including bringing up medical records to examine her family environment and a difficult home life, to try to claim their platforms can’t be blamed for complex mental health issues.
Lawyers for the plaintiff said the judgment is about much more than Kaley’s experience.
“This verdict is bigger than one case. For years, social media companies have profited from targeting children while concealing their addictive, dangerous design features. Today’s verdict is a referendum — from a jury, to an entire industry — that accountability has arrived,” the statement read.
The plaintiff’s lawyers have also compared the case to the reckoning that ultimately came for tobacco companies.
During the trial, internal Meta documents revealed what the company knew about the ages of its users and the intentions behind how its platforms are designed. One document read: “If we want to win big with teens, we must bring them in as teens, we must bring them in as tweens.” Another showed internal Meta research finding that 11-year-olds were four times as likely to return to Instagram as to competing apps.
These documents were used successfully to show the jury that design features such as infinite scroll, autoplay, and even notifications across platforms were designed to “hook” young users.
Lawyers for the plaintiff also claimed YouTube intentionally tried to lure and keep children on the platform because they could charge advertisers more for ads targeting children.
The case is just the start. The platforms may now face significant compensation claims, a process already compared to what tobacco companies went through over the harm they caused.
The case has also sparked debate about whether social media is “addictive”: researchers globally are discussing “social media addiction” and whether someone can be hooked in ways similar to nicotine and other substances. Instagram head Adam Mosseri testified in the California case that, “I think it’s important to differentiate between clinical addiction and problematic use.”
But the trial certainly highlighted how platforms can contribute to at least problematic use: feeds that typically never end, and content algorithmically served up because it is identified as relevant (and therefore constantly appealing) to a user’s experience. Recommended content and push notifications are also being examined as potential contributors.
The case also validates the experiences and concerns of many parents globally, who have taken on the burden of managing their kids’ screen use even as tech companies introduce features that make their platforms increasingly appealing and sticky. Numerous studies find that features such as push notifications, infinite scroll, and autoplay can override an individual’s self-control.
“Stop blaming the parents, it’s on you,” a parent who lost her teen to suicide said outside the California courtroom today. “We know this is the long game. We’re heading to DC with this verdict, and we’re demanding legislation and protections. We don’t want hearings or loopholes anymore. Enough is enough.
“Big tech is predator number one in this world now.”
Indeed, the verdicts validate what many adult users of social media feel: that it’s not merely a failure of willpower, nor only a matter of compelling content, but that platform design itself is keeping users hooked.