Meta’s legal troubles over youth safety have been simmering for years, but the latest development finally brought the company face-to-face with a jury. After a lengthy investigation and a trial that pulled back the curtain on Meta’s internal decision-making, jurors found that the company deceived teens and their families about the safety of its platforms. The verdict carries a $373 million penalty, a figure that reflects not just the harm alleged but the jury’s belief that Meta misled the public about what it knew and when it knew it. Reports from outlets covering the case note that internal documents and testimony from former employees played a significant role in shaping the jury’s view of Meta’s conduct.
The case itself grew out of a broader wave of scrutiny over how Meta handles youth safety. New Mexico’s attorney general spearheaded one of the earliest and most aggressive investigations, arguing that Meta’s platforms enabled harmful interactions and that the company downplayed the risks to young users. According to reporting from The Maravi Post, jurors found that Meta engaged in what they called unconscionable trade practices, taking advantage of teens’ vulnerabilities and failing to act on known dangers. The state’s legal team leaned heavily on internal records and testimony from former employees who said Meta had long been aware of the harms but resisted implementing safety fixes that might slow growth.
YouTube was also named in the lawsuit, and the jury reached a separate conclusion about its conduct. While the claims against YouTube focused more on the platform’s recommendation systems and the exposure of minors to harmful or addictive content, jurors ultimately found that YouTube did not engage in the deceptive trade practices they attributed to Meta. The jury imposed no financial penalties on YouTube, but the findings still placed the company under the same spotlight of public scrutiny. The verdict signals that while YouTube avoided the most severe outcome in this particular case, the platform’s design choices and youth‑safety practices remain very much in question.
The trial also intersected with a growing body of evidence from other lawsuits and unsealed filings. TIME reported that internal safety leaders at Meta had raised alarms about everything from adult strangers contacting minors to the company’s unusually high tolerance for accounts involved in sex trafficking. One former head of safety testified that Meta allowed up to sixteen violations for prostitution and sexual solicitation before suspending an account, a threshold far higher than industry norms. Plaintiffs in the multidistrict litigation argue that Meta not only knew about these risks but actively downplayed them to the public and to Congress. These allegations, while not part of the New Mexico case, helped shape the broader narrative that Meta has long prioritized growth over youth safety.
Meta, for its part, has pushed back hard. The company maintains that it invests heavily in safety and that it is transparent about the challenges of moderating billions of pieces of content. After the verdict, Meta announced plans to appeal, arguing that the case mischaracterized its efforts and ignored the complexity of keeping young users safe online. The company has repeatedly said that it works to remove harmful content and that it is committed to improving its systems. Still, the verdict adds weight to the argument that Meta’s internal decisions did not always align with its public messaging.
What happens next is likely to ripple far beyond this single case. The Social Media Victims Law Center notes that more than 1,700 cases are already consolidated in a massive multidistrict litigation targeting Meta, Snap, TikTok, and YouTube for allegedly designing platforms that encourage addictive behavior among teens. Attorneys general in Massachusetts and several other states have active suits accusing Meta of intentionally designing features that harm young users. The New Mexico verdict may embolden plaintiffs in those cases, especially given the jury’s willingness to accept arguments that Meta misled the public about safety risks.
The ruling also raises questions about how Meta will navigate the next phase of litigation. An appeal is almost certain, but appeals do not erase the momentum that a jury verdict creates. Other states may now feel more confident pursuing their own claims, and plaintiffs’ attorneys will likely use the New Mexico case as a blueprint. The multidistrict litigation, already sprawling, could become even more consequential if courts begin treating Meta’s internal documents as evidence of a pattern of behavior rather than isolated lapses.
For Meta, the stakes extend beyond financial penalties. The company is already under pressure from lawmakers, regulators, and parents who want stricter guardrails for teens online. A growing number of states are considering legislation that would require platforms to verify ages, limit algorithmic targeting, or default teen accounts to more restrictive settings. If Meta loses additional cases or faces more damaging disclosures, it may be forced to adopt safety measures it has long resisted.
The New Mexico verdict is not the end of Meta’s legal saga, but it is a turning point. It signals that juries are willing to hold tech companies accountable for youth safety failures and that internal debates about growth versus protection may now carry legal consequences. Whether Meta can successfully appeal or shift the narrative remains to be seen, but one thing is clear: this verdict has opened the door for a wave of legal and regulatory challenges that could reshape how social platforms operate for young users.
