
Culture

Landmark YouTube, Instagram addiction verdict puts advertisers on notice


A California jury has found Google and Meta liable for designing addictive platforms that harm children and teenagers — marking the first time a tech company has lost a case centered on product design, not content. This ruling could have sweeping implications far beyond the courtroom. More than 2,000 similar lawsuits are currently moving through U.S. courts, and lawmakers are already framing the decision as a turning point. Massachusetts Sen. Ed Markey said the decision was a sign that “Big Tech’s Big Tobacco moment has arrived.”

For advertisers, the question is no longer whether these platforms are controversial. The ruling sharpens concerns around brand safety and whether these environments pose measurable legal, reputational and ethical risk.

“This ruling shifts the legal context of ads in these platforms,” Dan Gee, chief strategy officer at Media Futures Market, wrote on LinkedIn. “If social media is just a roll-your-eyes acceptable little vice that’s one thing. If the platform has been legally found to be harmful, that’s another.”

From content to design

The California case focused on how the platforms were designed, not the content shown in the feeds — a crucial distinction. Putting a bullseye on the product rather than the content allowed attorneys to skirt Section 230 of the Communications Decency Act of 1996. A Stanford Law professor previously described those protections as making platforms “absolutely immune from lawsuits related to content authored by third parties.”

The lawsuit centered on a 20-year-old woman named Kaley, who began using YouTube at age 6 and Instagram at age 9. She testified that excessive use of those apps led to depression, anxiety and suicidal thoughts. A jury awarded her $6 million in damages.

Joseph VanZandt, one of the plaintiff’s attorneys, framed the verdict as industry-wide accountability.

“For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features,” VanZandt said in a statement. “Today’s verdict is a referendum, from a jury to an entire industry that accountability has arrived.”

Both companies signaled they will challenge the ruling. A spokesperson for Meta said the company would evaluate future legal options. Google spokesperson José Castañeda said in a statement that the case “misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”

A pattern of rulings

The California decision follows another major verdict earlier this week, in which a New Mexico jury ordered Meta to pay $375 million for failing to protect minors from predators and misleading users about platform safety.

Plaintiffs in multiple lawsuits submitted internal documents as evidence that Meta prioritized growth and ad revenue over user safety for years.

One Meta memo submitted in the California trial stated, “If we wanna win big with teens, we must bring them in as tweens.” Another found that 11-year-olds were four times as likely to keep returning to Instagram as to other apps.

Facebook co-founder Mark Zuckerberg testified at the trial, saying on the stand that he didn’t understand why someone would keep going back to something they’re addicted to.

“If people feel like they’re not having a good experience, why would they keep using the product?” Zuckerberg said.

Advertisers: Brand safety and the moral compass

It’s unclear how these court decisions will affect ad spend on Google and Meta. Both tech giants have weathered scandals for more than a decade — including the 2018 Cambridge Analytica scandal, which exposed the misuse of data from up to 87 million Facebook users — without lasting damage to their ad businesses.

That’s partly because ethics alone rarely move budgets, Shirley Marschall, a former agency executive, told The Current. “The problem with the moral compass is — it’s not enough on its own.”

Despite repeated scandals, advertiser behavior has followed a familiar pattern, Marschall has argued.

“These platforms can deliver incredibly effective reach and impact for advertisers, so they often merit inclusion on the plan,” Marschall said. “But they become a far more confronting choice when you consider their societal impact.”

That tension leaves advertisers with few levers beyond brand safety — and even that has limits. “[Brand safety] worked well with X but less so with Meta and YouTube,” she said.

Still, platforms like Meta and YouTube remain too effective to ignore. WARC projected that ad spend on social media platforms would rise 15% worldwide in 2025, with Meta estimated to rake in 60% of the $306 billion poured into social by advertisers. YouTube reported over $40 billion in ad revenue in 2025, a more than 10% increase from 2024.

If the ruling prompts more legal scrutiny of product design and potential harm, advertisers may need to rethink how they define performance and risk, Gee argued.

“Appearing within a system designed to maximize compulsion carries a signal about what you are willing to fund and be associated with,” he said.

“This is not about removing social from the plan. It is about updating the framework. Performance viewed purely through efficiency metrics looks incomplete if outcomes are partly driven by engineered compulsion.”