Silicon Valley is finally being forced to answer for what it built
Dr Stephen Whitehead
- Published
- Opinion & Analysis

A landmark verdict against Meta and YouTube proves that social media was built without a conscience — and society has paid the price, writes Dr Stephen Whitehead
On 25 March 2026, a Los Angeles jury delivered what many are already calling Silicon Valley’s ‘Big Tobacco’ moment. Meta and YouTube were found liable in a first-of-its-kind lawsuit, with the jury concluding that the companies designed their platforms to hook young users without concern for their wellbeing.
Meta was held 70 per cent responsible for the harm caused and YouTube the remaining 30 per cent, with total damages — compensatory and punitive — amounting to US$6 million.
The verdict came hard on the heels of a separate jury in New Mexico, which found Meta liable for failing to protect children from online predators and sexual exploitation, ordering the company to pay $375 million in civil penalties.
The defendants, predictably, disagree. Meta said it will appeal, insisting that “teen mental health is profoundly complex and cannot be linked to a single app.” Google, for its part, argued that the case “misunderstands YouTube, which is a responsibly built streaming platform.”
In my view, these are the words of corporations whose lawyers have coached them well. They are not the words of organisations that ever genuinely asked what their products were doing to the young minds consuming them.
That question — what are we doing to human beings? — was never, it would seem, seriously posed when these platforms were being built. That is not a legal argument. It is a sociological one. And it is the more damning of the two.
Mark Zuckerberg launched Facebook in 2004 as a Harvard dorm-room project. He was, by any measure, a technical prodigy. What he was not — and this matters enormously — was a sociologist, a psychologist, or an ethicist. That is not a criticism of Zuckerberg personally. He built something extraordinary. The issue lay not in his intentions but in the absence of any framework, any professional oversight, any systematic understanding of how his creation would interact with human psychology at scale. When Facebook grew from a campus curiosity into a global nervous system used by billions, no one with expertise in the human condition was in the room asking the hard questions.
The verdict validated the plaintiffs’ lawyers’ strategy of shifting the legal target: instead of focusing on the content people see on social media, the case put the spotlight on how social media services were designed. This is precisely the right frame. The harm was not accidental. Lawyers pointed to specific design features — infinite scroll, autoplay, and algorithmic notifications — engineered, reportedly, to “hook” young users whose developing brains were entirely unequipped to resist them. The platforms were built as behavioural traps, and the people who built them apparently knew it.
The litigation has drawn comparisons to the legal crusade against Big Tobacco in the 1990s, which forced that industry to stop targeting minors with advertising. The analogy holds, but only up to a point. Tobacco companies at least understood their product’s chemistry. Social media companies understood the psychological architecture of addiction — and deployed it anyway, at industrial scale, on children. The difference is one of deliberate sophistication that Big Tobacco, for all its cynicism, never quite matched.
There is a broader lesson here, and it extends well beyond the courtroom. We have developed a habit, in the digital age, of building enormously powerful tools and releasing them into society before we have any idea what they will do to it. Social media is merely the most visible example. The pattern is consistent: a technology emerges, capital pours in, scale is achieved with breathtaking speed, and only years later — when the damage is measurable, when the children are in therapy, when the juries are deliberating — does society attempt to impose any kind of order.
What we need, and what we conspicuously lack, is not more legislation drafted by politicians who struggle to understand the technology they are trying to regulate. We need the systematic involvement of experts in the human condition — psychologists, sociologists, developmental scientists — embedded in these organisations from inception, holding genuine authority rather than decorative titles. The same logic that compels us to involve ethicists in the development of nuclear technology, or biologists in the assessment of pharmaceutical risk, should compel us to involve human scientists in the development of any platform designed to colonise human attention.
Because that, ultimately, is what social media does. It colonises attention. It monetises it. As one attorney put it: “For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features.” This is not the language of unintended consequences. It is the language of calculated extraction — of human vulnerability treated as a resource to be mined.
In a capitalist system, the drive to build profitable products is not only understandable but structurally necessary. That is not the argument. The argument is that profit and ethics are not, despite the technology industry’s long-standing behaviour to the contrary, mutually exclusive. A platform can be commercially successful and psychologically responsible. But only if the people building it ever stopped to ask whether it should be — and only if there were qualified voices in the room whose job it was to insist on an answer.
This verdict may influence the outcome of some 2,000 other pending lawsuits. There will be appeals, delays, and further rounds of corporate denial. The dam, however, has broken. And when the full legal reckoning eventually arrives, the question that will define the next generation of digital platforms will not be how do we maximise engagement? but what are we doing to the people we serve?
It is, frankly, a question that should have been asked 20 years ago. The cost of not asking it has been measured, now, in courtroom damages. The deeper cost — in young minds, fractured development, and a generation of children who grew up inside machines designed to addict them — will take far longer to calculate.

Dr Stephen Whitehead is a gender sociologist and author recognised for his work on gender, leadership and organisational culture. Formerly at Keele University, he has lived in Asia since 2009 and has written 20 books translated into 17 languages. He is based in Thailand and is co-founder of Cerafyna Technologies.
READ MORE: ‘Social media giants hit with $6m verdict in landmark youth harm case’. A U.S. jury has found Meta and YouTube liable for harm linked to their platforms in a first-of-its-kind trial over the impact of social media on children, as the UK moves to test social media bans and curfews for teenagers.
Main image: www.kaboompics.com/Pexels