Facebook Founder Takes The Stand In Meta Suit

By Aaron Miller

Los Angeles — Mark Zuckerberg, the founder and CEO of Meta Platforms, appeared on the witness stand this week for what legal scholars are calling one of the most consequential civil trials in Silicon Valley history.

The lawsuit — filed in Los Angeles County Superior Court under the case number JCCP 5255 — was brought by a woman identified only by her initials, K.G.M., now 20, and her mother, Karen.

Zuckerberg denied that Meta targeted children on Facebook and Instagram, despite internal documents suggesting teens were a key priority and the testimony of the 20-year-old plaintiff, who claims she started using Instagram at age nine and became addicted.

The case has been compared to the Big Tobacco trials of the 1990s, with hundreds of families and schools suing Meta, Google, TikTok and Snapchat.

He told the jury that Meta does not allow children under 13 on Facebook or Instagram, and insisted the company had worked to remove underage accounts from its platforms.
However, Zuckerberg faced a series of uncomfortable confrontations as the claimant’s lawyer, Mark Lanier, presented internal company documents that appeared to tell a very different story.

One internal Instagram presentation shown to the court read “If we want to win big with teens, we must bring them in as tweens.” A 2017 executive email stated plainly that “Mark has decided the top priority for the company is teens.”

Other emails from 2014 and 2015 showed Zuckerberg himself setting targets to increase time spent on the app by double-digit percentages, directly contradicting his 2024 congressional testimony.

Zuckerberg pushed back against each piece of evidence, saying his earlier remarks to Congress were accurate and that Lanier was “mischaracterising” his words.

He argued that Meta previously used time-spent goals but said the company had since changed its approach.

Lanier dramatically unfurled a large banner of thousands of selfies K.G.M. had posted during her adolescence, asking Zuckerberg directly: “You expect a nine-year-old to read all of the fine print?”

The suit — one of the first of its kind to reach trial — alleges that Instagram and other social media platforms owned by Facebook’s parent company were designed to be addictive and harmful to young users, contributing to mental health struggles that have become epidemic among teens.

This is the first time Zuckerberg has personally testified in a civil lawsuit against the company he helped build. His presence on the stand has drawn global attention, not just for the legal ramifications but for what it symbolizes: a moment in which Big Tech’s dominant figures are being held to account in the same way corporate leaders of previous generations were — most notably, tobacco executives in childhood cancer and addiction cases.

The plaintiffs argue that social media platforms including Instagram and YouTube were intentionally engineered to hook young users, manipulate their attention, and cause lasting harm to their mental health. K.G.M. alleges that she began using YouTube as early as age 6 and Instagram by age 10, eventually developing a “dangerous dependency” that fuelled anxiety, depression, body dysmorphia and even self-harm.

Attorneys for the plaintiffs described Meta’s product design as akin to “a playground with no seat belts,” citing internal concerns about how features like infinite scroll, personalization algorithms, like buttons, push notifications and beauty filters were tailored to foster engagement — even among children.

Meta’s internal research, presented by plaintiffs’ counsel, reportedly showed millions of underage users on Instagram and suggested the company knew such usage could cause harm.

Zuckerberg’s testimony, delivered in a subdued but tense atmosphere, marked his first direct confrontation with a jury in connection with alleged harms caused by his company’s platforms. Dressed in a dark suit and grey tie, he answered questions from both sides as lawyers sought to paint starkly different portraits of Meta’s intentions and responsibilities.

Key Exchanges and Moments

Zuckerberg told the court that Meta did not intentionally engineer its algorithms to addict users and that the company’s goal was to create products people find useful, not to maximize time spent at all costs. He testified that his longtime belief has been to build “community and connection,” and that addiction was not an objective.

When questioned about age verification, Zuckerberg acknowledged challenges in preventing under-13 users from joining Instagram and said efforts to enforce age restrictions had been implemented, even as critics pointed out lapses in those safeguards.

In a dramatic twist, evidence showed that Zuckerberg once overruled concerns from internal wellbeing experts about beauty filters on Instagram — a feature plaintiffs say contributed to body image issues among young users. Zuckerberg defended the decision, framing it as supporting user expression.

In a moment that drew chuckles in the courtroom, Zuckerberg admitted he was “not very good” at making himself relatable — an acknowledgement underscoring the wide gulf between Silicon Valley leaders and everyday jurors hearing emotional testimony about teen harm.

The judge overseeing the case, Judge Carolyn B. Kuhl, even had to issue stern warnings about courtroom decorum, including a ban on AI-powered Meta smart glasses inside the courtroom, highlighting the extraordinary nature of the proceedings.

At its heart, the case targets product design choices and corporate negligence rather than individual acts of user behaviour. Plaintiffs allege that Meta and YouTube knowingly engineered features that enhance engagement at the cost of mental health, disproportionately affecting vulnerable youth. Evidence presented by plaintiffs’ lawyers includes internal presentations, communications and studies indicating awareness of these risks.

Meta, for its part, emphatically denies any deliberate intent to harm. Company representatives say that its platforms enhance connection, provide value to users of all ages, and that any mental health issues experienced by users result from a complex interplay of personal, familial, societal and technological factors, not features designed to exploit psychological vulnerabilities.

Meta has also stressed its investments in safety tools, parental controls, and policies aimed at protecting young users, arguing that lawsuits like this set a dangerous precedent of holding tech platforms liable for social outcomes that are influenced by many variables.

A Bellwether for Similar Lawsuits

Legal experts and public health advocates view this trial as a bellwether: a precedent-setting case that could influence the outcomes of thousands of similar lawsuits already filed nationwide against Meta, Alphabet (YouTube), Snap, TikTok and others. Some of these suits claim everything from privacy harms to addictive design and mental health impacts.

Judge Kuhl’s prior decision to compel testimony from Zuckerberg and other top executives earlier in the litigation demonstrates how determined courts have become to get to the heart of corporate accountability for social media’s societal effects.

The plaintiffs’ legal strategy here is novel. Rather than focusing on user-posted content — traditionally protected under the U.S. law provision known as Section 230 of the Communications Decency Act — they argue that the very design choices and addictive mechanisms used by platforms constitute negligence or intentional harm. This sidesteps some of the long-standing immunities that tech companies have enjoyed.

Public health law observers note parallels with early tobacco litigation, where internal industry documents demonstrating awareness of harm were key to establishing liability. If plaintiffs can show that Meta knowingly prioritized engagement over safety and that this foreseeably resulted in injury to minors, it could establish a new legal frontier in product liability for tech platforms.
