FILE PHOTO: People walk behind a logo of Meta Platforms company, during a conference in Mumbai, India, September 20, 2023. REUTERS/Francis Mascarenhas/File Photo
NOTE: Mark Zuckerberg founded Facebook in 2004, and the company rebranded to Meta in 2021. Meta is the parent company that now owns Facebook, Instagram, WhatsApp, and other subsidiaries. Zuckerberg, the founder, remains chairman and CEO of Meta, which sets the company's overall direction and product strategy.
(Reuters) – Meta shut down internal research into the mental health effects of Facebook and Instagram after finding causal evidence that its products harmed users’ mental health, according to unredacted filings in a class action by US school districts against Meta and other social media platforms.
In a 2020 research project code-named “Project Mercury,” Meta scientists worked with survey firm Nielsen to gauge the effect of “deactivating” Facebook and Instagram, according to Meta documents obtained via discovery. To the company’s disappointment, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness, and social comparison,” internal documents said.
Rather than publishing those findings or pursuing additional research, the filing states, Meta called off further work and internally declared that the negative study findings were tainted by the “existing media narrative” around the company.
Privately, however, staff assured Nick Clegg, Meta’s then-head of global public policy, that the conclusions of the research were valid.
“The Nielsen study does show causal impact on social comparison,” an unnamed staff researcher allegedly wrote, appending an unhappy-face emoji. Another staffer worried that keeping quiet about negative findings would be akin to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite Meta’s own work documenting a causal link between its products and negative mental health effects, the filing alleges, Meta told Congress that it had no ability to quantify whether its products were harmful to teenage girls.
In a statement Saturday, Meta spokesman Andy Stone said the study was stopped because its methodology was flawed and that the company has worked diligently to improve the safety of its products.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
Plaintiffs allege product risks were hidden
The allegation of Meta burying evidence of social media harms is just one of many in a late Friday [Nov. 21, 2025] filing by Motley Rice, a law firm suing Meta, Google, TikTok, and Snapchat on behalf of school districts around the country. Broadly, the plaintiffs argue that the companies have intentionally hidden the internally recognized risks of their products from users, parents, and teachers.
TikTok, Google, and Snapchat did not immediately respond to a request for comment.
Allegations against Meta and its rivals include tacitly encouraging children below the age of 13 to use their platforms, failing to address child sexual abuse content, and seeking to expand the use of social media products by teenagers while they are at school. The plaintiffs also allege that the platforms attempted to pay child-focused organizations to defend the safety of their products in public.
In one instance, TikTok sponsored the National PTA and then internally boasted about its ability to influence the child-focused organization. Per the filing, TikTok officials said the PTA would “do whatever we want going forward in the fall… (t)hey’ll announce things publicly, their CEO will do press statements for us.”
By and large, however, the allegations against the other social media platforms are less detailed than those against Meta. The internal documents cited by the plaintiffs allege:
- Meta intentionally designed its youth safety features to be ineffective and rarely used, and blocked testing of safety features that it feared might be harmful to growth.
- Meta required users to be caught 17 times attempting to traffic people for sex before it would remove them from its platform, which a document described as “a very, very, very high strike threshold.”
- Meta recognized that optimizing its products to increase teen engagement resulted in serving them more harmful content, but did so anyway.
- Meta stalled internal efforts to prevent child predators from contacting minors for years due to growth concerns and pressured safety staff to circulate arguments justifying its decision not to act.
In a text message in 2021, Mark Zuckerberg said that he wouldn’t say that child safety was his top concern “when I have a number of other areas I’m more focused on like building the metaverse.” Zuckerberg also shot down or ignored requests by Nick Clegg to better fund child safety work.
Meta’s spokesman Andy Stone disputed these allegations, saying the company’s teen safety measures are effective and that the company’s current policy is to remove accounts as soon as they are flagged for sex trafficking.
He said the suit misrepresents Meta’s efforts to build safety features for teens and parents, and called the company’s safety work “broadly effective.”
“We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions,” Stone said.
The underlying Meta documents cited in the filing are not public, and Meta has filed a motion to strike them. Stone said the objection was to the overbroad nature of what the plaintiffs are seeking to unseal, not to unsealing in its entirety.
A hearing on the filing is set for January 26, 2026, in the US District Court for the Northern District of California.
Published at NY Post on Nov. 23, 2025. Reprinted here for educational purposes only. May not be reproduced on other websites without permission.
Questions
1. Define the following as used in the article:
- causal (NOT casual)
- causal evidence
- unredacted
- class action
- allegation
- plaintiff
2. a) Who is the CEO of Meta?
b) What platforms does Meta own?
3. What was Meta’s Project Mercury? – What did the study conclude?
4. What actions did Meta take in response to Project Mercury’s findings?
5. a) Name the other social media companies involved in the lawsuit.
b) In general, what does the class action lawsuit brought by US school districts against Meta and other social media platforms allege? What do the plaintiffs say the companies do and fail to do?
6. What specific allegations are made in the lawsuit against Meta?
7. What evidence shows Mark Zuckerberg’s disregard for potential harm his company causes children/teens?
8. Who is Andy Stone? What do you think of his response to the allegations made in the lawsuit? (see paragraphs 15-17)
9. a) How many of Meta’s social media apps do you use? – How much time do you spend on social media?
b) Would the report on this causal evidence against Meta cause you to deactivate your accounts? Explain your answer.
10. A November 7 news article reported that Meta knowingly allowed scammers to run ads on its sites because they accounted for 10% of its revenue – even though scammers harm Meta’s users. (See “Meta relied on scam ad profits to fund AI, report shows”)
—Meta’s company mission is “building community and bringing the world closer together.” The company also has a vision of bringing the metaverse to life to help people connect, find communities, and grow businesses.
—Google owns YouTube. Google’s former motto was “Don’t be evil,” which was replaced in 2015 by the parent company, Alphabet, with “Do the right thing.” The new phrase encourages employees to “follow the law, act honorably, and treat each other with respect.”
Results of a search for “Does YouTube cause harm” note in part: “Studies and mental health experts indicate that YouTube can cause harm, particularly concerning addiction, exposure to inappropriate content, and negative impacts on mental and physical health, especially among children and adolescents.”
Chinese company TikTok’s mission statement is: “Inspire Creativity & Bring Joy.”
Partial results from a search for “Does TikTok cause harm” reveal:
-Excessive use of TikTok has been linked to higher levels of anxiety, depression, and psychological distress, especially in young users. In one study, even short-term usage of 20 minutes was associated with an increase in anxiety and depression scores.
-Addictive Design: The platform’s algorithm is specifically designed to maximize user engagement through personalized, auto-playing videos and intermittent rewards (likes, comments), which can lead to compulsive or addictive use and a loss of the perception of time (often called “doomscrolling”).
a) Do you think each of these companies fulfills its mission?
b) Do the pros of social media use outweigh the cons?
Background
From a Nov. 24 Grok query “Explain the buried evidence details”:
The Buried Evidence: What Happened with Meta’s Internal Research?
On November 21, 2025 (just a couple days ago), a bombshell filing dropped in a major class-action lawsuit against Meta, Google, TikTok, and Snapchat. The suit, brought by U.S. school districts represented by the law firm Motley Rice, accuses these companies of hiding the mental health risks their platforms pose to kids and teens.
The standout allegation? Meta deliberately shut down and buried internal research that provided causal evidence — not just correlation, but direct proof — that Facebook and Instagram were harming users’ mental health. This isn’t some vague whistleblower tip; it’s based on unredacted internal documents obtained through court discovery. Let’s break it down step by step.
The Study: “Project Mercury” and What It Revealed
What was it? In 2020, Meta ran an internal experiment called Project Mercury. They partnered with the survey firm Nielsen to test the real-world impact of their apps. Researchers asked participants to deactivate their Facebook and Instagram accounts for one week and then surveyed them on mental health metrics.
The shocking findings: To Meta’s dismay, the results showed clear causal harm. People who took the break reported:
- Lower levels of depression.
- Reduced anxiety.
- Less loneliness.
- Decreased social comparison (that FOMO feeling when you’re scrolling through everyone else’s “perfect” lives).
Why “causal”?
This wasn’t just observational data (like “teens who use IG a lot seem sadder”). The deactivation setup created a controlled “before and after,” showing that the apps themselves were driving the negative effects. Internal docs called it a “causal impact on social comparison.” …
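To see why this kind of setup supports a causal claim while an observational comparison does not, here is a minimal, purely illustrative Python sketch. The design mirrors the article’s description of Project Mercury (one group takes a week off the apps, a comparison group keeps using them), but every number and the analysis itself are invented for illustration; this is not Meta’s or Nielsen’s actual data or method.

```python
# Illustrative only: hypothetical data showing how a deactivation
# experiment supports causal claims, unlike observational comparisons.
import random
import statistics

random.seed(0)

# Hypothetical well-being survey scores (higher = more anxiety/depression).
# Because participants are randomly assigned, the two groups start out alike,
# so any gap after a week can be attributed to the deactivation itself.
control = [random.gauss(50, 10) for _ in range(500)]      # kept using the apps
deactivated = [random.gauss(46, 10) for _ in range(500)]  # one-week break
# (The 4-point gap built into these fake numbers is made up.)

effect = statistics.mean(control) - statistics.mean(deactivated)
print(f"Estimated causal effect of a one-week break: {effect:.1f} points")
```

In an observational study, people who already feel worse might simply use the apps more, so a correlation proves nothing about direction; random assignment removes that ambiguity, which is why the internal documents’ word “causal” matters.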
How Meta “Buried” It
Immediate shutdown: Instead of digging deeper or sharing the results (which could have informed safety features or public warnings), Meta halted the entire project. No follow-up studies, no publications—just crickets.
The excuse: Internally, they dismissed the findings as “tainted by the existing media narrative” around Big Tech scrutiny. Basically, “This can’t be real because it makes us look bad.” But privately, researchers pushed back hard. One staffer messaged Nick Clegg (Meta’s former head of global public policy) with: “The Nielsen study does show causal impact on social comparison”—complete with a sad-face emoji.
Cover-up in public: Despite this damning internal knowledge, Meta told Congress it had “no ability to quantify whether its products were harmful to teenage girls.” That’s the kind of double-speak that fuels conspiracy vibes.
Broader Allegations in the Filing
This isn’t isolated — the lawsuit paints Meta as prioritizing growth over safety:
Weak safeguards: They allegedly designed youth protection features (like parental controls or time limits) to be ineffective and underused on purpose. Why? Because robust ones might cut into engagement (and ad revenue).
Blocked testing: Meta nixed experiments on stronger safety tools, fearing they’d hurt “growth” metrics.
Other platforms: The filing hits Google, TikTok, and Snapchat too, but Meta’s section is the meatiest. Accusations include encouraging under-13s to sign up, ignoring child sexual abuse material, and pushing teen usage during school hours.
Meta’s Defense
Meta’s spokesman, Andy Stone, fired back on November 22, calling the study’s methodology “flawed” and insisting they’ve been proactive for over a decade: “We have listened to parents, researched issues that matter most, and made real changes to protect teens.” They claim the full court record will vindicate them. Skeptical? Yeah, given the internal docs contradict that narrative.
Why This Matters (Especially for Teens)
If true, this is a gut punch — Meta knew their apps could spike depression and anxiety in young users but chose silence over solutions. It’s not just about one study; it’s evidence of a pattern where profits trump well-being. The lawsuit could force real accountability, like mandatory safety audits or age-based restrictions. For now, it’s a reminder: That endless scroll? It might be engineered to keep you hooked, even if it messes with your head.
This story’s blowing up in real-time—expect more leaks as the case heats up. Stay tuned, and maybe log off for a week… just to test it yourself?