New Delhi: Meta allegedly shut down internal research into the mental health effects of Facebook after finding causal evidence that its products harmed users, according to news reports based on unredacted filings in a lawsuit brought by US school districts against Meta and other social media platforms.
In a 2020 research project code-named “Project Mercury,” Meta scientists worked with survey firm Nielsen to examine the effect of “deactivating” Facebook. Internal documents reportedly showed that “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.”
Rather than publishing the findings or pursuing further research, Meta is said to have called off the project and internally described the negative study results as influenced by the “existing media narrative” around the company.
Privately, however, staff reportedly maintained the conclusions were valid. “The Nielsen study does show causal impact on social comparison,” an unnamed staff researcher allegedly wrote. Another staffer expressed concern that keeping the negative findings quiet was akin to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite its own research showing a causal link between Facebook use and negative mental health effects, Meta is alleged to have told Congress it could not quantify whether its products harmed teenage girls.
Meta spokesman Andy Stone said the study was stopped because its methodology was flawed and that the company had worked to improve the safety of its products. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
The allegation against Meta is part of a broader filing by Motley Rice, a law firm representing school districts nationwide in a suit against Meta, Google, TikTok and Snapchat. Plaintiffs claim the companies intentionally concealed internally recognised risks from users, parents and teachers.
Allegations include encouraging children under 13 to use the platforms, failing to address child sexual abuse content, and attempting to expand social media use among teenagers while at school. The filing also claims the platforms tried to pay child-focused organisations to publicly defend the safety of their products.