eFinder

Meta told to pay $375m for misleading users over child safety

Analysis Summary

Propaganda Score
40% (confidence: 80%)
Summary
A New Mexico court ordered Meta to pay $375 million for misleading users about child safety on its platforms. The verdict, which Meta plans to appeal, cites internal documents and whistleblower testimony about risks to minors. The state alleges Meta's algorithms exposed children to explicit content and predators, while Meta defends its safety measures and recent features aimed at protecting young users.

Topics

  • Child safety
  • Corporate accountability
  • Legal actions against tech companies

Detected Techniques

  • Loaded Language (confidence: 70%)
    Using words with strong emotional connotations to influence an audience.
  • Bandwagon (confidence: 80%)
    Persuading the audience by suggesting that many people already support the idea.
  • Slogans (confidence: 90%)
    Using a brief, striking phrase to provoke an emotional reaction.

Fact-Check Results

“Meta told to pay $375m for misleading users over child safety”
INSUFFICIENT EVIDENCE — No evidence found in archive to confirm or refute the $375m penalty claim
“A jury found that Meta was liable for the way in which its platforms endangered children and exposed them to sexually explicit material and contact with sexual predators”
INSUFFICIENT EVIDENCE — No evidence found in archive to verify jury liability findings against Meta
“The verdict is 'historic' and marks the first time that a state has successfully sued Meta over child safety issues”
INSUFFICIENT EVIDENCE — No evidence found in archive to confirm historicity or first-state-success claim
“Meta was responsible for violating New Mexico's Unfair Practices Act because it misled the public about the safety of its platforms for young users”
INSUFFICIENT EVIDENCE — No evidence found in archive to verify Unfair Practices Act violation claim
“The trial lasted seven weeks, during which jurors were presented with internal Meta documents and heard testimony from former employees about how the company had been aware of child predators using its platforms”
INSUFFICIENT EVIDENCE — No evidence found in archive to confirm trial duration or document/testimony details
“Arturo Béjar, a former engineering leader at Meta, testified to various experiments he ran on Instagram that showed underage users were served sexualized content”
INSUFFICIENT EVIDENCE — No evidence found in archive to verify Arturo Béjar's testimony about content delivery
“State prosecutors showed internal Meta research that, at one point, found 16% of all Instagram users had reported being shown unwanted nudity or sexual activity in a single week”
INSUFFICIENT EVIDENCE — No evidence found in archive to confirm the 16% user reporting statistic
“The total civil penalty of $375m was reached after the jury decided there were thousands of violations of the act, each with a maximum penalty of $5,000”
INSUFFICIENT EVIDENCE — No evidence found in archive to verify penalty calculation methodology
“Meta is also involved in a separate trial in Los Angeles, in which a young woman claims that she became addicted to platforms like Instagram and YouTube as a child because of how they are intentionally designed”
INSUFFICIENT EVIDENCE — No evidence found in archive to confirm separate LA trial allegations
“New Mexico sued Meta in 2023, claiming the company 'steered' young users to content that was sexually explicit, showed child sexual abuse, or exposed them to solicitation of such material and sex trafficking”
INSUFFICIENT EVIDENCE — No evidence found in archive to verify New Mexico's 2023 lawsuit allegations
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew”
PENDING