Whistleblowers have pulled back the curtain on a high-stakes “arms race” between social media giants. The core allegation? Meta and TikTok have systematically traded the safety of their youngest users for “engagement” and market dominance.
1. Meta: The “Placebo” Safety Strategy
Former insiders, including Arturo Béjar and Jason Sattizahn, allege that Meta’s leadership didn’t merely ignore safety problems—they actively suppressed evidence of harm.
- Tampering with Data: Claims suggest Meta “doctored” internal research to hide the prevalence of grooming and sexual harassment in VR environments.
- Ineffective Tools: Safety features were allegedly kept as “placebos”—tools that look good in PR statements but are unmaintained or ineffective in practice.
- The AI Pivot: To cut costs, Meta is replacing human moderators with AI, despite internal warnings that automated systems cannot yet reliably detect complex child exploitation.
2. TikTok: Engineering Addiction & Extremism
TikTok’s algorithm is accused of being a “propaganda battlefield” designed to trigger high-arousal emotions like outrage to keep users scrolling.
- Affective Conditioning: Rather than showing users what is accurate, the algorithm prioritizes what is emotionally charged.
- Extremist Rabbit Holes: Investigations in 2026 found that the algorithm frequently pushes far-right extremist content to new users because polarizing content generates the most “watch time.”
- Moderation Vacuum: Following massive layoffs of human staff, whistleblowers warn that the platform is now a “wild west” for self-harm content and illegal trade.
The Global Crackdown
The era of self-regulation appears to be ending. Governments are now using these whistleblower testimonies as “smoking guns” in court:
| Jurisdiction | Legal Response |
| --- | --- |
| United States | Over 40 states are suing for billions, alleging intentional “addictive design.” |
| European Union | Invoking the Digital Services Act to fine platforms for using deceptive “dark patterns.” |
| United Kingdom | Ofcom is enforcing the Online Safety Act, requiring platforms to prove their algorithms aren’t toxic. |
“They figured kids drive engagement, and engagement makes them cash. Everything else was secondary.” — Jason Sattizahn, Former Meta Researcher