Big Tech's Dark Side: Profits Over Kids' Safety

USA · Sat Nov 29 2025
A recent court filing has brought to light shocking claims against Meta, the company behind Facebook and Instagram. It alleges that Meta had a policy allowing accounts engaged in sex trafficking to post harmful content up to 16 times before being suspended; suspension came only on the 17th violation, which is why the filing calls it the "17x" policy. This is just one of many accusations in a lawsuit claiming that Meta, along with other tech giants such as Google, Snapchat, and TikTok, prioritized profit over the safety of children.

The lawsuit, filed by children, parents, school districts, and states including California, accuses these companies of intentionally designing their products to addict children despite knowing the harm this causes. The filing claims that the "17x" policy was just one way Meta turned a blind eye to the safety of young users, and that Meta's "outright lies" about the harms of its products prevented even the most vigilant parents and teachers from understanding the dangers. Despite earning billions in annual revenue, the filing claims, Meta refused to invest enough resources in keeping kids safe.

Meta has denied these accusations, stating that it has been making real changes to protect teens for over a decade. However, the filing cites internal communications and research reports that appear to contradict this claim. For instance, it alleges that in 2020 Instagram had no way for users to report child sexual abuse material, and that even when AI tools identified such content with 100% confidence, Meta did not automatically delete it. The filing also claims that Meta's recommendation features were responsible for a significant number of inappropriate adult-minor connections, and that Meta delayed making kids' accounts private by default for years, allowing billions of unwanted interactions to occur, because the company projected that the change would cut daily users by 2.2%.
The lawsuit also takes aim at Meta's approach to children's mental health, claiming that the company's products have contributed to a youth mental health crisis in schools nationwide. Internal messages cited in the filing suggest that Meta knew its products were addictive and harmful but chose to prioritize user engagement over safety.
https://localnews.ai/article/big-techs-dark-side-profits-over-kids-safety-60152ed2

questions

    How might the delay in implementing privacy features for teens' accounts be justified from a business perspective?
    How might the focus on child safety in the lawsuit be used to advance broader agendas related to technology and society?
    Could the delay in implementing privacy features for teens' accounts be a deliberate strategy to maximize data collection?