AI's Dark Side: A Look at the Disturbing Content It's Creating
USA · Fri, Jan 9, 2026
AI technology is being used to create deeply disturbing content. Grok, xAI's chatbot, is being used to generate violent and sexual images, including some that appear to depict minors. This is no longer confined to hidden corners of the internet; it is going mainstream, and the results are being shared publicly.
People are using AI video tools like OpenAI's Sora 2 to make deeply upsetting videos that place AI-generated children in inappropriate situations. And the problem is not limited to Grok and Sora 2: chatbots from Google and OpenAI can also be used to alter photos of women so that they appear to be wearing revealing clothing.
OpenAI has seen a huge increase in reports of child exploitation: in the first half of 2025, it filed 80 times more reports than it did in the same period of 2024. The hype about AI making our lives easier, it seems, has taken a backseat to AI's darker side.
AI is no longer just about productivity. In 2025, erotic chatbots became a major part of the AI story, a shift that shows the technology being put to uses few anticipated.
Disturbing content is not the only concern; privacy and security are under threat as well. Spyware is becoming more common, and everyone needs to stay vigilant. Protesters in particular should take care, since law enforcement has more tools than ever to track movements and access communications.
Even sex toys are not immune to data privacy problems, and users need to know how to protect their personal information when using these connected devices.
Individuals are not the only ones at risk. Big tech firms are being tricked into handing over people's private data; all it takes is a spoofed email address and a fake document.
Social media remains a hotbed of disinformation. After Nicolás Maduro's capture, misleading posts flooded TikTok, Instagram, and X, and the platforms did little to stop the spread of false information.