Jan 16 2026TECHNOLOGY
AI Chatbots: A New Trick for Stealing Data
Security experts have uncovered a sneaky way to steal information from AI chatbots like Microsoft Copilot. This trick, called Reprompt, lets hackers grab sensitive data with just one click on a seemingly safe link. The worst part? The victim doesn't even need to interact with the chatbot after that
reading time less than a minute