New AI tricks make crypto scams harder to spot

Worldwide | Mon Apr 27, 2026
A tech founder nearly lost his crypto wallet after a fake Microsoft Teams call that looked entirely real. The call included a face and voice he recognized: someone he'd spoken with before from the Cardano Foundation. Two other people joined, making the setup feel routine. When the video froze, a prompt said his software needed updating. He ran the command, shut his laptop, and only later realized it was a trap.

Social engineers have always exploited trust to trick people, but AI-generated video calls make the con far cheaper to run. Where scammers once needed weeks of text chats to build rapport, a quick deepfake call can do the same job in minutes. Microsoft warned in early 2026 about fake Teams and Zoom files disguised as real updates, and Google's security team found a similar trick using spoofed Zoom meetings and AI-generated executive videos.
The real Cardano contact later said his Telegram had been hacked, but the damage was already done. The scammer kept playing the role, asking to reschedule, showing how these attacks can stretch over days.

Crypto scams are getting smarter because AI can mimic voices and faces almost perfectly. Chainalysis reported $17 billion lost to crypto fraud in 2025, with AI-powered scams earning 4.5 times more than old-school tricks. Crypto is a prime target because deals move fast and people trust quick video calls. Attackers don't just steal money; they reuse stolen data to launch new scams.

Some companies are fighting back with verification tools, such as Zoom's "Verified Human" badge, but Gartner warns that half of businesses won't adopt these defenses until 2027. Meanwhile, AI keeps making scams faster and cheaper to run.

The lesson? Never trust a video or voice alone. Use backup checks, such as a pre-agreed passphrase or a call on a known number. Otherwise, one convincing deepfake could be all it takes.
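The pre-agreed passphrase idea doesn't have to mean memorizing a fixed secret that a hacked chat log could expose. One hedged sketch, purely illustrative and not from the article: two parties who have shared a secret in person can each derive a short, date-bound check code with a standard HMAC, then read it aloud at the start of a call. The function name and secret below are hypothetical.

```python
import hmac
import hashlib
import datetime

def daily_code(shared_secret: str, day: datetime.date) -> str:
    """Derive a short verbal check code from a pre-shared secret and the date.

    Both parties compute this independently each day; anyone on a video
    call who cannot read back today's code should be treated as untrusted.
    """
    msg = day.isoformat().encode()  # e.g. b"2026-04-27"
    digest = hmac.new(shared_secret.encode(), msg, hashlib.sha256).hexdigest()
    return digest[:6]  # six hex chars: short enough to say aloud

# Hypothetical secret, agreed on in person, never sent over chat apps.
secret = "example pre-shared passphrase"
today_code = daily_code(secret, datetime.date(2026, 4, 27))
```

Because the code changes daily and the secret never travels over Telegram or Teams, a compromised chat account alone isn't enough to pass the check.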
https://localnews.ai/article/new-ai-tricks-make-crypto-scams-harder-to-spot-f07ed9e5
