The phone rings. The caller claims to be one of your relatives – often a grandchild – and they’re in a panic.
They’re in some sort of distress – a car accident, or they've been arrested – and they need you to send money ASAP. And, of course, you – grandma or grandpa – do what any grandparent would do: whatever you can to help.
Yep, it’s the “grandparent scam,” but it’s not the original version. It’s evolved thanks to artificial intelligence and voice cloning.
In the year since ConsumerAffairs first reported on the anticipated threat of voice cloning, the grandparent scam has become so convincing that a man fatally shot an Uber driver because he wrongly assumed she was part of a scheme to extract $12,000 in supposed bond money for a nephew. Even consumer professionals aren’t immune: watch this first-person account of how a consumer advocate in the fight against fraud wound up a victim.
Business is good for the scammers. One grandparent scam ring in Canada reportedly hauled in more than $2 million before its members were tracked down and arrested.
AI makes these scams more convincing because, once scammers capture a few seconds of a family member's voice, they can make that "family member" say anything simply by typing out the dialogue.
How to stay safe in this AI-driven world
With robocalls, caller ID and call-screening apps did a pretty good job of cutting scammers off at the pass – but modern software can spoof phone numbers. When a call appears to come from a familiar number, many people pick up, assuming it’s safe.
But cybersecurity watchers are pushing something new that scammers may not be able to get around: “AI safewords.”
The concept is gaining traction and is easy to put into play: encourage family members and close friends to establish a unique phrase or challenge question that only they would know, so you can verify a caller’s identity over the phone.
The other safeguard all of us can take is removing videos we’ve posted to social media.
"If, for example, your Facebook is open, you don't have it locked down so anyone can kind of peruse your profile," Truman Kain, a cybersecurity expert from Huntress, told FOX23 News. "If there are even 15 seconds of audio available on the internet of the person whose voice you're trying to emulate, that can be enough."
In Kain's opinion, people shouldn't leave their social media profiles open to the public; those privacy settings need to be locked down. And here’s a video that shows you how to do just that: