We were warned. Technology experts said emerging artificial intelligence (AI) will make dangerous scams even more terrifying. It’s already happened in Arizona.
Jennifer DeStefano of Scottsdale, Ariz., almost let the call from the unfamiliar number go to voicemail, but her 15-year-old daughter was on a ski trip and she worried there might have been an accident. When she answered the phone, she was subjected to a parent’s worst nightmare.
In an interview with local TV stations, DeStefano said the first thing she heard was her daughter’s voice, distraught and sobbing. When the alarmed mother asked what happened, the voice said she had “messed up,” and continued crying.
DeStefano said a man came on the line, saying “I’ve got your daughter” and threatened harm to her daughter if she called the police. All the while she said she heard her daughter in the background, crying and asking for help.
“It was never a question of who is this? It was completely her voice. It was her inflection. It was the way she would have cried,” DeStefano told the stations. “I never doubted for one second it was her.”
But it wasn’t. A scammer had recorded the teenager’s voice, most likely from a social media video, and was then able to reproduce it, down to tone and inflection, and make it say whatever they wanted.
Sandy Fliderman, CTO at Industry Fintech, says “deep fakes” like the one DeStefano experienced are dangerous and will become more common.
“Deep fakes are AI-generated videos and voices that impersonate real people like celebrities, politicians, and in recent cases, bankers,” Fliderman recently told us. “As AI primarily learns more about individuals through social media, like family and pet names, they can also generate what are more likely to be correct passwords.”
DeStefano’s story has a happy ending. While the “kidnapper” was lowering his ransom demand from $1 million to $50,000, DeStefano’s friends called 911 and reached her husband, who called their daughter and found she was relaxing in her room.
Slow things down
AI is going to make this emotionally charged scam even more dangerous, but there are some things to consider if it happens to you. Law enforcement experts say the first thing to do is slow things down.
Ask questions. The scammers may know some things about the supposed victim, but if the “victim” is not actually in their custody, there will likely be questions they can’t answer, or will answer incorrectly.
Here’s another red flag: if the call comes from an unfamiliar area code, it’s less likely to be real. A scammer will also go to great lengths to keep you on the line so you can’t check on the whereabouts of the person they claim to be holding for ransom.