AI/deepfake scam cases are multiplying at an alarming rate

Scammers need only three seconds of a person’s voice to clone it and impersonate them


Wake up and smell the fakes! Artificial Intelligence (AI) has taken an extremely troubling turn: scammers are now using it to con and steal from consumers.

According to the latest fraud statistics from Sumsub, a global verification platform, the top three fraud types in the U.S. and Canada are now:

  • Liveness bypass, a method of fraud where criminals swap in or edit biometric data (34% of fraud in the U.S. and 22% in Canada) 

  • Edited ID card (22% in the U.S. and 24% in Canada) 

  • Forged ID card (13% in the U.S. and 18% in Canada) 

The key element here is digital forgery – the ability of a scammer to capture just enough of a person’s facial or vocal features to build a convincing virtual clone of just about anyone.

According to Sumsub’s data, the number of deepfakes detected in North America has more than doubled. And it could get worse.

Pavel Goldman-Kalaydin, head of AI & ML at Sumsub, warned that the issue is quickly getting out of hand. Anti-fraud and verification providers must work constantly to update their deepfake detection technologies; those that lag behind put both businesses and users at risk.

How deepfakes are created

To create a deepfake, a fraudster uses a real person’s document, taking a photo from it and turning it into a 3D persona, Goldman-Kalaydin explained. 

“We’ve seen a pattern of forced verification globally, when it is visible that a person whose photo is taken or who’s passing the liveness check is doing so involuntarily while being held by others’ force. It is alarming that the proportion of such fraud is growing," Goldman-Kalaydin said.

Disturbingly, much of this is happening without the victims themselves even knowing what’s going on.

"Sometimes the person being verified is obviously unconscious—maybe sleeping, maybe not feeling well, or maybe under the influence of substances. This means that he or she is not actively and agreeably participating in the 'Know Your Customer' process, which may lead to crime and financial fraud,” he added.

ConsumerAffairs asked Sumsub if it could share some examples of how a deepfake might be created. In this series of photos, you’ll see the original – “real” – photo of a person, followed by a deepfake of that person as a woman, then another as a hipster having a drink.

Wait till you see what a scammer can do to your voice in just 3 seconds

Voice cloning scams are becoming more common, too, targeting victims’ money and personal information, according to the Cybercrime Support Network (CSN). And guess what? Cybercriminals need only a few seconds of a person’s voice to create a recording and carry out these AI-powered scams.

Scammers can get these vocal snippets from anywhere, too, so be careful. “They dig up personal info from social media or hacked accounts, then use AI tools to create realistic fake recordings of their target's voice,” CSN expert Ally Armeson said.

“Once they have enough information, scammers can use voice cloning technology to create a convincing fake recording. They may use the recording to make it seem like the loved one is in distress.”

How? As an example, Armeson said a scammer might use the cloned voice to claim that a loved one has been in a car accident, has been kidnapped, or is in jail and needs bail money.

“They may ask for money to be wired or transferred to a specific account to help them out of the situation,” she said.

Protecting yourself against voice cloning scams

It is possible to avoid these scams, but you’ve got to stay informed and take precautions. Armeson offers three tips to anyone who wants to protect themselves against AI-powered emergency scams:

  • Use a family code word. Establish a family code word that only you and your loved ones know. In an emergency, this word can be used to verify that the person on the other end of the line is who they claim to be. Keep the code word private among your family and friends. (The short sketch after these tips shows the idea in code.)

  • Be cautious of unsolicited calls or messages. Don’t trust anyone who claims to be a loved one when they call or text you unexpectedly. Before sending money, verify the person’s identity. To verify, hang up and call the person directly.

  • Be careful with your personal information. Be careful with the content and information you share online – especially videos – and limit the amount of personal information you make public. Make your profiles private so that only friends and family can see what you post.
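
At its core, the code-word tip is a shared-secret check: both sides agree on a word in advance, and a caller must produce it before you act on anything they say. Here is a minimal Python sketch of that idea; the code word and the helper function are illustrative assumptions, not part of CSN’s guidance:

    import hmac

    # Pre-shared "family code word", agreed on in person and never posted online.
    # The value below is a placeholder for illustration only.
    FAMILY_CODE_WORD = "blue heron"

    def caller_is_verified(spoken_word: str) -> bool:
        """Return True only if the caller produced the pre-shared code word.

        hmac.compare_digest compares the strings in constant time, the
        usual way to check a shared secret without leaking timing clues.
        """
        return hmac.compare_digest(spoken_word.strip().lower(),
                                   FAMILY_CODE_WORD.lower())

    # Simulated emergency call: ask for the code word before trusting the voice.
    print(caller_is_verified("Blue Heron"))      # True  -> likely genuine
    print(caller_is_verified("please help me"))  # False -> hang up and call back

The point is the protocol, not the program: a secret shared out of band defeats a cloned voice, because the clone reproduces how someone sounds, not what they know.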
