You may very well get a call in the near future from a relative in dire need of help, asking you to send them money quickly. And you might be convinced it's them because, well, you know their voice.

Artificial intelligence changes that. New generative A.I. tools can create all manner of output from simple text prompts, including essays written in a particular author's style, images worthy of art prizes, and, with just a snippet of someone's voice to work with, speech that sounds convincingly like a particular person.

In January, Microsoft researchers demonstrated a text-to-speech A.I. tool that, when given just a three-second audio sample, can closely simulate a person's voice. They did not share the code for others to play around with; instead, they warned that the tool, called VALL-E, "may carry potential risks in misuse," such as "spoofing voice identification or impersonating a specific speaker."

But similar technology is already out in the wild, and scammers are taking advantage of it. If they can find 30 seconds of your voice somewhere online, there's a good chance they can clone it and make it say anything.

"Two years ago, even a year ago, you needed a lot of audio to clone a person's voice. Now, if you have a Facebook page, or if you've recorded a TikTok and your voice is in there for 30 seconds, people can clone your voice," Hany Farid, a digital forensics professor at the University of California at Berkeley, told the Washington Post.

'The money's gone'

The Post reported this weekend on the peril, describing how one Canadian family fell victim to scammers using A.I. voice cloning and lost thousands of dollars. Elderly parents were told by a lawyer that their son had killed an American diplomat in a car accident, was in jail, and needed money for legal fees.

The supposed attorney then purportedly handed the phone over to the son, who told the parents he loved and appreciated them and needed the money. The cloned voice sounded "close enough for my parents to truly believe they did speak with me," the son, Benjamin Perkin, told the Post.

The parents sent more than $15,000 through a Bitcoin terminal to, well, scammers, not to their son, as they thought.

"The money's gone," Perkin told the paper. "There's no insurance. There's no getting it back. It's gone."

One company that offers a generative A.I. voice tool, ElevenLabs, tweeted on Jan. 30 that it was seeing "an increasing number of voice cloning misuse cases." The next day, it announced that the voice cloning capability would no longer be available to users of the free version of its tool, VoiceLab.

"Almost all of the malicious content was generated by free, anonymous accounts," it wrote. "Additional identity verification is necessary. For this reason, VoiceLab will only be available on paid tiers." (Subscriptions start at $5 per month.)

Card verification won't stop every bad actor, it acknowledged, but it would make users less anonymous and "force them to think twice."
