Voice-Cloning AI: A New Tool for Scammers

Scammers have a new trick up their sleeves: voice-cloning AI tools that let them sound like a victim’s relative in desperate need of financial help. The technology can simulate a person’s voice from just a snippet of audio, allowing fraudsters to clone a voice and make it say anything. The consequences can be dire, as some Canadian families learned the hard way, losing thousands of dollars to scammers using voice-cloning AI.

How does this technology work?

New generative AI tools can create all manner of output from simple text prompts: essays written in a particular author’s style, images worthy of art prizes, and, given just a small audio sample, speech that sounds like a particular person. In January, Microsoft researchers unveiled a text-to-speech AI tool called VALL-E that could simulate a person’s voice from only a three-second audio sample. Microsoft did not release the code to the public, however, citing the risks of misuse, such as spoofing voice identification or impersonating a specific speaker.

Despite those risks, similar technology is already out in the wild, and scammers are taking advantage of it. A Facebook video or a TikTok recording containing as little as 30 seconds of your voice can be enough for scammers to clone it and make it say anything. Hany Farid, a digital forensics professor at the University of California, Berkeley, warned that voice-cloning AI is now so advanced that scammers no longer need much audio to clone a person’s voice.

Are there any victims?

A Canadian family fell victim to scammers using voice-cloning AI and lost thousands of dollars, the Washington Post reported. The scammer impersonated a lawyer who told the elderly parents that their son had killed an American diplomat in a car accident, was in jail, and needed money for legal fees. The scammer then purportedly handed the phone to the son, who told the parents he loved and appreciated them and needed the money. The cloned voice sounded “close enough for my parents to truly believe they did speak with me,” the son, Benjamin Perkin, told the Post. Believing they were helping their son, the parents sent more than $15,000 to the scammers through a Bitcoin terminal.

VoiceLab

One company that offers a generative AI voice tool, ElevenLabs, tweeted that it was seeing an increasing number of voice-cloning misuse cases. The next day, it announced that voice cloning would no longer be available to users of the free version of its tool, VoiceLab; access would require a paid plan verified with a payment card. The company acknowledged that card verification would not stop every bad actor, but said it would make users less anonymous and force them to think twice.

In Conclusion

As voice-cloning AI continues to advance, the risks of voice fraud and other forms of impersonation will only grow. The onus is on individuals and organizations to protect themselves, for example by enabling two-factor authentication on their accounts and treating unexpected requests for money or sensitive information with caution. As with any new technology, it’s crucial to stay informed and take steps to mitigate the risks.

Megan O'Connor
Megan is a big data analyst based in Dublin. With a strong background in both data analysis and AI, she is well-equipped to tackle complex problems. In her free time, she enjoys traveling and exploring new cultures.
