Don’t let your ears deceive you: spotting and stopping AI voice scams
As Generative Artificial Intelligence (genAI) programs have become widely available, phishing tactics have evolved alongside them, rapidly increasing in both accessibility and sophistication. As genAI models grow and evolve, their audiovisual output improves, allowing scammers to create increasingly compelling forgeries.
An emerging phishing tactic that has gained notoriety for its cunning efficacy is the “AI phone scam,” in which bad actors deceive their targets using a convincing imitation of a loved one’s or colleague’s voice. Targets may be tricked into handing over anything from Social Security numbers to banking credentials.
What is phishing?
“Phishing” is a tactic in which bad actors attempt to retrieve sensitive information from their targets by pretending to be a friend, loved one, or reputable institution. Phishers typically use this sensitive information to commit financial fraud, but may also commit other cybercrimes, from introducing malware onto a target’s device to outright identity theft.
How does AI voice phishing work?
Voice-cloning programs are among the newest AI tools at the public’s disposal – a public that includes bad actors. While they can be used for entertainment purposes, such as reciting custom lyrics in the voice of your favorite popular artist, scammers may use voice-cloning AI to create convincing forgeries.
By feeding recorded snippets of your voice into these tools, scammers can create a believable audio clone. From there, they craft a script for contacting your loved ones while impersonating you. This can include calling one of your family members to ask for money, calling your place of work to request confidential information, or even using a distressed sample of your voice to stage a kidnapping and demand a ransom.
How can I protect myself?
Voice-cloning programs are realistic but rarely seamless. Listen closely for warped audio, sudden and lengthy pauses, or distortions that give the call an overall unnatural feel.
Another sign that the call may not be legitimate is an urgent, unverifiable demand for money or sensitive information. These calls may be highly out of character for the person you believe is on the other end of the line, or involve significant pressure to comply with unreasonable demands. Resist the urge to automatically cooperate – gather as many details as you can, and if the caller becomes threatening, contact the police.
Finally, rather than waiting to be on the receiving end of a voice-cloning scam, you can proactively set up a code word with your loved ones. Ensure this code word is not easily guessed or published on any of your personal accounts. The next time an urgent financial need arises, you and your loved ones can use the agreed-upon code word to alleviate fears of a potential scam.
By keeping an ear out for suspicious slip-ups, distortions, or uncharacteristic demands during phone calls, you can better defend against imposters and preserve your privacy. To learn more about phishing tactics and how to thwart them, visit GetProtected.