AMPYX CYBER


Are scam callers using a Donald Trump copycat voice to trick you out of money?

BY KERRY TOMLINSON, AMPERE NEWS

He calls himself Wayne, but for many listeners, it's really "The Donald."

Criminals launched a series of robocalls with AI-generated voices designed to make you think someone is making fraudulent charges on your account.

This time, they're employing a voice that sounds suspiciously like a deepfake of Donald Trump.


Wayne from Walmart

Has Donald Trump taken a new job as a customer service rep at one of the world's largest companies?

No, but scammers may be trying to get your attention with a copycat voice.

The recorded call begins by saying, "This is Wayne from Walmart." But for many, it sounds like Trump, or at least a deepfake version of him.

The goal of the fake voice is to scare you into pressing one and speaking to a live representative about misuse of your account.

Computer science professor V.S. Subrahmanian at Northwestern University said criminals may have been drawing on the speaking skills of top political figures.

"They became president because they were good orators. They could speak well. They could hold people's attention. And perhaps that could be a reason," said Subrahmanian. "Without having the perpetrators arrested, it's hard to tell."

Inside the scam

In this scam, the attackers want you to believe you're going to be wrongly charged almost $1,000 for a PlayStation 5 you didn't order, and you need to talk to a rep to get the charge removed.

"To cancel your order or to connect with one of our customer support representatives, please press one," the Trump-like voice says.

Press that number and you'll likely get connected to a fake rep who manipulates you into giving up your credit card number and/or going to a fake refund site. Some fake refund sites can download malware onto your devices that can steal your bank passwords --- and the money in your account.

Stealing voices

If it is indeed a Trump deepfake, it would not be the first time shady dealers have deployed his voice as a lure.

Criminals made a deepfake of Trump to promote a cryptocurrency scam, security company Elliptic reported in August.

"We are giving away bitcoins and ethereums [sic] to you… click the link to claim yours," the deepfake Trump said in a video, according to Elliptic.

Another scam video showed a deepfake Trump offering up "American Monetary Fund" or AMF checks that could be cashed for $10,000 after the election, Palo Alto Networks reported in a blog post update in September.

In addition, a deepfake Trump voice promoted "Free Trump gold" in an ad on YouTube last fall. At the end of the video, a screen popped up with a note in small print saying the voice was computer-generated and that the real Trump was not affiliated with the ad.

Other fakes

His opponent, Kamala Harris, was also the target of deepfakes.

One deepfake of a podcast interview dipped into vulgarity, with a fake Harris saying she would become the "H-- in Chief" in the White House.

Another fake Harris campaign ad features video taken from a UK home insurance ad. The repurposed video shows a boy wearing a dress and makeup and causing havoc in a home. The deepfake voice of Harris expresses support for abortion rights and ends with a fake offensive comment about children. As the boy stands on a table and throws out handfuls of glitter, the voice says that abortion rights are important "so we don't wind up living with a child like this."

"Skepticism and verification are key," Subrahmanian said.

Detection system

Subrahmanian and his team created the Global Online Deepfake Detection System to identify deepfakes for journalists.

They use technology and human analysts to analyze and verify audio and video clips, like the "Wayne from Walmart" call, which they concluded was AI-generated. They also look at the context, or outside information, surrounding the clip in question.

Don't rely on what you see and hear, Subrahmanian advised. Criminals, candidates, and foreign governments are using deepfakes to try to manipulate you.

During the East Coast hurricanes, AI-generated pictures of Trump and Joe Biden heroically wading through deep water appeared on social media.

"Every time you see a picture like that, you've got to be extremely skeptical. Or a video like that," Subrahmanian said.

Do you believe?

There is another factor at play. The fake does not have to be high quality to work on you. You may consciously or subconsciously want to believe it's real, so you'll accept the fake as reality.

The solution? When it comes to deepfakes related to politics and/or money, put yourself on pause. Don't share or pay until you can verify through many sources, not just some accounts on social media or one media outlet.

And if you get a call from a fake Trump with a side job at Walmart?

Cut him off, Subrahmanian said, before he can pull his scam on you. 

"If you don't believe a conversation right from the beginning," he said, "disengage. Hang up."

What to do

This call is part of a series of fake Walmart calls that may have started in the spring and continued through the fall. A recording of the Trump soundalike first appeared on the Nomorobo website in July 2024.

Criminals used more than a half dozen different AI-generated voices, but only one sounded like such a well-known person.

This story includes advice on what to do if you get this fake Walmart call and this story provides recommendations on how to deal with deepfakes.
