If you see your favorite celebrity promoting some sort of product or deal online, think twice before giving away your credit card details. Celebrity deepfakes have been scamming Americans out of billions, and supporters of mega-celebrities like Taylor Swift and Oprah have fallen for it. Experts dove a little deeper with Newsweek on how it’s done and what to do about it if it happens to you.
Altogether, Americans have lost $2.7 billion to these social media scam tactics, according to the Federal Trade Commission. After scammers manipulate a celebrity’s likeness, social media users find themselves handing over their financial information and losing hundreds, if not thousands, of dollars.
One scam uses a synthetic but believable voice and video of Taylor Swift speaking to suggest she is giving out free Le Creuset products. The ad then asks users to enter their banking information to pay shipping fees, after which multiple charges are secretly tacked on.
Another scam showed what seemed to be Luke Combs selling weight-loss gummies. Yet another had Tom Hanks advertising dental plans.
Oprah was also dubbed in an online video saying she, too, would be gifting cookware sets to fans if they just provided their personal information. And in the current day and age, it might not be immediately obvious that it’s a fake at all.
On Complain.biz, one social media user said they lost $800 to a fake Michael Saylor bitcoin giveaway.
“The video was a deepfake, super convincing, and I totally fell for it,” they wrote online. “Sent my BTC hoping to double it, but it was a scam. Later I found out that Michael Saylor himself warned about these scams on X.”
Saylor, the MicroStrategy executive chairman, reportedly has to remove 80 AI deepfake videos daily.
“It’s important to underscore the sophistication and believability of the technology,” Tom Blok, the founder of the crypto complaints website Complain.biz, told Newsweek. “Deepfake technology has advanced to a point where it’s becoming increasingly difficult to distinguish real from fake. I have watched the video myself, and even though I knew it was a deepfake I barely could notice it.”
TikTok has become an especially popular medium for scammers, who can easily target a wide range of younger, unaware audiences.
“Notorious personalities such as Elon Musk, Mr. Beast, Sam Altman, Warren Buffett, Joe Rogan, Donald Trump and Tucker Carlson are impersonated in fraudulent endorsements about cryptocurrency exchanges,” Luís Corrons, a security specialist at Avast, told Newsweek. “These fabricated endorsements lure users with promises of substantial bitcoin rewards, setting the stage for the scam.”
Deepfake scams are also invading non-English speaking spaces, with many reported in German, Italian and French as well.
How to Protect Yourself
While you’re online, experts say you should be suspicious if a video of a celebrity looks dated or shows only their top half.
“If you see only the head and shoulders of a celebrity who is talking, be suspicious right off the bat,” David Notowitz, the president and founder of NCAVF and a deepfake and AI image expert, told Newsweek. “Head and shoulders have already been developed pretty well, but limbs are difficult to create.”
Another common scam tactic is sending a video that looks or sounds like a family member or friend in need of help. In these cases, establishing a specific code word that you would use if you actually needed help can be useful, especially for those who are less tech-savvy.
“Consumers are growing rapidly skeptical of what they are seeing online with the rise of AI being used maliciously, rightfully so,” Zulfikar Ramzan, the chief scientist at intelligent safety solutions company Aura, told Newsweek.
“To ensure your loved ones are safe from AI deepfake and voice cloning scams, create a keyword that only your family members know about. If you suspect you’re being scammed, ask for the keyword. If they don’t know it, it’s not one of your family members in danger.”
To verify even the most convincing videos of celebrity deals, though, you should always double-check the celebrity’s official, verified channels.
If something seems too good to be true, it probably is, Blok said.
Beyond the financial pain deepfakes inflict on consumers, they also undermine the basic notion that you can trust what you see with your own eyes.
“These deepfake scams aren’t just financially damaging,” Blok said. “They erode trust in digital content, making it also harder for legitimate information to be trusted.”
Congress has proposed two new laws to address the rising prominence of deepfakes online: the No Fakes Act and the Deepfakes Accountability Act. Neither has been passed yet, though.
Today, only California and Florida have any legislation surrounding AI.