You’ve probably seen the ridiculous videos on social media. There’s a celebrity, maybe Sydney Sweeney or Tom Hanks, talking directly to the camera about some product, but something seems a bit off. Maybe their mouth isn’t moving perfectly in sync with their words, or maybe Hanks is trying to sell some sketchy-looking medication you’ve never heard of.
Well, you’re right to be skeptical. Thanks to artificial intelligence technology, celebrities are being used left and right in AI deepfake videos and images for all kinds of scams, hawking products they’ve never actually endorsed. And cybersecurity company McAfee has a new list rounding up the most commonly used celebrity likenesses of the past year.
Scarlett Johansson tops the list, which is particularly interesting given her advocacy against non-consensual AI content. Johansson said she was approached by OpenAI to provide a voice for the company’s AI assistant, but the actress declined. OpenAI went ahead and created a sound-alike voice for one of its demos but appears to have scrapped it after Johansson threatened a lawsuit.
Other big names on the list include Kylie Jenner, Taylor Swift, and Tom Hanks, among others. Hanks wrote a note on Instagram back in August warning that AI fakes were using his likeness to sell medications.
“There are multiple ads over the internet falsely using my name, likeness, and voice promoting miracle cures and wonder drugs,” Hanks wrote. “These ads have been created without my consent, fraudulently and through AI. I have nothing to do with these posts or the products and treatments, or the spokespeople touting these cures.”
Hanks went on to explain that he has Type 2 diabetes and said, “I ONLY work with my board-certified doctor regarding my treatment,” ending his message with an all-caps warning: “DO NOT BE FOOLED. DO NOT BE SWINDLED. DO NOT LOSE YOUR HARD EARNED MONEY.”
Because ultimately, that’s what it’s all about: making money by fraudulently piggybacking on the likeness of well-known people.
The list from McAfee, with the company’s explanations:
The one celebrity who’s not on the list but whom we at Gizmodo see most often? That would be Elon Musk. We even obtained consumer complaints filed with the FTC earlier this year about crypto scams using Musk’s face. They’re everywhere.
The folks at McAfee warn that as AI improves, it will become harder and harder to tell a deepfake from the real thing. So people will need to remain vigilant and apply some critical thinking whenever a supposed celebrity endorsement pops up in their social media feed.
“In a time when celebrity news is part of everyday conversation and accessible with the click of a button, people often prioritize convenience over online safety, clicking on suspicious links promising celebrity content or related goods,” Abhishek Karnik, McAfee’s Head of Threat Research, said in a statement published online.
“But if it sounds too good to be true, it’s worth a second look. With cybercriminals using advanced AI tools to create more convincing scams, the risks are growing, and celebrity names are the perfect bait for curious consumers. That’s why people need to stay vigilant and think twice before clicking.”