As we approach the final stretch before the U.S. presidential elections on November 5, 2024, fake AI-generated celebrity endorsements are becoming a disruptive force, creating confusion among voters. With AI technology evolving at lightning speed, it is easier than ever to generate misleading images, videos, and audio clips that falsely depict celebrities endorsing political figures they have never publicly supported.
One of the most prominent examples came when a photo of Elton John wearing a pink coat emblazoned with “MAGA” surfaced online, suggesting the iconic singer was backing former President Donald Trump. However, the image was entirely fabricated.
AI Endorsements Are a Growing Problem in the 2024 Election Campaigns
Essentially, AI endorsements are a new phenomenon in which artificial intelligence is used to create or alter digital content so that a celebrity appears to endorse a particular political candidate. This can range from simple image manipulation to advanced AI-generated videos in which a celebrity seems to speak with, or interact with, a politician.
One of the most striking examples is the case of Will Smith and Chris Rock. In a viral AI-generated video, the two stars appeared to be dining with Trump, casually eating spaghetti as if they were endorsing his campaign. The video, which garnered over 700,000 views on X (formerly known as Twitter), was entirely fabricated, but the damage had already been done.
Another example involved Taylor Swift, who was falsely portrayed in AI-generated images as a supporter of Trump. Swift was quick to debunk the claims on her Instagram, clarifying her actual political stance and expressing frustration over how AI technology was being used to spread disinformation. Given her massive fanbase, the incident raised significant alarm about the power of AI endorsements to mislead voters.
Why Fake AI Endorsements Are So Dangerous
As we get closer to Election Day, fake AI endorsements are proving to be more than just a nuisance. They are becoming a real threat to the integrity of the democratic process. With millions of voters turning to social media for election-related content, these AI-generated endorsements can quickly distort public opinion.
What makes them so dangerous is that they exploit the trust people place in their favorite celebrities.
For many voters, especially those who may not follow politics closely, seeing a beloved celebrity seemingly endorse a candidate can be enough to sway their opinion. A sudden endorsement from someone like Taylor Swift or Will Smith could make headlines and have a powerful influence on undecided voters. But when that endorsement is fake, it manipulates public sentiment based on false information, which can be catastrophic in an already polarized political climate.
How Are These Fake Endorsements Created?
The technology behind AI endorsements is both fascinating and alarming. AI can now mimic a celebrity’s face, voice, and mannerisms with astonishing accuracy.
Deepfake technology has made it possible to create entirely fabricated videos in which a celebrity appears to be speaking or interacting in ways they never did. These deepfakes are becoming harder to detect, making them a potent tool for spreading disinformation.
In the case of Elton John’s fake MAGA photo, AI technology was used to alter an existing image of the singer, swapping out his usual flamboyant outfit for a pink coat with Trump’s slogan. With just a few clicks, AI tools can now make even the most outrageous scenarios seem plausible.
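To give a sense of just how little effort basic image tampering takes, here is a minimal sketch using the Python imaging library Pillow. This is an illustration only: the actual fake was reportedly AI-generated, and this toy text overlay is nowhere near as sophisticated as a real clothing swap. The placeholder image and coordinates are assumptions for the sake of a self-contained example.

```python
from PIL import Image, ImageDraw

# Stand-in "photo": a solid-color placeholder, since no real
# photograph is bundled with this sketch.
photo = Image.new("RGB", (400, 300), color=(200, 180, 170))

# Overlay slogan text onto the image -- a crude stand-in for the far
# more convincing AI-driven edits described above.
draw = ImageDraw.Draw(photo)
draw.text((160, 140), "MAGA", fill=(255, 255, 255))

photo.save("altered.png")
```

A few lines of code are enough to alter an image; generative models automate the same idea at photorealistic quality, which is why such fakes can spread before anyone checks their provenance.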