Deep fakes in politics – when seeing is no longer believing

January 28, 2024

We have well and truly entered an age where you cannot trust what you see online. 

While that statement has been partially true for decades, AI has elevated content manipulation to new levels, massively outpacing public awareness.

AI deep fake technology can create and alter images, videos, and audio recordings, putting words into the mouths of public figures or making them appear in situations that never occurred.

In many cases, it takes more than a second glance to determine the authenticity of content, and fake media can accumulate millions of impressions before being identified.

We’re now witnessing deep fakes with the potential to disrupt democratic processes, though it’s too early to measure any tangible impact on voting behavior.

Let’s examine some of the most notable political AI deep fake incidents we’ve witnessed to date.

Joe Biden New Hampshire incident

In January 2024, in New Hampshire, US, a robocall mimicking Biden’s voice encouraged voters to “save your vote for the November election,” wrongly suggesting that voting in the primary would inadvertently benefit Donald Trump.

The call appeared to come from the personal cellphone number of Kathy Sullivan, a former state Democratic Party chair. Sullivan condemned the act as a blatant form of election interference and personal harassment.

The New Hampshire Attorney General’s office said this was an illegal effort to disrupt the presidential primary and suppress voter turnout. 

The fabricated audio was identified as having been generated using ElevenLabs, an industry leader in speech synthesis.

ElevenLabs later suspended the account behind the fake Biden voice and said, “We are dedicated to preventing the misuse of audio AI tools and take any incidents of misuse extremely seriously.”

German Chancellor Olaf Scholz deep fake incident

In November 2023, Germany witnessed an AI deep fake falsely depicting Chancellor Olaf Scholz endorsing a ban on the far-right Alternative for Germany (AfD) party.

The deep fake video was part of a campaign by the art-activism group the Center for Political Beauty (CPB) and aimed to draw attention to the AfD’s rising influence. Critiques of the AfD are often set against the backdrop of Germany’s 1930s history.

Led by philosopher and artist Philipp Ruch, the CPB group aims to create “political poetry” and “moral beauty,” addressing key contemporary issues like human rights violations, dictatorship, and genocide.

The CPB has engaged in numerous controversial projects, such as the “Search for Us” installation near the Bundestag, which they claimed contained soil from former death camps and remains of Holocaust victims. 

While support for the AfD has grown, numerous protests across Germany demonstrate strong opposition to its ideologies.

A spokesperson for the group behind the deep fake stated, “In our eyes, the right-wing extremism in Germany that sits in parliament is more dangerous.”

AfD officials called the deep fake campaign a deceptive tactic aimed at discrediting the party and influencing public opinion. 

UK Prime Minister Rishi Sunak implicated in scams

In January 2024, a UK research company found that PM Rishi Sunak’s likeness featured in over 100 deceptive video adverts disseminated primarily on Facebook, reaching an estimated 400,000 individuals.

These ads, originating from various countries, including the US, Turkey, Malaysia, and the Philippines, promoted fraudulent investment schemes falsely associated with high-profile figures like Elon Musk.

The research, conducted by the online communications company Fenimore Harper, highlighted how social media companies fail to respond to this form of content within a reasonable timeframe.

One deep fake ad pulled users to a fake BBC news page promoting a scam investment. Source: Fenimore Harper.

Marcus Beard, the founder of Fenimore Harper, explained how AI democratises misinformation: “With the advent of cheap, easy-to-use voice and face cloning, it takes very little knowledge and expertise to use a person’s likeness for malicious purposes.”

Beard also criticized the inadequacy of content moderation on social media, noting, “These adverts are against several of Facebook’s advertising policies. However, very few of the ads we encountered appear to have been removed.”

The UK government responded to the risk of fraudulent deep fakes: “We are working extensively across government to ensure we are ready to rapidly respond to any threats to our democratic processes through our defending democracy taskforce and dedicated government teams.” 

Pakistan’s former Prime Minister Imran Khan appears in virtual rally

In December 2023, the former Prime Minister of Pakistan, Imran Khan, currently imprisoned on charges of leaking state secrets, appeared at a virtual rally using AI.

Though behind bars, Khan remarkably reached millions through his digital avatar. The rally used footage from past speeches involving his political party, Pakistan Tehreek-e-Insaaf (PTI).

Khan’s four-minute speech spoke of resilience and defiance against the political repression faced by PTI members. 

The AI voice articulated: “Our party is not allowed to hold public rallies. Our people are being kidnapped and their families are being harassed,” continuing, “History will remember your sacrifices.” 

Complicating the situation, the Pakistani government allegedly tried to block access to the rally.

NetBlocks, an internet monitoring organization, stated, “Metrics show major social media platforms were restricted in Pakistan for [nearly] 7 hours on Sunday evening during an online political gathering; the incident is consistent with previous instances of internet censorship targeting opposition leader Imran Khan and his party PTI.”

Usama Khilji, a proponent of free speech in Pakistan, commented, “With a full crackdown on PTI’s right to freedom of association and speech via arrests of leadership, the party’s use of artificial intelligence to broadcast a virtual speech in the words of its incarcerated chairman and former Prime Minister Imran Khan marks a new point in the use of technology in Pakistani politics.” 

Fake audio of former Sudanese president Omar al-Bashir on TikTok

An AI-powered campaign on TikTok exploited the voice of former Sudanese President Omar al-Bashir amid the country’s ongoing civil turmoil. 

Since late August 2023, an anonymous account has posted what it claims are “leaked recordings” of al-Bashir. However, analysts determined the recordings were AI-generated fakes.

Al-Bashir has been absent from the public eye since he was ousted from power in 2019 amid serious war crime allegations.

Slovakia’s election day audio scam

On the day of Slovakia’s election, a controversial audio clip surfaced featuring deep fakes of the voices of Michal Šimečka, leader of the Progressive Slovakia party, and journalist Monika Tódová discussing corrupt practices like vote-buying.

It surfaced during Slovakia’s pre-election media blackout, so the implicated individuals couldn’t easily refute their involvement in public before polls closed.

Both implicated parties later denounced the recording as a deep fake, which a fact-checking agency confirmed. 

Volodymyr Zelenskiy’s deep fake

In 2022, a deep fake video of Ukrainian President Volodymyr Zelenskiy, which crudely suggested he was calling on soldiers to abandon their posts, was quickly identified as fake and removed by major social media platforms.

Turkish election deep fake drama

In the lead-up to Turkey’s parliamentary and presidential elections, a video falsely showing President Recep Tayyip Erdoğan’s main challenger, Kemal Kılıçdaroğlu, receiving support from the Kurdistan Workers’ Party (PKK) was spread online.

Donald Trump deep fakes

In early 2023, we witnessed realistic-looking deep fakes of Donald Trump being arrested and a campaign video by Ron DeSantis featuring AI-generated images of Trump embracing Anthony Fauci. 

AI-generated images of Trump being arrested caused a ruckus in March 2023.

Belgian political party’s Trump deep fake

An earlier incident, in 2018, saw a Belgian political party spark public outcry with a deep fake video of its own making.

The video falsely depicted President Donald Trump advising Belgium to withdraw from the Paris climate agreement. 

The party’s media team later acknowledged that the video was a high-tech forgery. The incident demonstrated how deep fakes can be used to fabricate statements by world leaders to influence public opinion and policy.

Deep fake of Nancy Pelosi

A manipulated video of Nancy Pelosi in 2020, made to appear as though she was slurring her words while intoxicated, spread rapidly on social media.

This demonstrated the potential of deep fakes to discredit and embarrass public figures, and the damage often persists even after the content is exposed as fake.

Audio deep fake of Keir Starmer

Another incident in British politics involved an audio clip allegedly capturing opposition leader Sir Keir Starmer swearing at his staff. 

The clip, widely circulated on social media, was later revealed to be an AI-generated deep fake.

Even as tech companies explore ways to tackle deep fakes at scale, the AI models used to create fake media will only become more sophisticated and easier to use.

The journey ahead demands a collaborative effort among technologists, policymakers, and the public to harness AI’s benefits while safeguarding our society’s pillars of trust and integrity.

Trust in politics and public institutions is already flimsy, to say the least. Deep fakes will further undermine it.


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
