The AI political campaign is here

Donie O'Sullivan and Yahya Abou-Ghazala

May 3, 2023

President Barack Obama’s 2008 election campaign has often been celebrated as the first to effectively use social media as a mobilization tool to capture the White House. In the 15 years since, the technology has gone from being a novel addition to a political campaign to permeating every aspect of one.

Now, a transformative and largely untested technology looks set to revolutionize political campaigning: artificial intelligence. But the computer-generated content, which blurs the line between fact and fiction, is raising concerns ahead of the 2024 presidential election.

The Republican National Committee threw down the gauntlet last week when it released a 30-second advertisement responding to President Joe Biden’s official announcement that he would seek reelection in 2024.

The ad, uploaded to YouTube, imagined a dystopian United States after the reelection of the 46th president, presenting stark images of migrants flooding across the US border, a city under lockdown with soldiers patrolling the streets, and Chinese jets raining bombs on Taiwan.

But none of the foreboding images in the video were real – they were all created using AI technology.

Last week, CNN showed the ad to potential voters in Washington, DC. While some were able to identify the images in it as fake, others were not. After watching scenes of heavily armed military personnel patrolling the streets of San Francisco during a lockdown sparked by surging crime and a “fentanyl crisis,” one person CNN spoke to was left wondering whether the imagined episode had actually happened.

Therein lies the problem, said Hany Farid, a digital forensic expert and professor at the University of California, Berkeley.

Imagined realities and deceptive ads are nothing new in political campaigns. Lyndon B. Johnson’s 1964 presidential campaign brought forth the so-called “Daisy Girl” attack ad, which imagined a nuclear apocalypse were his opponent Barry Goldwater to win.