Conservatives have long advocated for secure elections, but what about the process leading up to them? I’m referring to campaigns, where Alabamians are bombarded with mail, phone calls, social media ads, yard signs, and other forms of communication about candidates and their beliefs.

Alabama has specific rules governing what information can and cannot be communicated to voters. These rules are outlined in the Fair Campaign Practices Act (FCPA), which regulates everything from electioneering communications to campaign finance.

However, the Act has a significant gap that Alabama legislators need to address: the use of artificial intelligence in political advertising. The technology is a growing concern among tech-savvy consultants, political operatives, and others in the political world. It’s time for the legislature to amend the FCPA to address this issue.

Many of us have seen images or videos created or altered by artificial intelligence, particularly “deepfakes.” These are videos or recordings in which a person’s face, body, or voice is digitally manipulated to make it appear as if they said or did something they never did. Deepfakes are often used maliciously or to spread false information, and they are becoming increasingly common in our daily digital lives.

The potential for deepfakes to distort public perception of candidates is a serious concern. The biggest problem is that deepfake videos and images can be nearly indistinguishable from reality, making it difficult for voters to trust what they see.

Last January, an AI-generated robocall mimicking President Joe Biden’s voice went out to thousands of New Hampshire voters, urging them to skip the state’s upcoming primary election. During last year’s Republican presidential primary, Gov. Ron DeSantis’s campaign also released viral AI-generated images of President Donald Trump hugging Dr. Anthony Fauci.

This issue is so important that the Federal Election Commission has begun addressing artificial intelligence in its rulemaking process, though it has paused that effort while awaiting congressional action. A bipartisan group of U.S. senators has also proposed two pieces of legislation to combat campaign deception: the “Protect Elections from Deceptive AI Act” and the “AI Transparency in Elections Act.”

Meanwhile, tech companies have voluntarily pledged to remove political deepfakes, but there are well-founded concerns about the biases and rules that social media companies may impose.

Until Congress acts, Alabama legislators should protect voters by regulating the use of artificial intelligence in political advertising. At a minimum, lawmakers should require clear disclaimers on AI-generated content and impose civil and criminal penalties for malicious deepfakes. By doing so, Alabama can safeguard its elections from fraudulent political advertising and ensure that voters are not misled or confused.

Nathaniel White is a campaign and policy professional, as well as a self-professed coffee connoisseur. He can be reached at Nathaniel@nwassc.com.

The views and opinions expressed here are those of the author and do not necessarily reflect the policy or position of 1819 News. To comment, please send an email with your name and contact information to Commentary@1819News.com.