4 Ways AI Can Be Used and Abused in the 2024 Election: From Deepfakes to Foreign Interference

Artificial intelligence (AI) has become a source of public concern heading into the 2024 election, driven by worries about its potential to generate and spread false information. A closer look at AI's role in the current election cycle, however, reveals that most uses of the technology are not entirely novel. While AI can be used to manipulate voters and disseminate lies at scale, its current applications largely amount to updated versions of familiar election activities.

One significant development is the 2022 launch of ChatGPT, which brought generative AI into the public consciousness. The technology generates text responses to user prompts, which means it can field questions about the 2024 election: voting information, candidates, and other election-related topics. It can also produce misinformation, however, and some AI-generated responses have proved inaccurate or incomplete. Users should verify the results of AI searches and exercise caution before relying solely on AI-generated information.

Deepfakes, another application of AI, are highly convincing fabricated images, audio, and video that mimic reality and can be used to deceive voters or manipulate public opinion. While deepfakes have not overwhelmed the ads voters are seeing in the 2024 election, candidates across the political spectrum have used them for various purposes, including deception. Former President Donald Trump even invoked deepfakes when he questioned the size of the crowds at Vice President Kamala Harris' campaign events. Such allegations aim to discredit opponents by sowing doubt about the authenticity of truthful content, a tactic employed in previous elections as well.

Concerns have also been raised about the potential use of AI by election deniers to distract election administrators through frivolous public records requests. While there is currently no evidence of this happening, the efficiency of AI could amplify such challenges, diverting resources from critical tasks, disenfranchising legitimate voters, and disrupting the election process.

Foreign interference in U.S. politics, exemplified by confirmed Russian meddling in the 2016 election, remains a pressing concern, and AI could enhance these efforts, as seen in Russian actors' use of social media bot farms to sway public opinion. There is also evidence that China has used AI to spread malicious information about the U.S., including an inaccurate transcription of a Biden speech that made it appear to contain sexual references. Still, foreign meddling in U.S. politics predates AI, as history shows: British intelligence officers, for example, worked to discredit isolationist candidates in the 1940s.

Efforts to regulate the use of AI in electoral politics face obstacles; federal regulation confronts the same uphill battle as other proposals to regulate political campaigns. Some states have acted, with 19 currently banning or restricting deepfakes in political campaigns. Certain platforms self-moderate: Google's Gemini, for example, refrains from answering questions about elections and political figures. Campaign professionals have also voiced concerns about pushback from voters who discover that a campaign is using AI.

While public concern over AI's role in elections can serve as a guardrail against potential abuses, it can also further erode trust in the electoral process. It is worth remembering that AI is not necessary for foreign meddling or the dissemination of false information, as history demonstrates. Still, bad actors can leverage the efficiencies AI offers to pose significant challenges to election operations and integrity.
