California’s recently enacted laws to combat election deepfakes are facing a legal challenge in a lawsuit filed in Sacramento. The state, known for its tough stance on deepfakes, enacted three landmark proposals earlier this week. However, two of these laws are now being contested in court.
One of the laws, which allows individuals to sue for damages over election deepfakes, has already taken effect. The other requires large online platforms, such as X, to remove deceptive material starting next year. The lawsuit, filed by a person who created parody videos featuring altered audio of Vice President Kamala Harris, argues that these laws infringe on free speech and enable anyone to take legal action over content they dislike.
The governor’s office has defended the legislation, stating that it does not ban satire or parody but rather requires disclosure when AI has been used to alter videos or images. Izzy Gardon, a spokesperson for Governor Gavin Newsom, questioned the motive behind the conservative activist’s lawsuit, noting that similar disclosure laws exist in other states, including Alabama.
The lawsuit appears to be one of the first legal challenges to such legislation in the United States. Theodore Frank, the attorney representing the complainant, argues that the California laws are overly broad and intended to force social media companies to censor and harass individuals. Frank also indicated plans to file another lawsuit challenging similar laws in Minnesota.
Several states have proposed similar legislation in response to the growing threat of election disinformation amplified by AI technology. California’s laws, signed by Governor Newsom, aim to prevent the spread of deepfakes that could influence voting decisions or cast doubt on election integrity. The laws cover not only political candidates but also election workers and voting machines. They make it illegal to create and publish false election-related materials in the 120 days before Election Day and the 60 days after it, with violators facing civil penalties.
Critics, including free speech advocates and Elon Musk, argue that the new California laws are unconstitutional and violate the First Amendment. Musk, the owner of the social media platform X, shared an AI-generated video featuring altered audio of Kamala Harris, questioning the legality of the parody video under the new legislation.
While the effectiveness of these laws in curbing election deepfakes remains uncertain, Ilana Beller of Public Citizen, a nonprofit consumer advocacy organization, suggests that having such laws in place could serve as a deterrent. However, the slow pace of the courts relative to the rapid spread of fake content poses a challenge. Beller emphasizes the importance of swift action to minimize the reach and impact of deepfakes.
In addition to the laws already in effect, Governor Newsom signed another measure requiring campaigns to disclose AI-generated materials; it takes effect next year, after the 2024 election.