FCC Election Deepfake Ads Proposal Sparks Turf Fight With FEC

Aug. 8, 2024, 9:13 AM UTC

Proposed rules for disclosing when AI-generated content appears in radio and television political ads have spurred debate over whether the Federal Communications Commission is the best government agency to regulate on the topic.

The FCC’s proposed rules, released July 25, would require broadcast radio and television stations and cable operators to provide on-air announcements when content generated using artificial intelligence is used in political ads. Depending on its timeline, it could outpace Federal Election Commission efforts to regulate AI-created content in campaigns that have been under consideration since last year.

“The idea is that you—as the voter, as the viewer, as the listener—deserve to know,” FCC Chairwoman Jessica Rosenworcel told Bloomberg Law, likening the AI-content notification to how political ads must disclose who paid for them. Pointing to her agency’s longstanding authority over the airwaves, she added, “This is simple, it’s transparent, and it’s built on the foundation of something we’ve been doing for decades.”

Some regulators and advocates of AI regulation have praised the proposal advanced by the FCC’s Democratic majority as a positive step during a campaign cycle that’s already seen deepfakes—viral, hyper-realistic AI images—of politicians deployed. Others—among them the FCC’s two Republican commissioners and their FEC counterparts—say regulation regarding political disclosures should be the FEC’s purview.

While it’s unclear whether regulation can even be finalized before the November election, there’s also disagreement over whether the run-up to it is the best time to discuss AI guardrails. The debate comes as federal regulators are grappling with how to respond to the US Supreme Court’s decision in Loper Bright Enterprises v. Raimondo that limits the power of federal agencies to defend their rules in court.

People should be disturbed the FCC is “pushing ahead with its radical plan to change the rules on political ads mere weeks before the general election,” FEC Chairman Sean Cooksey, a Republican, said in an emailed statement.

The FCC’s notice was published in the Federal Register Aug. 5, giving the public until Sept. 4 to answer the FCC’s questions and to voice comments and concerns. Reply comments are due Sept. 19, meaning the proposal will still be up in the air by the time early voting begins in Virginia, Minnesota, and South Dakota on Sept. 20.

Generative AI’s use for disinformation does pose a concern for political campaigns and voters. Notably, a robocall in January faked a message from President Joe Biden discouraging voters before New Hampshire’s primary election. A May national survey published by Elon University found 78% of those polled expect AI abuses will affect the outcome of the presidential election, and 70% aren’t confident in most voters’ ability to detect altered or faked visual and audio content.

The FCC starting the rulemaking process will “help address a serious problem in the 2024 election, but it’s also going to increase public attention towards what the problem is—and that alone is worthwhile,” said Craig Holman, government affairs lobbyist for the consumer advocacy group Public Citizen.

In a separate move, the FCC on Wednesday voted to propose new restrictions on the use of AI-generated voice clones in phone calls to voters.

Jurisdictional Disputes

The FCC’s broadcast deepfakes proposal encroaches on FEC jurisdiction, Cooksey said in a statement calling for the FCC to abandon its effort.

Public Citizen petitioned the FEC in July 2023 to consider rules on deepfakes in campaign ads, pointing to existing statutory provisions that prohibit “fraudulent misrepresentation.” Commissioners had previously deadlocked in a vote on whether such regulations would be within their jurisdiction.

The FEC unanimously advanced the petition last August, for the first time soliciting public comment on political deepfakes. Action on the petition is expected to be resolved later this year, according to Cooksey’s office.

Both Republican FCC members shared the FEC chairman’s sentiment.

“Why risk stepping in front of an ongoing rulemaking of a sister agency, addressing squarely the same question, over that agency’s objection, relying on uncertain authority?” FCC Commissioner Nathan Simington said in his dissent.

Republican FCC Commissioner Brendan Carr pointed out in his dissent that the rules wouldn’t apply to online streaming or social media, where many see political ads.

Democratic FEC Vice Chair Ellen Weintraub, however, lent her support to the FCC’s initiative in a June 6 letter. No single agency has the authority to address the issue entirely, she wrote, and it would be beneficial for both to conduct rulemakings.

Daniel Weiner, director of the Brennan Center for Justice’s elections and government program, said overlapping jurisdictions are common.

“The idea that somehow, because the FEC regulates political communications, that no other agency has any role in doing so just isn’t consistent with long-established practice,” Weiner said.

Rosenworcel said the agencies’ authorities are complementary: The FEC can regulate AI use in online ads for federal candidates, while the FCC can oversee ads on TV and radio.

Leveraging the agency’s decades-old foundation in broadcast and cable regulation can “help it meet this moment,” she said.

The Road Ahead

With the presidential campaign already well underway, it’s unclear whether the FCC would, or could, finalize the rule before Election Day, given the proposal’s 30-day comment and 45-day reply periods and the time needed to revise and vote.

“There’s a lot of variables along the way,” Rosenworcel said. “But the one thing I think we can do right now is start a conversation, because the public deserves to know when this technology is being used, and transparency has a big role to play in democracy.”

Whether the FCC should enact the rule before votes are cast is an appropriate topic for the public to weigh in on, Weiner said. He said he’s “a bit skeptical” the disclosure rules would be difficult for campaigns to comply with if implemented.

The FCC could face a challenge of its authority given the US Supreme Court’s ruling in Loper Bright to scrap the decades-old Chevron doctrine empowering federal regulators, Weiner said. Carr’s dissent likewise said the FCC should show restraint in regulation to ensure it’s “operating within the statutorily defined bounds of its authority.”

While AI regulation will take a patchwork approach across various levels of government, Carnegie Mellon University machine learning and public policy professor Rayid Ghani said the FCC’s proposal is an effective step forward.

“It’s not enough,” he said, “but it’s a start.”

To contact the reporter on this story: Jorja Siemons in Washington at jsiemons@bloombergindustry.com

To contact the editors responsible for this story: Tonia Moore at tmoore@bloombergindustry.com; Adam M. Taylor at ataylor@bloombergindustry.com
