Political Consultants Fight FCC Move to Restrict AI Robocalls

Oct. 15, 2024, 7:47 PM UTC

One of the nation’s largest organizations of political consultants is fighting a federal push to curb the use of AI in robocalls and robotexts, saying it would complicate automated fundraising appeals to voters.

The American Association of Political Consultants, a bipartisan organization, called the Federal Communications Commission’s proposal to require that callers disclose when they use artificial intelligence to generate phone messages “overly broad.”

The group said that while it “staunchly opposes the use of fraudulent and misleading robocalls and texts in political campaigns,” the proposed requirement that any AI-generated call be disclosed as such would inhibit efforts to increase voter turnout.

“In the time the disclosure is being read out, the voter may have already hung up the call,” the group said in a letter to the commission opposing the proposed rule.

The FCC’s comment deadline for the proposed rule was Oct. 10, and it’s seeking responses to those comments through Oct. 25 before making a final decision on the details of the policy. Under normal timing for federal rules, the proposal will not be in force before this Election Day, Nov. 5.

Seniors Especially at Risk

The proposed rule would apply broadly to AI-generated robocalls and robotexts—not just in political campaigns.

Consumer Reports and AARP backed the FCC’s proposal. AARP said the risks of AI-generated scams are especially pronounced for seniors, and that including a disclosure in a call would help mitigate them.

“This knowledge will heighten consumers’ awareness of potential fraud and enable them to have more control over the intended AI-generated transaction,” the AARP wrote in a letter to the FCC.

The FCC and Congress began restricting automated calls long before AI tools made it easier for scammers to fake voices. But a deceptive AI-generated call in early 2024 highlighted the political implications of that work. Democratic operative Steve Kramer tapped AI voice-cloning technology to fake President Joe Biden’s voice in calls urging New Hampshire primary voters not to cast their ballots.

Federal law already bans callers from using systems that automatically dial a list of phone numbers or send text messages without getting the called party’s prior consent. The FCC’s proposal would require that marketers get the recipient’s approval before sending them text messages that include AI-generated material and are distributed using autodialing technology.

Organizations that use algorithms to generate artificial voices or text messages and then transmit them to consumers would be required to comply. The FCC released the full proposal over the summer.

The rule would force e-commerce brands to reach out to all of their contacts and get permission to use AI-generated content in their messages, said David Carter, president of the Ecommerce Marketers Alliance, which represents more than 50 businesses that primarily sell to consumers online. Those companies often own lists of customer contact information, he said, but if the FCC’s proposal takes effect, they wouldn’t be able to use those databases to reach buyers with AI-generated messages without extra labor.

“It puts companies really in [a] no-win situation,” Carter said in an interview.

Microsoft Corp. also asked the commission not to regulate AI-generated text messages, adding that “consumers are unlikely to mistakenly assume they are communicating with a human” when they receive such a message.

Assistive Technology

In response to the proposal, companies told the FCC about AI-powered technology that helps people with disabilities communicate. The commissioners said in the plan that they are considering exempting such tools from the proposed requirements.

Apple Inc. wrote the FCC about an AI-powered iPhone feature that gives users the power to create a synthetic voice that sounds like their own. It’s designed to allow people at risk of losing their ability to talk, including people with ALS, to preserve the sound of their voice and use it to communicate. Other companies offer similar services.

InnoCaption, a service that captions phone calls in real time for Deaf and hard-of-hearing people, said the FCC should exclude any technology designed to help people with disabilities communicate by voice over the phone and do their jobs. Sales agents with disabilities often rely on AI or prerecorded voices to “conduct legitimate and legal marketing and sales calls,” the company said.

“The FCC should not effectively preclude such individuals from being hired for these roles going forward simply because of their use of assistive technologies,” InnoCaption wrote in a letter to the agency.

To contact the reporter on this story: Courtney Rozen in Washington at crozen@bgov.com

To contact the editors responsible for this story: Gregory Henderson at ghenderson@bloombergindustry.com; Cheryl Saenz at csaenz@bloombergindustry.com
