The SEC’s Plan to Regulate AI on Wall Street Is Already Outdated

Feb. 23, 2024, 9:30 AM UTC

We’re experiencing a societal transformation from artificial intelligence that will dwarf the advent of the printed word, automation, and the internet.

But instead of allowing actual experience to dictate where and how financial market supervision will change, the Securities and Exchange Commission is poised to pre-emptively adopt a rule dictating how investment advisers and broker-dealers may use AI technologies.

While styled as a consumer protection measure, the Conflicts of Interest Associated with the Use of Predictive Data Analytics by Broker-Dealers and Investment Advisers proposal would hinder the adoption of advanced technology by imposing the specter of retroactive, subjectively applied liability on investment advisers and broker-dealers.

The dangers of this proposal are so great that it attracted attention in Congress, with Sens. Ted Cruz (R-Tex.) and Bill Hagerty (R-Tenn.) introducing legislation to bar its adoption.

In structure, the PDA proposal embodies a classic bureaucratic land grab:

  • First, it defines the covered technologies, termed predictive data analytics, in the broadest of terms. Machine learning and generative AI tools are included, of course, but so are Excel spreadsheets with formula-embedded cells, which is 1980s-era technology.
  • It then requires investment advisers to inventory all uses of covered technology—an impossible task for any adviser, and one that will require massive resources to even attempt.
  • Each adviser then is required to assess which uses of covered technology potentially give rise to conflicts of interest. While that may seem appropriate and limited, a quick read of any offering document demonstrates how financial industry lawyers can find conflicts of interest in every aspect of an advisory or brokerage relationship.

The final element is a novel requirement that advisers and brokers using AI, machine learning or other predictive data analytics “eliminate” all identified conflicts. Elimination is an untested standard without objective criteria that would overrule decades of precedent under the Investment Advisers Act and wreak similar havoc in brokerage arrangements.

Without any serious limiting principles, any issue at any investment adviser or broker-dealer that can be linked to any use of covered technology implicates a substantive penalty cloaked in procedural failure allegations. It would be the rare prosecutor who couldn’t find an unidentified conflict or an identified one that was not “eliminated.”

Ironically, the proposal’s fatal flaw is that the dramatic rate of change in AI technology already has made it obsolete. Since its publication just a few months ago, we’ve gone from requesting haikus from generative AI chat websites to integrated functions in our everyday applications. Machines now produce initial drafts of prose and code on par with junior analysts, and machine-learning algorithms outperform humans at a growing list of tasks.

Sadly, demonstrated and potential malicious uses of AI (ranging from AI-enhanced hacking attempts to deepfake pornography) are also increasing by the day. The proposal's concerns over issues such as data ownership, and its generic (and duplicative) requirement to ensure an undefined measure of cybersecurity, seem almost quaint by comparison.

The SEC’s approach stands in stark contrast to that of its sister agency, the Commodity Futures Trading Commission. In its Jan. 25 request for comment on using AI in regulated markets, the CFTC demonstrated curiosity and humility, asking the industry to describe current and expected use cases of AI, identify the related risks, and provide any other information that would promote sensible regulation.

A new age of market technology is upon us, and history will reflect kindly on regulators that embrace the change, invest the effort to understand it, and present limited and thoughtful reforms when and as warranted. This SEC has an historic opportunity to be that kind of regulator, and the first step on that path is to withdraw the PDA proposal and engage with the industry in a joint education effort.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.

Author Information

Brian T. Daly is a partner at Akin and has represented more than 100 clients before the SEC.

To contact the editors responsible for this story: Rebecca Baker at rbaker@bloombergindustry.com; Jada Chin at jchin@bloombergindustry.com