The Bloomberg Law 2024 series previews the themes and topics that our legal analysts will be watching closely in 2024. Our Artificial Intelligence analyses explore the most compelling challenges that generative AI will bring to legal professionals in the year ahead.
If 2023 was the year parents needed to talk with their school-aged kids about using artificial intelligence appropriately, then 2024 will be the year the SEC has a similar talking-to with investment advisers regarding their ongoing use of the technology.
AI techniques such as machine learning and deep learning have been around in one form or another in the investment management industry for years, and they are almost inextricably interwoven into many operational and investment decision-making workflows. So the SEC’s “talk” with investment advisers will be more like a long-overdue, yet notably indirect, intervention, perhaps predicated on SEC Chair Gary Gensler’s stated belief that the regulator needs to find ways to control AI use within the industry before it catalyzes the next big financial meltdown.
One of the SEC’s strategies for addressing AI is to create rules that lay out expectations for investment advisers as they engage in certain practices or deal with outside products and professionals. Consequently, when such engagement leverages any technology, especially AI, an investment adviser will assume some responsibility. This is particularly important as outsourcing to third parties gains prominence in the industry.
It is through this lens that I’ve identified at least six rule proposals that, if adopted, will introduce new oversight obligations for investment advisers that use or benefit from AI, even though most of the proposals seldom, if ever, mention the words “artificial intelligence,” “AI,” or a related term.
Rule Proposals That Address AI Use Directly
Until recently, the SEC hadn’t directly addressed investment advisers with rules specifically governing their use of AI.
In a third-party outsourcing rule proposal calling for more accountability for investment advisers that outsource certain functions to a vendor, the SEC in October 2022 subtly signaled its interest in holding investment advisers responsible for conducting initial due diligence and ongoing oversight of vendors’ use of AI to perform certain functions.
The conflict-of-interest rule proposed in July 2023 more squarely addressed AI use and would require investment advisers to eliminate or neutralize any conflicts of interest resulting from predictive data analytics, AI, and other “covered technology” (a term broadly defined in the proposal).
Rule Proposals That Address AI Use Indirectly
The following rule proposals don’t target AI specifically, but practitioners should take note of how they may indirectly reach an investment adviser’s use of AI.
Proposed in February 2023, the safeguarding client assets rule would modernize the “custody rule” that governs investment advisers’ safeguarding practices over client funds and securities. The current rule requires an investment adviser to entrust its client’s funds to a custodian, resulting in the client and the custodian entering into a custody agreement, with the investment adviser taking on limited oversight.
The proposal, by contrast, would expand the circumstances in which the investment adviser would be considered to maintain possession of client funds and securities (though the proposed rule replaces “funds and securities” with “assets”). The investment adviser would also be required to oversee the custodian through a separate agreement and to ensure the custodian complies with the custody agreement.
The investment adviser’s contract with the custodian would, at a minimum, make the investment adviser responsible for understanding how the custodian monitors client assets, including any crypto assets. Many such monitoring tools employ AI. Further, if a client’s assets are misplaced or impaired, the harmed client, or the SEC, could point to this contract when collecting evidence that the investment adviser failed in its fiduciary duty of care.
In May 2022, the SEC proposed an ESG disclosure rule that would set a disclosure framework for investment advisers and funds that identify as pursuing an environmental, social, and governance (ESG) strategy. The rule would require, via a prospectus and annual reporting, an investment adviser to provide an overview of how it incorporates ESG factors into its process for evaluating, selecting, or even excluding investments.
Obtaining, categorizing, and measuring much of the sought-after ESG data is generally difficult, and AI is helping to close that gap. Many investment advisers rely on third-party data to make informed ESG-related investment decisions, and those third parties use AI to scan millions of pages from various sources to compile retail-consumable reports. Even an investment adviser that uses an internal methodology must be mindful of whether its process relies on an AI-powered tool at any step. Either way, an adviser or fund would need to understand how the AI is used and how it pulls and sorts data from various sources before making the requisite disclosures.
Two other SEC-proposed rules have unspoken but logical connections to AI oversight for investment advisers. Both involve overseeing functions that increasingly rely on machine learning to be carried out effectively.
A cybersecurity risk management proposal, published in February 2022, would require funds or advisers to adopt and implement written policies and procedures to address cybersecurity risks and would create new reporting, disclosure, and record-keeping obligations. AI can make an enterprise more susceptible to cyber threats, as noted in an executive order issued last month focused on responsible AI use. And interestingly, standard cybersecurity tools are themselves AI-backed. If adopted, the rule would create a unique responsibility for investment advisers to manage both the cybersecurity vulnerabilities resulting from firmwide AI use and the AI that powers the cybersecurity tools themselves.
Similarly, in March 2023, a proposal updating Regulation S-P would require funds or advisers to take measures to, among other things, detect unauthorized access to customer information. Data breach detection tools often rely on AI, and investment advisers would be responsible for ensuring that such a tool satisfies the adviser’s obligations under the proposed rule.
The SEC’s Iron Hook
The legal basis for these proposed rules won’t surprise most practitioners. It is the investment adviser’s fiduciary duty to always act in the best interest of its clients.
Presumably, the proposed rules preemptively illuminate the SEC’s posture toward the affirmative steps it believes investment advisers should take regarding their use of AI. Consequently, whether or not a rule directly targets AI, the SEC and harmed clients can still pursue legal remedies if they can show an investment adviser did not fulfill its fiduciary duty under the Investment Advisers Act of 1940. An investment adviser would be wise to consider its responsibility to proactively follow best practices when using any emerging technology.
A final fun fact: since the SEC began publishing its annual Examination Priorities in 2013, the document, which details what the agency’s compliance exam teams will focus on for the year, had never mentioned the words “artificial intelligence.” But that changed this year. The just-released 2024 Examination Priorities specifically states that the SEC has established a specialized team to, in part, address emerging issues associated with AI. Whether such a change is long overdue, just in time, or unnecessarily sparked by the recent hysteria around the pitfalls of generative AI, the SEC will have its sights set on AI in investment management in 2024.
Access additional analyses from our Bloomberg Law 2024 series here, covering trends in Litigation, Transactions & Contracts, Artificial Intelligence, Regulatory & Compliance, and the Practice of Law.
Bloomberg Law subscribers can find related content in our AI Legal Issues Toolkit resource.
If you’re reading this on the Bloomberg Terminal, please run BLAW OUT <GO> in order to access the hyperlinked content, or click here to view the web version of this article.
To contact the editor responsible for this story: Robert Combs at rcombs@bloomberglaw.com