- Akerman attorneys review compliance guidance on AI technology
- Businesses must evaluate risk assessments, governance frameworks
The US Department of Justice last month unveiled revisions to its Evaluation of Corporate Compliance Programs to account for, among other things, recent developments in companies’ use of artificial intelligence technologies.
The revised text of the ECCP sets forth questions and considerations that the DOJ will raise in any criminal investigation into a company’s use of AI that leads to compliance failures or failure to detect criminal activity.
Per the revised ECCP, businesses that use AI should consider and document these questions in a risk assessment:
- How does the company assess the potential impact of new technologies, such as AI, on its ability to comply with criminal laws?
- Is management of risks related to use of AI and other new technologies integrated into broader enterprise risk management strategies?
- What is the company’s approach to governance regarding the use of new technologies such as AI in its commercial business and in its compliance program?
- To the extent that the company uses AI and similar technologies in its business or as part of its compliance program, are controls in place to monitor and ensure its trustworthiness, reliability, and use in compliance with applicable law and the company’s code of conduct?
- Do controls exist to ensure that the technology is used only for its intended purposes?
- What baseline of human decision-making is used to assess AI?
- How is accountability over use of AI monitored and enforced?
- How does the company train its employees on the use of emerging technologies such as AI?
- If the company is using new technologies such as AI in its commercial operations or compliance program, is the company monitoring and testing the technologies so that it can evaluate whether they are functioning as intended and consistent with the company’s code of conduct?
- How quickly can the company detect and correct decisions made by AI or other new technologies that are inconsistent with the company’s values?
These considerations show that the DOJ expects, among other things, human accountability, controls, testing, and periodic evaluation of AI tools. Notably, these revisions sit within the larger framework of the ECCP, which establishes that companies should perform risk assessments, craft policies and procedures that respond to the risk assessment, communicate those policies to employees, report and investigate compliance failures and misconduct, and apply the same compliance controls to third-party vendors.
The ECCP also calls on companies to conduct sufficient due diligence in any merger or acquisition to uncover and remediate compliance failures in a target company. Thus, for example, a company with a robust AI compliance program isn’t free from harsh criminal penalties if it fails to impose the same program on a company it has acquired.
Though AI offers companies significant cost savings and opportunities, companies will need to invest in comprehensive compliance and oversight of AI tools. This investment should be proportionate to the risks introduced by the AI tool—the ECCP asks prosecutors to question, “Is there an imbalance between the technology and resources used by the company to identify and capture market opportunities and the technology and resources used to detect and mitigate risks?”
Principal Deputy Assistant Attorney General Nicole M. Argentieri, head of the Justice Department’s Criminal Division, has stressed that data analytics and new technologies available to businesses should also be leveraged by those businesses to improve compliance, not merely to save costs or increase profits.
Like the prior version, the updated ECCP asks that prosecutors focus on the conduct and responsibility of senior company leaders—specifically whether they promote and foster a culture of robust compliance.
Given that AI’s capabilities and risks are rapidly changing, it’s difficult for companies to predict criminal risk and exposure they may face by using AI in their daily operations. Although even a well-designed and thorough compliance program might fail to uncover and remediate compliance failures or criminal misconduct, businesses should perform comprehensive risk assessments before implementing an AI tool, periodically while it’s being used, and upon any material changes.
Companies also must have a concrete plan to investigate, remediate, and potentially report any criminal activity connected to use of AI with the assistance of outside counsel.
The DOJ considerations overlap significantly with the AI framework requirements in existing domestic and international laws (such as existing comprehensive privacy laws, the Colorado AI Act and the European Union AI Act), as well as guidance in sector-specific model acts, such as those issued by the National Association of Insurance Commissioners and the New York Department of Financial Services.
While we are seeing guidance and legislation emerge in different sectors, the common theme is a requirement that companies perform a comprehensive risk assessment and implement a governance framework that identifies, mitigates, and monitors AI risks through the entire AI lifecycle.
Key Takeaways
Flexibility. The substance of the updated ECCP’s requirements largely remains the same. Rather than imposing completely new obligations, the revised ECCP includes more detail on how existing requirements, policies, and procedures can be adapted to address emerging AI issues. This is an opportunity for companies to leverage existing policies, procedures, and processes and adapt them where necessary to address AI.
Accountability. Companies must be mindful of the ECCP’s overarching purpose: It details what prosecutors should consider—and therefore what companies should ensure they can demonstrate—in determining whether the company’s corporate compliance program was effective at the time of an offense. Even a company that follows all ECCP guidance must be able to demonstrate that compliance during an investigation and in its responses to government inquiries about its compliance program.
Ongoing Obligations. The ECCP updates highlight the risks presented by emerging technologies and the need for ongoing monitoring and updates. Adjustments will be needed in many areas, such as risk assessments, resource allocation, and ongoing evaluation of new technologies. An effective compliance program must regularly consider how emerging technologies will be addressed within the program.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law and Bloomberg Tax, or its owners.
Author Information
Sergio E. Acosta co-chairs Akerman’s white collar crime and government investigations practice.
Ildefonso P. Mas is a partner in Akerman’s white collar crime and government investigations practice.
Christy S. Hawkins is a partner in Akerman’s data privacy and security practice.
Jacqueline M. Arango contributed to this article.