Deploying AI: Considerations for Companies Before Transformation

Jan. 29, 2026, 9:29 AM UTC

Innovation is the future of compliance. The challenge today isn’t whether data, analytics, and artificial intelligence should be adopted, but how.

It’s been more than a year since the US Department of Justice released its 2024 Evaluation of Corporate Compliance Programs guidance, and compliance professionals may still be unsure how to proceed. They should start assessing how to invest in AI and analytics to align with DOJ expectations.

Effective Implementation Ideas

The guidance sets forth three fundamental questions for the evaluation of compliance programs:

  1. “Is the corporation’s compliance program well designed?
  2. Is the program being applied earnestly and in good faith? In other words, is the program adequately resourced and empowered to function effectively?
  3. Does the corporation’s compliance program work in practice?”

From a practical perspective, the question of resources—the ability of compliance programs to use analytics, data, and AI—often must be addressed first. DOJ used the question to highlight the importance of those subjects. Going forward, DOJ explained, it would measure appropriate resourcing in part by evaluating a program’s access to data and use of technology compared with the commercial organization.

By explicitly including this comparison as part of the larger compliance program resourcing analysis, DOJ elevated the use of analytics, data, and AI within programs, solidifying the connection between adopting these solutions and meeting DOJ expectations. But where should a program put its resources to ensure the program is “designed for maximum effectiveness”?

The monitoring space provides a great opportunity, since monitoring for non-compliance or control gaps has always been a key program element. Advanced tools allow for monitoring of transactions or enterprise messaging data in ways that previously were cost-prohibitive. As such, many vendors have developed solutions in this space. For example, DLA has developed a proprietary AI approach that deploys customized models at scale, enabling organizations to surface a substantial majority of documents identifying potential compliance issues from a very small fraction of an overall enterprise messaging data set, with the resulting outputs reviewed by counsel under attorney-client privilege.

Another area where investment could increase effectiveness is the use of chatbots to provide policy information to employees. Business-facing compliance colleagues often spend time answering routine questions. Chatbots can answer basic questions, freeing those colleagues to focus on more value-added work.

Another example is program oversight. Senior management or board members can sometimes find it difficult to conceptualize the risks and mitigation efforts of compliance organizations. Creating visualizations of measurable goals for the program based on internal company data can help solve this issue; scorecards based on advanced analytics are becoming more prevalent across programs.

Visualizations are customizable and can include specific key performance indicators related to a risk area or paint a broader picture with general data, drawing on metrics or monitoring results, depending on the program’s goals. The ability to use analytics to measure and visualize key metrics helps board members and senior management, who are accountable for overseeing compliance, gain a better understanding of whether day-to-day program activities are meeting goals and are therefore ultimately effective.

Finally, a program’s ability to use data, analytics, and AI can answer the third question in DOJ’s guidance about whether the corporation’s compliance program works in practice. The guidance and other commentary suggest that the most straightforward way to show this is by detecting inappropriate conduct. But failing to do so in a specific instance is not fatal, as DOJ considers “how the company has leveraged its data to gain insights into the effectiveness of its compliance program” and “whether the program evolved over time to address existing and changing compliance risks.” A company’s ability to show evolution over time based on insights it has gained will support a conclusion that the program works in practice.

In many large organizations, the only way to test that the program is working in practice is to look across the company’s data using analytics and possibly AI. Implementing scalable, innovative monitoring solutions while working with vendors who understand compliance and the risk areas at issue can help programs identify opportunities to refine policies, training, and controls, demonstrating the high value of investing in monitoring tools.

The DOJ guidance requires programs to consider their access to data and investment in analytics and AI capabilities. How a program invests in analytics and AI will determine whether the program continues to meet DOJ’s requirements. The areas of monitoring, policy, and oversight provide opportunities.

However, there are instances when technology partners over-promise but under-deliver because of a lack of understanding of compliance programs or underlying risks. Compliance professionals should consider whether a vendor understands the technology, compliance programs, and the risks that the organization is facing.

By investing in internal data science capabilities, firms can deliver privileged, proactive compliance monitoring solutions informed by compliance program requirements and risk priorities. Using analytics and AI can allow companies to show program effectiveness and help satisfy DOJ requirements—but only if they are successful in implementing those changes.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.

Author Information

Jeffrey Scott is of counsel in DLA Piper’s Philadelphia office and a seasoned investigations and compliance attorney.

Talia Portocarrero is managing director of DLA Piper’s data interrogation team within the firm’s AI & data analytics practice, based in Washington, DC.

Stephanie Gumabon-Greaver is an associate in DLA Piper’s Philadelphia office and focuses her practice on litigation and regulatory matters.

To contact the editors responsible for this story: Jada Chin at jchin@bloombergindustry.com; Jessica Estepa at jestepa@bloombergindustry.com
