AI Demands Attention From Corporate Boards to Avoid SEC Scrutiny

Jan. 6, 2026, 9:30 AM UTC

Artificial intelligence has quickly outgrown its origins as a fun IT experiment. It now sits inside the operations, financial processes, marketing, risk models, and customer-facing systems of most companies. That also puts it squarely within boards of directors' fiduciary and oversight obligations.

That should set off a quiet alarm for boards that are paying attention: Sooner or later, the Securities and Exchange Commission will want to understand how companies are governing and describing their use of AI.

Boards don’t need to become experts in machine learning. But they do need to demonstrate informed, structured oversight of a technology that can accelerate both value creation and impairment.

Familiar Pattern

Every few years brings a shiny new source of systemic risk, and public companies have a well-worn habit of falling behind the curve. The sequence is almost predictable: The technology gains traction, and investors push for transparency.

Companies then get ahead of themselves in published statements while plaintiffs' lawyers and the SEC begin testing and challenging those statements. As a result, corporate disclosures and governance structures get rebuilt under pressure.

We’ve seen this cycle play out with Y2K, perks, cybersecurity, Covid-19, special purpose acquisition companies, climate, crypto, and environmental, social, and governance. AI presents another turn in that cycle, but boards have the opportunity to break it if they act before the scrutiny arrives.

Practical Oversight Framework

Effective AI governance doesn’t require directors to decode neural networks. It does require three things: literacy, strategy, and structure.

Boards can’t oversee something if they don’t understand its fundamentals. They don’t need technical mastery, but they do need context.

Directors should ask a number of questions, including:

  • What education or briefings will help us understand AI’s capabilities, limits, and risks? Should we hear from internal teams, outside experts, or both?
  • Should experience with AI and emerging technologies be a factor, or even a priority, in board refreshment and management succession planning?

AI should also be treated as a strategic imperative. Boards need to understand how AI is embedded in the organization and management’s plans to expand its use. For example:

  • They need to know what safeguards govern the organization’s use of generative AI, especially around the use of confidential data or the completion of commercially sensitive tasks.
  • They need to ask how the company is measuring return on AI initiatives and whether external messaging to customers, employees, and investors accurately reflects the company’s actual capabilities.
  • They must weigh whether the adoption of AI by the company’s competitors puts pressure on the company’s business model.

Establish Clear Governance

AI oversight shouldn’t occur in a vacuum or arrive on an agenda in reaction to an incident. Boards should define in advance whether the board or management has primary oversight of AI.

If the board has primary AI oversight, it should determine whether that responsibility will be delegated to a committee and, if so, to which one.

They must also decide which AI matters should be elevated to the full board. Other issues to consider include management’s reporting and the type of information the board needs to monitor AI-related performance, risks, and incidents.

How AI oversight connects to enterprise risk management, internal audit, and the disclosure committee also must be considered.

Influencing Disclosure

The materiality framework of securities disclosure laws means AI doesn’t need to wait for a specific SEC rule to become a disclosure issue. Depending on the importance of AI to your organization, it will seep into existing frameworks the same way cybersecurity and climate did.

Good governance facilitates compliant disclosure. Companies that understand their AI footprint will be able to speak about it candidly and credibly, and avoid the hype-to-litigation pipeline.

Key pressure points include:

Management discussion and analysis. Does AI materially alter demand drivers or cost structures? Does it change how the company competes?

Risk factors. Does AI create new third-party dependencies or operational vulnerabilities? Are current cyber protections sufficient for AI-enabled threat vectors?

Internal controls. Do AI models influence financial inputs, value assumptions, audit workstreams, or forecasting tools? What testing and validation ensures those outputs are reliable?

The Oversight Imperative

Regulators and investors will be watching how companies “talk AI,” and whether that narrative aligns with reality. AI is moving quickly enough that gaps between aspiration and execution can form in weeks or months, not years. Those gaps will be the subject of enforcement actions and shareholder suits.

Boards that engage early will strengthen credibility with investors, reinforce resilience in their operating models, and be ready when the SEC inevitably sharpens its expectations. Boards that don’t will find themselves trapped in the familiar post-hoc cycle: comment letters, restatements, and uncomfortable and costly litigation.

AI is no longer an abstract topic for future agendas. It’s a live governance test—and boards that approach it with clarity, discipline, and a healthy dose of curiosity will shape how public companies navigate the next wave of technological change.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.

Author Information

Frank Esposito is a partner with Squire Patton Boggs who advises boards of directors and in-house legal teams on public company governance, securities and strategic matters.

Bryn McWhorter is an associate with Squire Patton Boggs who advises public and private companies on corporate governance and securities, investment funds and mergers and acquisitions.


To contact the editors responsible for this story: Bennett Roth at broth@bgov.com; Jada Chin at jchin@bloombergindustry.com
