Questions about adopting generative AI in the workplace have moved quickly from the hypothetical to the “how.”
Companies want to determine appropriate standards and guardrails for using the technology, avoid legal risks, and capitalize on AI’s benefits. Nearly two years after ChatGPT kicked off the frenzy, the challenges have become much clearer.
“A lot of people are trying to figure out how to go from principles to practice,” said Evi Fuelle, global policy director at Credo AI, an AI governance platform.
These questions are top of mind for leadership in corporate legal departments.
1. How do I comply with AI laws?
For many companies, Step 1 is figuring out how to comply with the European Union’s sweeping Artificial Intelligence Act—the most fully formed AI law in force today.
Many companies are applying the EU requirements for high-risk systems across the board, said Philip Dawson, head of global policy at Armilla AI.
“They may want to have one process across their company,” he said. Broadly applying stricter rules simplifies the tracking of AI system risk classifications, which can change according to how the systems are used.
Additionally, a company may approach regulatory compliance differently if it’s using third-party tools or developing its own, said Shelley McKinley, chief legal officer at GitHub.
“Using AI tools is often just a matter of having a sensible, easy-to-understand internal use policy,” she said. But for companies developing AI tools, she recommended investing in a subject-matter expert to track regulatory requirements and adapt company data governance policies as needed.
2. Should I commit to voluntary standards?
Some companies are looking to voluntary standards and certifications to guide them before mandatory rules come into play.
Ryan Donnelly, co-founder of the AI governance platform Enzai, said he’s often asked about the International Organization for Standardization’s ISO 42001, which outlines how to set up internal policies and procedures to best manage AI risks.
Craig Shank, a consultant on AI ethics and formerly VP of corporate, external and legal affairs at Microsoft, said ISO certification could be even more valuable if the biggest customers of AI begin demanding it.
“I think it’s as protective as just about anything you can do from a liability perspective, if for no other reason than it makes it much faster to unwind a problem if you’ve created a mess,” he said. Even if a company doesn’t obtain the full ISO certification, he said, it’s “a useful grounding point to say, ‘We’re making deliberate decisions,’” including documenting how a decision about AI was made.
3. How do I vet my vendors?
At GitHub, vetting AI tools starts with the same questions the company would ask about any other tool, looking at issues like, “vendor reputations, data handling, security and privacy protections,” McKinley said.
“In a lot of respects we approach these tools like we’ve long approached any third-party tools we use: Know your vendor, their reputation, and the assurances and transparency they provide about their offerings.”
Contracts are an important tool to protect companies from legal risks involving third-party AI vendors, but shouldn’t be the only one.
“If contracting around risk is your only method of risk mitigation, you’re in trouble,” said Devika Kornbacher, co-chair of the global tech group at Clifford Chance.
4. How do I get my board on board?
How general counsel talk to their boards depends in part on how much AI knowledge members have, and many boards won’t have a member who’s an AI expert, said Rob Chesnut, Bloomberg Law columnist and former general counsel and chief ethics officer at Airbnb.
“For me, as a GC, I’m looking at my matrix,” or how the roster of corporate board members aligns with key areas, Chesnut said. “I’m deciding whether AI is, in fact, a fundamental quality of expertise that we need on the board.”
Salesforce chief legal officer Sabastian Niles said boards would do well to devote a committee specifically to tech issues.
For the largest multinational companies and others that need to grapple with this broad set of issues, Niles said, “You’re going to have to take a fresh look at how you want to organize the work of your board.”
5. How can AI make my job easier?
Of course, AI isn’t just a potential risk. It’s supposed to make in-house legal teams more efficient.
Microsoft’s legal department is using its own and third-party tools for transactional processes, legal advisory services, and regulatory compliance and risk management, said Hossein Nowbar, the company’s chief legal officer.
The department assesses tools based on “effectiveness, including the tool’s ability to drive cost savings, improve speed and accuracy of legal tasks, and enhance overall productivity,” Nowbar said.
However, there are no commonly accepted, standardized metrics for evaluating legal AI tools on qualities like accuracy and time saved.
Chris Audet, chief of research in Gartner’s assurance practice, advises clients seeking a boost from large language models to identify and target workflows where productivity gains of more than 10% are possible.
6. Where does liability for AI harms lie?
When an AI system goes awry and someone gets hurt, who’s responsible? The company that made the technology, or the one that used it?
The answer is murky because so many hands go into making an AI product that it isn’t clear how much influence each party has on the eventual outcome.
However, courts, legislators, and government agencies are beginning to chip away at tricky questions about where liability lies when AI leads to harms.
“The default for a long time is going to be, ‘Buyer beware—caveat emptor,’” said Thomas Magnani, a partner at Arnold & Porter. The customer understands the risks of using the tool and is using it to assist their work, not to replace their own judgment, he added.
Read the full version of this story in a deep dive report for in-house counsel: Artificial Intelligence: The Next Phase