Not long ago, most legal teams treated artificial intelligence like a science experiment that was too risky to put into production. That mindset has flipped, with a recent Gartner report revealing that leveraging generative AI is a top priority for nearly 60% of legal leaders this year.
Legal teams are now going beyond just piloting AI and are building it into core workflows. Even large, historically cautious firms are launching formal AI agendas and using tools such as Harvey and Legora across their practices.
The conversation has shifted from “Should we use AI?” to “Which AI tools belong in our stack and how fast can we scale them?” Legal work often involves translating complex legal language into clear, actionable guidance. Whether it’s explaining a contract clause to a salesperson, summarizing regulatory risk for an executive, or preparing client-facing updates, generative AI can do in seconds what once took hours.
Those use cases help explain why adoption has skyrocketed. But they also highlight a bigger issue: many AI tools stop at surface-level tasks instead of tackling the deeper workflow pinch points where legal teams actually need relief.
Here is why so much legal tech misses the mark.
They solve the wrong problem. Many new offerings share a common approach: bolt a chat interface on top of documents, policies, or contract repositories and market it as “AI-powered legal tech.” It looks great in a demo where you upload a contract, type a question, and get an instant answer.
The problem: Getting a simple answer isn’t the same as solving a problem.
If you ask a chatbot, “Who needs to approve this contract?” and it replies, “Dave in Accounting,” you still need to track down Dave and secure the approval yourself. This doesn’t make your process faster or more efficient.
This is a common pattern: Tools promise to “automate legal work” but only make it easier to ask questions about documents, without driving actual business outcomes.
They don’t integrate where the work happens. Many point tools live in their own silo. If your sales team works in Salesforce and your legal team lives in Microsoft 365, a standalone contract bot creates friction rather than removing it. If your tool doesn’t fit naturally into existing workflows, you’re going to quickly find that you’ve invested in tech that no one wants to use.
They lack legal nuance and governance. Many of these solutions rely on generic large language models. That creates risk. What if privilege is waived, data security is compromised, or the output is simply wrong? In legal, accuracy and governance aren’t “nice to haves”; they are an essential part of the work.
These limitations explain why so many tools deliver exciting demos but disappoint in production.
Sorting the wheat from the chaff. The best way to find solutions that actually work for your business is to get crystal clear on what you want AI to do. You must focus on outcomes rather than specific features.
Are you trying to speed up contract turnaround times so deals close faster? Eliminate low-value legal review work like routine NDAs? Give business users the ability to answer their own contract questions without looping in the legal team? Or maybe you want to reduce risk by making sure playbook positions are automatically applied and tracked.
When you know the specific outcome, it’s much easier to see which tools are worth your time. I’ve seen this first-hand: one client set just two criteria—the solution had to live inside Salesforce and handle contract redlining. Those two requirements alone instantly narrowed the field by 98%.
From there, ask three simple questions of any potential tool:
- Does it move us closer to the outcome we’ve defined?
- Does it integrate into the systems where we already work?
- Is the output accurate and secure enough for legal work?
If a vendor can’t clearly answer these, they’re probably not going to solve your problem.
What’s Next?
The coming months will separate hype from substance. Every legal team will have access to AI in some form; the differentiator won’t be whether you use it, but how well it’s aligned to your actual business goals.
Generative AI has opened up so many ways to be more efficient, but we have to know exactly what problems we’re solving. Leaders who focus on outcomes first, then pick the right tech to match, will cut through the noise and unlock AI’s real potential.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.
Author Information
Tom Dunlop is founder and CEO of Summize, a contract lifecycle management company with headquarters in Boston and Manchester, England.