Over the last several months, I’ve met legal leaders from Europe, North America, Australia, and Asia who are deeply focused on advancing their teams’ artificial intelligence capabilities. They’re moving with urgency and intelligence, trying to build legal functions that are faster, smarter, and more resilient.
Yet despite their investment and intent, a familiar pattern keeps emerging: Pilots stall, adoption lags, and material impact remains elusive.
The root cause isn't functionality; it's fluency. And legal leaders are getting a few things wrong.
They think access equals readiness. According to Factor’s GenAI in Legal Benchmarking Report, 61% of legal departments provide AI access to most or all team members. Yet fewer than 19% feel very confident using those tools. One in five say they “really need help.”
This isn’t about installing software; it’s about building capability. Having access doesn’t mean lawyers know how or when to use it.
They design for the top 5%. Most programs are built around early adopters. But in every legal team, the majority are curious yet cautious.
Widespread adoption requires widespread upskilling. Legal teams need training that meets people where they are, builds confidence, and supports real fluency across the broader group.
They treat generative AI like traditional legal tech. Many teams approach GenAI pilots with expectations shaped by past systems—static, rules-based tools that either worked or didn’t. But GenAI is different because it’s more accessible and doesn’t require programming skills.
It’s also more demanding. It requires users to frame problems clearly, engage with the output, and guide the tool toward useful results.
When teams expect instant polish or one-click answers, they miss the point. GenAI tools evolve through interaction. Early drafts need context and refinement.
They expect performance without orchestration. Even the best AI systems need context, workflow design, and human input. Without orchestrating tools, people, and processes, impressive capability goes underused.
They run pilots without changing mindsets. Lawyers are trained to eliminate ambiguity. GenAI introduces it by offering possibilities, not certainties.
AI’s outputs depend heavily on context and often require interpretation. For many lawyers, this feels unfamiliar and uncomfortable.
Without cultural change, even the best pilots won’t scale. Success depends on helping legal teams understand how GenAI works, where its boundaries are, and how to confidently use and challenge its outputs.
Preparing Your Team
Unlike earlier waves of legal tech, GenAI tools can’t be reduced to simple training modules or system rollouts. Real adoption depends on building new skills, new instincts, and a new mindset across the team. Here’s where we’ve seen real progress.
Develop an AI mindset. Legal teams need a foundational understanding of the technology. That means helping lawyers learn to think alongside the tool, not just operate it.
That includes knowing what GenAI can and can’t do, where it fits into legal workflows, and how it shifts the way certain tasks can be approached. Legal teams must identify relevant use cases, understand when and how AI supports decision-making, and feel confident assessing its risks.
Teach interacting with AI as a core skill. AI fluency requires active engagement. Prompting is the primary means of interaction, and mastering it is as much a shift in mentality as a matter of technique.
You can’t get value from GenAI by learning a few commands or keystrokes. These tools require legal professionals to engage more creatively and intentionally.
To use GenAI effectively, legal teams often need to unlearn assumptions they’ve built from other technologies. These tools aren’t like Google, where a quick keyword search brings back a ranked list of links. Nor are they like the rigid systems lawyers may have encountered in legacy legal tech.
Structured prompting teaches lawyers how to brief AI the way they would brief a colleague: by clearly defining the task, setting boundaries, and providing the right inputs. Better prompts lead to better results, and this skill is one of the fastest ways to improve performance and build trust in the tools.
Train your team to stress test and verify AI outputs. Lawyers don't need to be engineers, but they do need to know when to trust AI and when to challenge it. Teach teams to interrogate outputs, check for and mitigate the risk of hallucinations, and flag weak assumptions.
Prioritize real integration into daily work. Focus on embedding AI into tasks teams already do: summarizing agreements, triaging intake, surfacing risk patterns. That’s where momentum builds.
Equip legal teams to think like builders. GenAI adoption in legal is less about delivering a system and more about building new ways to approach legal tasks. The legal teams making the most progress start by understanding how GenAI works, then go further by applying it with creativity, critical thinking, and real-world judgment.
Lawyers don’t need to code, but they must understand how AI fits into their workflows. Leading teams treat legal use cases like product ideas: They define needs, prototype solutions, and iterate until they work.
This builder mindset helps legal professionals move beyond passive usage. It gives them a stake in the outcome, increases the chances of long-term adoption, and creates a culture of continuous improvement.
Fluency Differentiates
The legal teams making progress aren’t doing so through headline-grabbing pilots or chasing the latest large language model. They’re building fluency deliberately—and from the ground up.
That fluency is made up of specific, learnable skills that reshape how legal professionals think, solve problems, and deliver value:
- Practicing with AI on realistic, legal-specific tasks in a safe environment
- Learning to prompt with structure, context, and clarity
- Stress testing outputs for risk, bias, and hallucination
- Embedding AI into routine workflows, not hypothetical ones
- Collaborating across legal, ops, IT, and business to drive adoption
Success won’t hinge on the flashiest pilot or the newest model. It will come from the legal teams that build fluency where it matters most: in the everyday work.
This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.
Author Information
Chris DeConti is chief strategy officer at Factor and advises legal departments and law firms globally on AI adoption, legal operations, and innovation strategy.