Law Schools Should Teach How to Integrate AI Tools Into Practice

Jan. 16, 2026, 9:30 AM UTC

Now that artificial intelligence tools for lawyers are widely available, we decided to integrate them into our Entrepreneurship Clinic for a semester. We have some important takeaways for legal education in general and the transactional practice of law in particular.

First, employers and educators need to account for law students who already are using AI tools in their legal work and guide new lawyers about how to use such tools appropriately.

Second, different AI products lead to wildly different results. Just demonstrating this to law students is very valuable, as it dispels the notion that AI responses can replace their independent judgment.

Third, AI’s greatest value may be in refining legal judgment in ways that help new and experienced lawyers alike.

Legal AI Prep

As we planned our syllabus over the summer, we decided to provide formal training in AI tools designed for lawyers. Early in the semester, a librarian gave us an overview of products from Bloomberg Law, Lexis, and Westlaw.

Before the training, we asked students how they were using AI in their legal work. Their responses ranged from “not at all” to “I start all of my case law research on ChatGPT.”

We were confident that our students would be better off operating somewhere between those extremes. Over the semester, we demonstrated how AI could enhance the speed and quality of legal work, as well as the dangers of outsourcing research and judgment to an AI tool.

AI Tool Differences

Perhaps the training’s most valuable takeaway was that each tool had access to different databases of materials and had different constraints. We designed simulations that required groups of students to complete the same transactional tasks (drafting, researching, benchmarking market terms, and crafting effective client emails) using various AI tools.

In one exercise, students acted as counsel to a small business owner. The “client” emailed them asking about standard-form contracts relevant to their industry and what pricing mechanics such contracts use.

For the research stage of the task, all teams located a standard-form construction contract, but only half of them found the industry-accepted standard form that we contemplated. The others located this form later by modifying their search approach. This helped to demonstrate some limitations of AI tools.

For the client communication stage, some teams failed to answer the “client’s” questions. This isn’t something the AI tool could address on its own, and it reminded students to constantly refocus on the big picture in addition to individual tasks.

We found that AI tools built on widely available AI platforms such as ChatGPT produced the most responsive outputs and were most forgiving of haphazard prompting. But certain specialized legal AI tools often failed to answer the prompt.

This is a double-edged sword. Although the generally available tools were more likely to generate an answer, they also were more prone to providing unreliable outputs. By contrast, the specialized tools hallucinated much less frequently but regularly stopped short of fulfilling a request if it required work beyond their guardrails.

Delegating Work

Our final takeaway was that AI was surprisingly good at issue-spotting and double-checking a lawyer’s work product. These uses can help both new and experienced lawyers.

We used the idea of delegation to make this point to our students. AI is fast, adaptable, and always available, so it’s a great resource. But you should only delegate work to it when you can verify its output.

In one exercise, students had to issue-spot risks and approaches after a “client” described a business opportunity. Students brainstormed in small groups. There was a lot of overlap, but some groups thought of items that others had not. We added the items to a collective list, relying on our years of practice to guide the students through gaps that remained.

Once we had a strong collective list of items, a team asked an AI product to issue-spot the same scenario. It generated most of the items in our list, some that weren’t relevant, and—most importantly—a couple that no one had raised.

This was a valuable lesson: AI had something to add to our analysis, but we had to exercise independent judgment to determine whether its contributions merited further thought.

Important Takeaways

We asked students for feedback on our use of AI throughout the semester. The most valuable feedback was that they wanted to develop their own legal judgment and learn how and why certain tasks are completed before relying on AI.

This echoes the transition from book-based legal research to electronic legal research. There was some value in searching the law reports in the library, but electronic legal research won out because it was so much more efficient. Yet even with this enhanced efficiency, a responsible lawyer must understand how to build a strong research plan and actually read the cases they cite.

In the clinic, our goal is student learning. It was for this reason that we liked to deploy the AI tools at the end of our exercises: You do the work and then interrogate it with the AI tools of your choice.

Such an approach ensures law students get the benefit of struggling through first repetitions of new tasks while allowing them to generate superior work product with fewer drafts. This process requires discipline. Legal education and legal employers need to clarify the line between AI as a tool versus AI as a crutch.

We learned a lot about how AI tools can help law students develop into good lawyers. As those tools are integrated into legal practice, lawyers of all experience levels should take a deliberate, self-aware approach to using them.

This article does not necessarily reflect the opinion of Bloomberg Industry Group, Inc., the publisher of Bloomberg Law, Bloomberg Tax, and Bloomberg Government, or its owners.

Author Information

Robert A. MacKenzie is the Davis Polk & Wardwell Clinical Teaching Fellow at New York University School of Law who co-taught the Entrepreneurship Clinic in fall 2025.

David Reiss is a clinical professor of law at Cornell University Law School and Cornell Tech and research director of the Blassberg-Rice Center who co-taught the Entrepreneurship Clinic in fall 2025.


To contact the editors responsible for this story: Daniel Xu at dxu@bloombergindustry.com; Rebecca Baker at rbaker@bloombergindustry.com
