AI is increasingly important in law firms, but does it have a place in law schools?

In the legal profession, the impact of generative artificial intelligence (AI) tools like ChatGPT remains uncertain. Law schools and faculty members are grappling with whether to allow students to use these AI applications in their coursework. AI is already deeply integrated into legal practice, but generative AI goes beyond traditional tools: it learns patterns from data to create new content, including research, summaries, and even contracts. Many law schools, however, lack explicit policies on these emerging technologies, leaving individual instructors to decide how they may be used in assignments and exams.

Law professors like Patrick J. Keenan at the University of Illinois College of Law and Daniel W. Linna Jr. at Northwestern Pritzker School of Law are actively engaging with generative AI in their teaching. Keenan is finalizing a policy that allows students to use generative AI for idea generation and grammar checks but bars its use for completing entire assignments. Linna encourages students to explore generative AI tools, emphasizing that lawyers must understand AI's capabilities and limitations. Still, there are concerns about students misinterpreting AI-generated content, overreliance on AI tools, and the effect on legal creativity and problem-solving skills.

Despite these challenges, Keenan and Linna believe generative AI can benefit law students by improving their educational experience and potentially leveling the playing field for students from diverse backgrounds. They envision AI-powered teaching assistants and tools that help students navigate complex coursework. They also see generative AI helping legal professionals deliver better legal services, particularly in addressing the civil legal services gap among vulnerable communities, thereby advancing the mission of law and enhancing access to justice.

Read the full story here.

Brought to you by ICLR.