We invited faculty members in the 2024-2025 Purposeful AI Working Group to reflect on their experiences building custom chatbots that align with their teaching values and enhance student learning.
What type of chatbot did you create for your project?
I created a chatbot to address a recurring problem in my classroom. I have my students research and write an advocacy brief on First Amendment issues. They have to research cases within the relevant jurisdiction in order to support the arguments they’re going to make. I’ve found that students sometimes lose important foundational lessons like “What is mandatory authority in my jurisdiction? What is persuasive?” They have a difficult time choosing cases that are both relevant to their specific problem and most persuasive in support of the arguments they are making. In previous years, I gave students a pre-quiz to refresh these skills and test their understanding, but this time, I decided to use a chatbot.
I have students create a log of the cases that are potentially helpful to them, indicating which ones are relevant to the arguments they plan to make in their brief. This year, before they handed in their logs, I had students converse with the chatbot I created to ensure that their research was thorough. I then asked them to reflect on their use of the chatbot, including the extent to which they found it helpful.
How did you feel the experience went for you and your students?
Despite working closely with Kyle Fidalgo, the Academic Technologist at Boston College Law School, I still didn’t feel 100% comfortable with my ability to control the chatbot. Even though we gave it very strict parameters, it would still provide information outside of them. However, I think it’s promising as a reflection tool. The issue in my class is that my students use legal research tools like Lexis and Westlaw, which are increasingly integrating GenAI into their operations. Those tools are so specific to the work that students do that I feel my time may be better spent with them rather than creating my own chatbot. However, I plan to continue experimenting with chatbots in other parts of my course.
How are you seeing GenAI affect your discipline?
A huge part of my class is about legal research and writing, an area where the rules governing attorneys’ use of AI are rapidly changing, often toward greater permissiveness. At the same time, some courts still say either that you can’t use GenAI or that, if you do, you must disclose its use. So we need to teach our students how to find the court rules and identify those that govern their use of AI. I also need to teach my students how to evaluate AI output, and I can’t do that until I teach them foundational legal problem-solving skills, including legal research and writing skills.
What do you want other faculty to know about what you’ve learned from your experience being in the Working Group?
I think a large part of my job right now is to teach students how to critically evaluate AI output, and other faculty should understand how important this ability is. I’d also want others to know that students are generally pretty enthusiastic about GenAI: they come in already using these tools, and they’re eager to learn more. Finally, this technology offers real opportunities to improve access to justice. So many people don’t have access to lawyers, and there are many ways AI can help fill that gap. I see opportunities for this technology to make it simpler for people to interact with the legal system, including tools for lawyers that will lower client costs and tools that will make it easier for people to represent themselves.