As generative artificial intelligence has become increasingly accessible to students since the launch of ChatGPT in 2022, University faculty, administrators and students are all grappling with how to integrate and regulate AI use in classrooms. The University’s response spans faculty training programs, department-level policy setting and a student-led governance experiment, all of which contend with the ways AI is transforming both academic practice and workforce expectations.
Two years ago, the University launched the Generative AI in Teaching and Learning Task Force with a clear, short-term goal — to assess AI’s role in education and deliver recommendations to the University. This task force included professors from several of the University’s schools and a former chair of the Honor Committee.
Those recommendations have since led to initiatives such as the online course, Teaching in a World of Generative AI, developed by the Center for Teaching Excellence, and the Faculty AI Guides.
Although the University has developed resources to assist students and faculty in their approaches to AI use, Michael Palmer, a member of the GENAI Task Force and director of the Center for Teaching Excellence, stressed that developing a universal policy around AI at the University is neither realistic nor desirable. He said that how AI is used looks different on a case-by-case basis because the ways AI can be used vary depending on the discipline.
“Some disciplines have really leaned into it. [In McIntire], there's lots of potential around data analysis and forecasting and marketing and those types of things,” Palmer said. “And then [in] other disciplines, say philosophy, where it's really about ideas and discussion, there's a different approach to how AI can help in those spaces.”
For students, however, differences in policies can create confusion. Second-year Commerce student Claire Clark said that while some professors explicitly outline expectations, others do not mention AI at all, leaving students uncertain about how — or whether — they are allowed to use it.
Palmer believes that clarity in each class is critical and that instructors should explain their rationale to students. But creating explicit policies can be difficult, he added, given the pace of AI’s evolution.
One solution has been the Faculty AI Guides, a network established in 2024 and composed of 53 faculty members from 10 schools across the University who have been trained to serve as departmental resources on AI. In this application-based program, faculty guides attend a one-day course on teaching and learning with AI, then continue meeting in small groups throughout the year, covering AI literacy, pedagogy, policy development and assignment design. These guides also lead workshops, presentations and consultations, offering support to faculty whether they want to integrate AI into their teaching or limit its use.
Each guide is supposed to serve as an ongoing source of knowledge, and is responsible for being as up-to-date as possible about recent advancements and uses of the technology.
Kiera Allison, faculty AI guide and assistant professor at McIntire, said her role combines learning, sharing knowledge and helping faculty adapt policy to their curriculum.
“[Faculty are] not going to know who’s using AI, so we have to find ways to motivate students to do their own work,” Allison said. “It’s fine if you don’t want your students using AI, but you have to be able to tell them why … because the tool is available.”
Reza Mousavi, AI Task Force member, faculty AI guide and associate professor of commerce, frames the University’s approach as preparing students for a future that is both more technologically advanced and more deeply human. In McIntire, that means recognizing AI’s analytic power while emphasizing the human creativity and ethics that turn data into strategy.
“We’re not just teaching students how to use a specific AI tool. That would be like teaching someone a particular model of a calculator that will be obsolete in a year,” Mousavi said. “We are teaching students how to think, question and create with AI as a partner.”
Economics professor Anton Korinek also pointed to the calculator as a useful analogy for AI’s role in education.
“When you go to primary school and you learn what multiplication is, you’re not allowed to use [calculators] because otherwise, you wouldn’t learn basic algebra,” Korinek said. “But at higher levels, of course we use them, because otherwise it would be a gigantic waste of time.”
He noted that the rapid pace of AI’s development makes adaptability as important as technical knowledge. He said all students now have a responsibility to be “AI-literate” and should focus on integrating AI into whatever they are studying.
“Ideally, every student should be their own AI guide,” Korinek said. “One of the most valuable things that [professors] could do is … educate them [on AI] … Let's stop with the abstract discussions, and let's just make everybody who hasn't [used AI] significantly yet … have conversations with these systems about what they can do.”
Graduate Batten student Ella Duus believes students should have a direct role alongside faculty and administrators in shaping how AI is used in classrooms. She is helping to lead the creation of a Student Technology Council at the University alongside Mona Sloane, assistant professor of Data Science and Media Studies.
The council aims to formalize student participation in technology governance — including AI policy — through listening sessions and participatory design workshops that determine its structure and scope. Supported by the office of Online Education and Digital Innovation and the Karsh Institute of Democracy, the project will also produce a research-based blueprint other universities can use to launch similar initiatives.
Duus noted that if AI use is not explicitly prohibited, many students feel compelled to use it to remain competitive, even if they would prefer to work without it. Clark felt similarly, saying that even without a clear-cut policy, students are missing out if they do not take advantage of the tool.
“It’s a really valuable resource, and if you use it correctly, it can just really improve any work that you’re doing yourself. So if other people are using it and you’re not, you’re just putting yourself at a disadvantage,” Clark said.
This dynamic, Duus said, makes it critical for students to help define policies that balance AI’s potential benefits with the preservation of skills it cannot replicate, such as long-term planning, original idea generation, sound judgment and emotional intelligence. Without student input, she said, policies risk overlooking the lived realities of how AI is influencing academic choices.
“Anytime the University is soliciting feedback or comments on AI use, even if it's not explicitly from students, take the time to write an email to fill out the form and make your voice known,” Duus said. “Every investment that students make now [in] deliberating around AI use is only going to serve them in the long run.”
Allison also emphasized the need for education to adapt to new AI technologies.
“We do not exist in a vacuum where we can pretend that education is what it was four years ago,” Allison said.