This past fall semester, the University’s Cognitive Science-Based Learning Hub introduced artificial intelligence avatars as teaching assistants in a structured study strategies program for first-year students, offered through both in-person mentoring and pre-recorded online lessons. The avatars lead six pre-recorded sessions designed to condense a day’s worth of learning into two hours. At first glance, this model appears innovative, efficient and aligned with the University’s broader push toward AI integration.
But beneath the promise of efficiency lies a more pressing question: what is lost when teaching becomes automated? For students navigating their first year at the University, learning also includes the relationships built with older mentors who were once in their shoes. It is about hearing from someone who has already walked the path, someone who reminds you that you belong here and can do this. That emotional and relational dimension of learning is something artificial teaching assistants cannot meaningfully reproduce.
Within the Learning Hub, students can complete the program either with in-person peer mentors or through AI-avatar-led online modules. In-person mentors guide first-years not only through study strategies but also through the emotional reality of college itself, including how to recover from a bad exam, how to ask for help and how to keep going when confidence falters. By contrast, AI TAs can deliver information but cannot respond to hesitation, confusion or fear in real time. When students push back, the avatars tend to rephrase material without addressing the root of the misunderstanding. Rather than seeking guidance from people who have taken the course, struggled through it and persevered, students are pushed to rely on systems that feed them summaries and incomplete information. Replacing human guidance with AI ultimately discourages students from engaging deeply with the material and teaches them to outsource their thinking, mistaking efficiency for understanding.
Over the course of their time at the University, students are far more likely to benefit from the wisdom and guidance of a considerate, understanding human TA than from a computer. Human TAs develop an intimate, situational understanding of how students learn and apply knowledge productively, not just what they are expected to know for a single exam or assessment. Over a semester, they learn who struggles with time management and who freezes during exams, and they discover how to explain concepts in different ways for different students. As students gain clarity and confidence, TAs simultaneously refine their own teaching, learning how to adapt, communicate and care in ways no automated system can replicate.
The primary appeal of these AI avatars, beyond broad efficiency gains, is their ability to scale instruction, providing on-demand study strategy guidance without the limits of human availability. But while administrators may prioritize scalability and capacity, the effects of AI instruction are felt in the quieter, safer spaces where learning actually happens. This is largely because teaching and mentorship are not transactional exchanges. They are built on trust, comfort and confidence, qualities only a human can truly provide. Human TAs, whether undergraduate or graduate, form relationships with students over time, establishing trust between the teaching team and the class. Pre-recorded lessons led by a machine eliminate dialogue, spontaneity and mutual trust, removing the very elements that make learning feel safe.
Teaching is not merely presenting “correct” strategies to students. It is the experience of sharing knowledge and learning through conversation and different perspectives, a form of experiential and motivational mentorship that cannot be found in interactions with AI. A teaching assistant’s value lies not in perfection but in relatability, in hearing someone say, “I struggled with this too, and here is what helped,” or “Exam two was hard for everyone, but the professor usually offers extra credit before the final.” Beyond this, AI avatars cannot sustain the relational and emotional labor that learning requires: the ability to recognize doubt and help students regain confidence in real time.
The TA role is also a crucial site of growth for the assistants themselves, a form of learning that AI avatars do away with entirely. Teaching pushes TAs to revisit material from the perspective of the educator, translating abstract concepts into language that is accessible, intuitive and flexible for a less knowledgeable audience. When the University replaces these roles with AI avatars, it eliminates one of the most meaningful opportunities for peer educators to learn how to teach, mentor and care for others. This choice ultimately undermines the institution itself. The University cannot claim to cultivate future educators, clinicians or scholars while actively dismantling the pathways through which those roles are learned.
A classroom, or even a small Learning Hub session, is a shared space of vulnerability and growth. It is a place where students ask “stupid” questions, admit confusion and learn through sharing ideas. If institutions claim to value student growth, mentorship and community, they must invest in the people who make learning human.
Ayat Younis is an opinion columnist who writes about academics for The Cavalier Daily. She can be reached at opinion@cavalierdaily.com.
The opinions expressed in this column are not necessarily those of The Cavalier Daily. Columns represent the views of the authors alone.