During the brainstorming stage for this editorial, the Editorial Board thought that we might use artificial intelligence to write the first paragraph of this editorial. We gathered a few past editorials, fed them to the infamous chatbot and asked it to write the first paragraph of this editorial — just to see what would happen. The result? Not only did it give extremely detailed feedback on our past writing — thanks, ChatGPT — but in a few seconds, it wrote a paragraph shockingly similar to what the Editorial Board would have written. Needless to say, we all had existential crises. What this exercise proved, however, is that generative AI can be a useful tool for learning. Instead of fearing or ignoring this new wave of advancements, the University should embrace AI-based technology to move education forward and stay ahead of any problems that it may cause.
Generative artificial intelligence is a term used to describe technology like ChatGPT that can create original content in response to inputs from users. Since its arrival in the mid-20th century, AI has become increasingly capable, mastering standardized tests, essays and math problems with ease; it can also produce arguments and even create art. As a result, AI is now at the center of a lively debate among administrators, students and professors who are struggling to discern what role, if any, AI should play in learning. Leaders in education are not just talking — they are acting. For example, the International Baccalaureate program is allowing the use of ChatGPT in essays, whereas Washington University has included the use of AI under the definition of plagiarism.
For better or for worse, AI is here. The University should utilize it to ensure that students are learning the most relevant skills in the most applicable way. Though there are certainly hiccups in the technology, AI has shown us that many of the skills a liberal arts education is intended to enhance — parsing difficult texts, for example, or crafting clear and concise arguments — are now accessible to anyone with broadband access and an email address. Expanding education through AI-based technology will require students and educators to adapt to a changing technological landscape — this could look like utilizing generated essays to test argument-building skills or curating personalized study guides based on students’ previous work. Many institutions within the academic space have already begun using AI to enhance student learning — Notion has adopted generative AI to summarize notes while Khan Academy now has an AI assistant for students and teachers.
Professors can also benefit from implementing AI. The tool can be used to create examples and lecture slides that are tailored to the unique needs of students — easing the burden of lesson planning for professors and making class more effective for students. University administrators could use it to organize applications, create personalized advising outlines or track students’ improvement over time. AI has a virtually limitless array of useful applications. As the world progresses, so must education — or we run the risk of learning obsolete skills in an obsolete manner.
The Editorial Board also recognizes, however, the risks of welcoming such an advanced tool to the University. As much capacity as AI has to enhance our learning as students, it has similar potential to undermine the academic integrity embedded in our Community of Trust. Since at least January, the Honor Committee has acknowledged the increasing usage of AI among students. We must have conversations about what our school should look like if this technology is integrated into our academical village. The University's newly founded Generative AI in Teaching and Learning Task Force is a step in the right direction, as it will allow initial input from the community about concerns surrounding AI’s use in teaching and learning. Once input is gathered, the Task Force should strive to create AI guidelines so that both students and faculty can navigate the technology without confusion. AI is too widespread and admittedly too intelligent of a resource to simply be a free-for-all.
To this end, it would be wise for the University to subsidize the cost of AI for all students by partnering with the maker of ChatGPT or a similar program — tailoring the tool to fit the University’s needs. A partnership of this sort would alleviate the inequity AI programs create by hiding behind paywalls and also allow the University to regulate how AI is being used on Grounds. The University has historically allowed current students access to otherwise expensive subscriptions like The New York Times or The Washington Post. Access to AI technology should be treated in a similar manner. We lack the power to stop the spread of AI and, when used correctly, the technology can have immensely positive impacts on the classroom learning experience — so providing students access to AI and then regulating its usage seems like an effective plan of attack. Ultimately, platforms that can significantly enhance the education students receive at the University should never be a tax bracket luxury.
If you think that this editorial is terrible, an AI platform wrote it. If you think that this editorial is well-written, a member of the Editorial Board wrote it. Either way, AI has become an undeniable factor in education, and the University should face the growing benefits and understandable concerns around AI head-on. Being proactive will not only prevent problems concerning honesty and productivity, but it could also cultivate a more useful and fulfilling education for University students. AI is the future, whether we like it or not — let’s use it to our advantage, because we surely cannot run from it.
The Cavalier Daily Editorial Board is composed of the Executive Editor, the Editor-in-Chief, the two Opinion Editors, their Senior Associates and an Opinion Columnist. The board can be reached at firstname.lastname@example.org.