The UK’s University of Oxford is encouraging its Economics and Management undergraduates to employ AI tools like ChatGPT for drafting essays – no, that’s not a typo.
Professor Steve New, who is teaching a module on technology and operations management, encourages students to use AI as a tool to improve writing and critical thinking, stating, “AI should help you produce a much better essay than you would produce unaided,” provided it’s used “thoughtfully and critically.”
Students are instructed to rigorously fact-check their AI-generated drafts, as the course material points out AI's tendency to fabricate facts and references – a phenomenon known as hallucination.
In further guidance, Professor New emphasizes the role of AI in supporting intellectual conversations, stating, “AI should increase your ability to think hard about the subjects you discuss, and make you more confident in framing a clear and persuasive argument.”
Professor New further highlights the importance of personal accountability in academic writing, explaining, “But the document that emerges should be yours… The AI can produce humdrum ‘some say this, some say that…meh’ essays in a fraction of a second; you should be producing compelling, tightly-argued, evidence-based prose that you believe in.”
However, producing a genuinely authentic essay while partially guided by AI may prove more complex than simply writing it unaided.
Oxford University acknowledges AI’s influence on our opinions, cautioning that “AI might – without you realising – steer you towards particular intellectual or ideological positions.”
Students must also include an “AI statement” documenting how they used AI tools to create their essays.
A large study co-authored by educational leaders and teachers, published earlier this year, examined AI's role in learning.
Giampaolo Viglia from the Department of Economics and Political Science, University of Aosta Valley, Italy, stated, “The advent of ChatGPT – if used in a compulsive way – poses a threat both for students and for teachers. For students, who are already suffering from a lower attention span and a significant reduction in book reading intake the risk is going into a lethargic mode.”
Educators participating in the study often accepted AI’s inevitable role in academia but admitted that critical thinking is threatened.
Sven Laumer from the Institute of Information Systems Nürnberg, Germany, said critical thinking must be preserved at all costs, stating, “When it comes to college essays, it’s more crucial that we teach our students to ask important questions and find ways to answer them.”
He continued, “This is the intellectual core that will benefit both the students and society. Therefore, we should place a greater emphasis on teaching critical thinking skills and how to add value beyond AI.”
To use AI or not to use AI?
Other studies and research have indicated that students use AI regardless of whether or not they’re expressly permitted to do so.
Yet they also worry about what counts as ‘cheating’ when their school or university has no policies on AI usage.
Educational institutions are scrambling to establish boundaries and policies for AI usage, but there are few hard-and-fast rules.
Some institutions, like Oxford University, are cautiously embracing AI. The Russell Group of universities has committed to making students “AI literate,” acknowledging the growing relevance of AI skills for future employment.
Harvard released a coding assistant for its computing course, admitting that AI is replacing the need to write boilerplate code.
However, there’s palpable confusion about the boundaries of AI usage. The University of Oxford, for example, classifies unauthorized AI use in exams and assessed work as a “serious disciplinary offence.” That message is easily muddled when teachers start advocating for AI’s use.
There probably won’t be any consensus on this soon, so students will need to stay alert to when, or whether, they should use AI in their studies and work.