The use of ChatGPT in academic settings has been controversial amongst faculty. Some professors support the technology, while others oppose it. The Medium spoke to Professor Christopher Eaton to gain some insight on the matter.
“A lot of the resistance that exists around technology is that it can be used as a shortcut to get things done quickly, which yes it can, but that it is also going to replace the thinking process […] To a certain degree I believe that is true. There is a risk that you could input certain things into a chatbot and get something relatively quickly that is possibly more coherent than your natural writing style,” explained Professor Eaton.
The risk of plagiarism is always present when artificial intelligence tools are at students’ disposal, but Professor Eaton emphasized the opportunities and improvements they can bring. “I think there [are] also lots of opportunities here to [look] at text design differently,” he stated, noting that digital tools have been used at the university, but not for knowledge-building processes. He continued, “It could be a shortcut, […] it does not replace the actual thinking and putting your ideas into a structured form. It just is a tool to support that.” Professor Eaton mentioned the “grunt work” such tools can help with, such as handling citations, grammar, and copy editing. When considering the use of ChatGPT in academia, the balance between efficiency and authenticity remains the main concern.
At the end of the day, students hold the important responsibility of using the tool correctly, asking themselves how it benefits and enhances their learning. “Is it bypassing your learning? Is it overriding your own understanding of things? If you’re relying on the chatbot to write something, who is actually the writer and who takes responsibility for that?” asked Professor Eaton, stating that whether students treat these new technologies as a tool is the most important question to consider.
As students, we can understand that using ChatGPT to pump out an essay at the last minute is not the right approach. How you think about the technology, and how clearly you understand what you want to get out of it, is vital. Students need to tap into the skills they already bring to the technology and deliberate on exactly what they want it to produce.
Professor Eaton explained that, as a professor, there are times when a student has clearly used a chatbot to write their work for them. Such students misused the technology as a shortcut instead of as a way to iterate on their own ideas and writing.
“A lot of the people who are most resistant seem to also have an assumption that we can detect this […] To a certain degree maybe that’s true, especially in its most blatant forms […] I hear many people say they know when AI is used, but I don’t know what they know,” explained Professor Eaton. When AI tools are used properly, their use is hard to detect. And is it even necessary to catch students who use them? Ultimately, Professor Eaton notes that such tools are already present in the workplace and in society. For an educational institution that prepares students for future work, it makes little sense to disallow a tool that students will encounter throughout their careers.
Instead, Professor Eaton suggests focusing on all aspects of the writing and design process, rather than solely on the end product, such as the final mark on an essay.
The integration of ChatGPT into academia sparks a thought-provoking conversation. The divide among faculty, the benefits, and the perceived risks all highlight the complexity of this technology’s role in education. The opportunities and problems created by such tools underscore the ethical responsibility students have to use them to enhance learning rather than override understanding. Academic institutions must remain flexible and forward-thinking, adapting their educational strategies to meet the constantly evolving demands of this technology-driven world.