Professors discuss ChatGPT’s potential as a tool in the classroom


AP Photo/Richard Drew

The logo for OpenAI, the maker of ChatGPT, appears on a mobile phone in New York, Tuesday, Jan. 31.

By Bella Markovitz, Staff Writer

Despite numerous headlines and stories about how students can potentially use ChatGPT to cheat in school, many Pitt professors are approaching ChatGPT with curiosity rather than fear.

OpenAI launched the chatbot in November 2022. ChatGPT, which stands for Chat Generative Pre-trained Transformer, uses machine learning to generate responses to whatever prompt a user inputs. Students aren’t the only ones who see potential uses for ChatGPT. Some Pitt professors are already finding constructive ways to make ChatGPT useful in the classroom, with some caveats.

Annette Vee, director of Pitt’s composition program and associate professor, currently researches the “intersections between technologies and writing.” She said many articles mentioning ChatGPT are “clickbait” and thinks that, instead of being something to fear, ChatGPT is a potentially useful tool.

“Articles are circulating, for sure, but it’s been clickbait,” Vee said. “It’s not at all the end of the college essay … It depends on what we use the essays for. And if we’re using essays for students to connect with each other, to reflect on their thoughts, to engage critically with texts, those are all things that ChatGPT doesn’t replace.”

She said her colleagues within the composition program are “intrigued, nervous, interested [and] thinking quite carefully” about how to enter this new era of AI writing.

“We have some amazing, very considerate teachers here in the composition program,” Vee said. “They are very interested in thinking about how they can approach teaching writing, either with this technology, or in the era where this technology is circulating, and how they can best support students’ choices in how they navigate these things.”

Colin Allen, a professor in the history and philosophy of science department, specializes in machine morality and computational humanities. Allen said he notes ChatGPT in his syllabus for his “Introduction to the Philosophy of Science” course this semester. 

Under the “Statement about Academic Misconduct” section in Allen’s syllabus, it reads, “Any ideas or materials taken from another source for either written or oral use must be fully acknowledged. This includes output taken from ChatGPT or other AI models.”

Seth Goldwasser, a doctoral candidate and graduate research fellow in philosophy who currently teaches the “Mental Action” course, also makes note of the AI program in his syllabus, where he dedicates a whole section to discussing ChatGPT. 

“I will not treat just any use of ChatGPT as cheating,” according to Goldwasser’s syllabus. “For instance, if you’d like to use the model to edit texts you’ve written, provide suggestions for writing prompts or as an interlocutor in writing up a dialectic, I’m happy to accept work so produced. That being said, I will treat passing off ChatGPT responses as your own work as cheating and plagiarism.”

Goldwasser said he included it in his syllabus because he sees the use of such AI as a “technological inevitability.” 

“I can’t ignore it, it’s out there,” Goldwasser said. “My most technologically literate students will know of it, right? And it won’t be very long before all of my students are aware of its existence and what it can do.”

During the first week of class this semester, Allen said he demonstrated for his students how ChatGPT works. He asked the bot a question, and then compared its answer to his own in front of the class. 

“I showed the ChatGPT’s version and discussed how the version of the argument it gave was quite similar, but the second premise of its version was less likely to be true than the corresponding premise in my version,” Allen said.

Allen thinks that students could use ChatGPT as a practical tool with which to study and engage with course material. 

“It can be useful for students to see a middle-range answer to the study and exam questions that we provide, if they are unsure where to start. It is especially useful if they think about how to improve on the answer that ChatGPT gives,” Allen said.

Though Allen allows his students to use ChatGPT, he also said there are caveats.

“The class policy is that they may use ChatGPT as long as they acknowledge [that they used it], but also that ChatGPT’s answers are unlikely to be in the A range for grades, [as it gives] generally a low B for the most basic questions,” Allen said.

According to Allen, ChatGPT often did worse when asked to analyze particular arguments or synthesize multiple themes.

“It also sometimes would do worse because it tends to make things up that are incorrect,” Allen said.

As the teaching assistant for Allen’s class, graduate student Caitlin Mace also went over ChatGPT use during a class recitation. By inputting an example exam question, Mace showed students how ChatGPT responds to prompts. She also showed them how to ask follow-up questions to get the chatbot to expand on its answers.

“I even encouraged this as a way to get started on answering practice exam questions,” Mace said. “I think using ChatGPT in this way was helpful because students worked on how ChatGPT’s response could be improved to form a better answer to the exam question, which both shows that they already know a lot about the content and demonstrates what a good exam question would look like.”

Mace also thinks students can use ChatGPT as a “helpful first step for research,” much like Google.

“You prompt ChatGPT with a question, it gives an answer that is generally reliable and more specific than simply Googling the answer could give, and you can follow up by asking ChatGPT to clarify or say more about any part of its answer,” Mace said.