By Chris Schulz

Despite Concerns, Educators See Artificial Intelligence As A Classroom Tool

Hands are poised on the keyboard of a laptop, ready to write. (Adobe Stock)

Artificial intelligence is raising the possibility that students could cheat when writing papers. But educators and technology companies say they are ahead of the curve.

Since its launch in November, the artificial intelligence-based program ChatGPT has drawn a lot of attention for its ability to quickly generate written passages based on simple prompts. Tell it to write you a 500-word essay on “The Old Man and The Sea,” and within moments, you have a completed assignment that may have taken a student hours to write. With so much attention has come a lot of criticism and concern, especially in the realm of education.

“ChatGPT Is the Wake-Up Call Schools Need to Limit Tech in Classrooms,” reads one headline in Time magazine. 

But educators and academic organizations weren’t caught flat-footed by the new technology.

Zack Bennett is a distinguished machine learning scientist for Turnitin, a software company whose products help detect plagiarism in students’ work. In early April, the company launched an update to its product to help identify AI-generated writing, something Bennett says is possible because of patterns in how language models write.

“What it comes down to is that the language models that generate text tend to produce very average or probable words when they’re generating the text, whereas humans don’t really do that,” Bennett said. “They tend to do things in more surprising fashion, they use unexpected words or new ideas appear in their papers. We were able to home in on that and tune a detector to distinguish between what a student writes versus what an AI generates.”
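The idea Bennett describes can be sketched in a few lines of code. This is only a toy illustration, not Turnitin’s actual detector: it assumes a made-up unigram word-probability table standing in for a real language model, and a hypothetical threshold. The core intuition is the same, though: text built from consistently high-probability words scores differently than text full of surprising word choices.

```python
import math

# Hypothetical toy vocabulary with made-up word probabilities, standing in
# for a real language model's next-word probabilities (an assumption for
# illustration only).
WORD_PROBS = {
    "the": 0.07, "of": 0.04, "and": 0.035, "a": 0.03, "to": 0.03,
    "is": 0.02, "in": 0.02, "old": 0.002, "man": 0.002,
    "story": 0.001, "sea": 0.0008,
}
RARE_PROB = 1e-6  # probability assigned to words outside the toy vocabulary


def avg_log_prob(text: str) -> float:
    """Mean log-probability per word. A value closer to zero means the
    text sticks to predictable, high-probability words."""
    words = text.lower().split()
    return sum(math.log(WORD_PROBS.get(w, RARE_PROB)) for w in words) / len(words)


def looks_machine_generated(text: str, threshold: float = -6.0) -> bool:
    """Flag text whose word choices are, on average, very predictable.
    The threshold here is arbitrary, chosen for this toy example."""
    return avg_log_prob(text) > threshold


predictable = "the old man and the sea is a story of the sea"
surprising = "grizzled mariner duels an implacable marlin offshore"
```

Run on these two samples, the predictable sentence scores well above the surprising one, which is the gap a real detector, trained on far richer signals, tries to exploit.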

Even the creators of ChatGPT, OpenAI, admit the program has flaws. It is prone to certain biases from the data it’s trained on and has regularly been observed to make up information to fit a given prompt. Bennett says that as technology changes, everyone needs to adapt with it.

“I think also there’s a role for parents in all this to talk to their children, find out what they know about the tool. Start a conversation about AI literacy, what it means to have these tools available, because there will be temptations to use these in ways that are not optimal,” he said. “It’s important to remember they are tools. You want to use them mindfully, you want to use them to not replace you but augment what it is that you’re doing.”

Annie Chechitelli is the chief product officer for Turnitin. She agrees that ChatGPT and similar AI writing systems aren’t just a threat to classrooms but also a tool.

“We’re seeing some really innovative things teachers are doing to incorporate these tools into student learning through experimentation, and I think that’s going to continue,” she said. “We hope it continues as we have more conversations with educators on ways that we can help that or maybe devise new opportunities for tools that help students write. With that said, when we do talk to teachers, their request is just that there’s some simple measure to help them say, ‘Hey, there might be some AI writing here.’”

Educators at every level recognize the potential for abuse that AI writing systems present, but as Chechitelli says, they are also creative enough to realize that they can be harnessed as tools.

Josh Holley is the technology coordinator for Jackson County schools. He is married to an employee of West Virginia Public Broadcasting. 

Holley said his district has plagiarism detection software in place for teachers to use but hasn’t yet heard of an instance of an AI being used on an assignment.

“I don’t think that the students have really tried to use it because the teachers, right when we first got (the software), were like, ‘Hey, look what this can do.’ And the kids are, like, ‘Oh, okay, better not try this,’” he said. “We wanted to get ahead of the game.”

What Holley has seen are educators starting to integrate the new technology into their classrooms.

“One of my fellow technology integration specialists has really dived into using ChatGPT for creating interactive presentations. You type in the topic that you want to teach about, and it creates its own, sort of like a PowerPoint, that’s what it looks like,” Holley said. “It’s more interactive, and the kids get to do more stuff with it. It just generates all the information for the teacher.”

Despite the promise of AI writing if used correctly, the potential for abuse remains. Applications like Turnitin don’t determine student misconduct. That job is left to someone like Paul Heddings, director of academic integrity at West Virginia University. His office is tasked with investigating and adjudicating allegations of potential academic misconduct for the entire WVU system. Despite its novelty, Heddings says AI is not outside of his office’s expectations.

“That’s not abnormal for academic integrity. Generally, if you think about artificial intelligence as a continuum itself, it’s something that’s been around for a very long time in various places in our lives,” he said. “Academic dishonesty is one of those things that’s ever growing and we have to be evolving with the times.” 

Heddings sees academic integrity as a question of fairness for other students, but also as another opportunity to teach.

“Rather than just focusing on bad behavior, I place a lot of weight on trying to position the student for success in the future,” he said. “Many of the plagiarism cases we see are instances where students are not confident writers, or maybe they don’t understand the distinction between patch writing, and paraphrasing, and straight plagiarism. We really have an opportunity to help students learn and grow, because college is a time of profound growth, and it’s not only growth within the classroom.”

The AI writing landscape is growing and changing quickly, as companies including tech giants like Meta and Alphabet come out with their own platforms. 

But Heddings and others are confident that the world of education is ready for the change. WVU has already put together an artificial intelligence task force, with Heddings as a co-chair.

“Even though it’s easy to become kind of sensationalized about it or be a doomsdayer, I think our faculty have been very well grounded in the understanding that this is something to be aware of from an academic integrity perspective, but also a potential tool for the future,” Heddings said. 

“There’s not a cookie cutter approach to artificial intelligence, but we need to give some guidelines to our faculty to help them better understand what tools and resources my office has, and others on campus have, and then how best we can integrate ChatGPT and other models into our curriculum to really harness the power that they have and help prepare our students even better for the future.”