The Independent Student Newspaper of Northern Kentucky University.

Visual made with DALL·E 2

AI is changing how instructors approach education

Open access AI tools unleash new potential for students to learn and do work. How is NKU responding?

February 27, 2023

We all know the dystopian motif in which computers outstrip human ability and rule the world. But let’s hold our horses before those fears run away with us.

ChatGPT is a buzzy artificial intelligence product from the company OpenAI that can produce arguments, develop code and solve math and science problems. Given the technology’s stunning capacity to streamline tasks, it is no wonder that many professors are discomfited about how it will impact education.

New AI tools are being launched rapidly, but ChatGPT is the one garnering the most attention and concern. The program is a natural language processing model, designed to complete tasks based on user prompts by drawing on patterns and meanings in a vast body of internet text. Through reinforcement learning guided by human feedback, language-model chatbots are trained to refine the responses they generate.
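To picture what “completing tasks based on prompts” looks like in practice, here is a minimal sketch of sending a prompt to a language model through OpenAI’s Python package. The model name, prompt and interface details are illustrative assumptions rather than a fixed recipe, since the library’s interface changes between versions.

```python
# Minimal sketch: sending a prompt to a language model via OpenAI's
# Python package (pre-1.0 interface). Model name and prompt are
# illustrative assumptions; newer library versions use a different call style.
import openai

openai.api_key = "YOUR_API_KEY"  # supplied by the user

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # assumed chat-capable model
    messages=[
        {"role": "user",
         "content": "Explain Bandura's social learning theory in two sentences."}
    ],
)

# The reply is generated text, predicted from patterns the model
# learned during training on large amounts of internet text.
print(response.choices[0].message.content)
```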

ChatGPT is the first tool of its kind that is free and accessible to anyone, making it ripe with potential for use in educational settings.

NKU hosted a two-part workshop for professors to learn about the power of these technologies and how they can be leveraged in the classroom. The workshop was led by Evan Downing, lead IT client support specialist, who, when speaking with The Northerner, convincingly passed off a ChatGPT-generated response as his answer to the first question.

“As far as next steps for AI, it’s research and development, especially around Natural Language Processing, computer vision, machine learning, generally anything that makes AI more accurate, efficient and humanlike…” Downing recited. “That’s what ChatGPT said, so I didn’t answer that, but the machine did.”

ChatGPT puts a world of information at our fingertips, and it is expected to keep growing and bleeding into more facets of our lives. That makes integrating AI into the classroom an important step toward teaching students how to interact with it effectively.

Downing emphasized ChatGPT’s ability to act as a tutor, saying that in many foundational-level courses it could support a practical teaching method in which instructors use class time to test students’ understanding of content through explanatory and iterative activities. The idea can be compared to the advent of calculators in math class: for students to demonstrate understanding of the material, they show their work. For some classes, in-class writing assignments and oral presentations will be a feasible way to work around ChatGPT’s power, Downing said.

The premise behind this teaching strategy is the “flipped-classroom model,” in which students learn and practice content outside of class and are assessed on that knowledge in class.

“We don’t want students just copying and pasting information from a chatbot, and the way you get around that is not by saying ‘you’re not allowed to use it,’ but by changing the structure of the assignment to where that doesn’t make sense anymore,” Downing said.

For advanced coursework that involves critical thinking and the careful formulation of ideas, ChatGPT can be leveraged as a generator of raw ideas that can then be scrutinized and shaped into more focused arguments.

Communications studies professor Dr. Whittney Darnell is using this approach in her classes. One assignment framework she has implemented has students use ChatGPT as a starting point: they begin by asking the chatbot about a topic, which often yields information that is imprecise for the discipline being studied.

“There might be concepts such as Albert Bandura’s Social Learning Theory. And that theory is not just used by communications people. It’s also used by people in psychology, sociology and other disciplines … ChatGPT isn’t going to capture all of that … in that very focused, critical lens that I use in my discipline,” Darnell said.

From there, students can break the text apart, weed out misinformation and irrelevant material, and reinforce key ideas to arrive at a more focused product.

Jen Cellio, associate professor and director of the writing program, has been keeping an eye on these rapidly growing tools for quite some time. She noted the concerns that could come with AI and ChatGPT, but also the positive ways the technology might help students if it is used ethically.

“We could also think about a five paragraph essay that is a perfect example of that really formulaic writing that is like a heuristic for filling in the information right,” Cellio said. “Well, the ChatGPT and AI could be used as a similar kind of tool. To get you started to overcome writer’s block, to use it for organization and to chunk your ideas together in a piece. And so, I think it’s probably going to be important that we figure out ways to help students use the tool ethically, and with some sophistication, rather than trying to prevent it altogether.”

But ChatGPT has a tendency to “hallucinate,” a term used to describe when the machine produces nonsensical or blatantly incorrect information. This flaw may become less prevalent as machine learning continues to sharpen the model’s processes, but it points to the need for students to understand its pitfalls and use it responsibly. Much like the internet, from which the machine’s training data is drawn, not everything it says is true.

But for professors who don’t design assignments that encourage or mandate use of the tool, the risk remains that students will use it unscrupulously.

Darnell thought back to a few occasions when she had a hunch that a student’s submission was not their own work but couldn’t prove it. Simple assignments like syllabus quizzes, she said, provide her with student writing samples that can be undeniably telling.

“By the time we get through seven weeks with you or 16 weeks with you, we really know your style,” Darnell said.

Despite the development of tools claiming to detect text generated by ChatGPT, these detectors aren’t necessarily reliable because they must keep up with the rapid pace at which ChatGPT is being updated, Downing said.

Language and policies that lay out a framework for handling academic dishonesty involving AI tools are being discussed and tested at NKU, according to Downing, but they are in the early stages because of the nuance and evolving nature of the issues. The matter raises a complex, almost philosophical question of how to categorize and monitor this type of offense.

“We want to be specific with our language, because plagiarism is different. It’s using somebody’s written works and using them as your own. But the chatbot is generating this in real time based on your prompts so you can generate this information yourself and it’s different every time,” Downing said.

ChatGPT and a bevy of other AI-powered resources are a likely part of our future. Getting acquainted with how they work and what they can be used for will put students at an advantage. Although the tools are incredibly powerful, they are most useful when leveraged with purpose and shrewdness. 

Knowing how to write prompts that generate useful insights for a specific goal could be a valuable skill across many disciplines, and even a legitimate profession.

“How do you produce a prompt for it where it will give you what you’re looking for? And based on that … how do you develop secondary prompts to get it to refine the answer it gives you,” said computer science professor James Walden. “Some people are already hiring what they call prompt engineers.” 
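To make the idea of primary and secondary prompts concrete, here is a rough sketch of iterative prompt refinement, again assuming OpenAI’s Python package; the model name and prompts are hypothetical examples, not a prescribed workflow.

```python
# Rough sketch of iterative prompt refinement ("prompt engineering"),
# assuming OpenAI's pre-1.0 Python chat interface. Model name and
# prompts are hypothetical examples.
import openai

openai.api_key = "YOUR_API_KEY"

messages = [{"role": "user",
             "content": "Summarize the main criticisms of social learning theory."}]
first = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
messages.append({"role": "assistant", "content": first.choices[0].message.content})

# A secondary prompt refines the first answer toward a narrower goal,
# the kind of follow-up Walden describes.
messages.append({"role": "user",
                 "content": "Now focus only on criticisms raised within "
                            "communication studies, in three bullet points."})
second = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(second.choices[0].message.content)
```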

Some instructors, like Professor of English Jonathan Cullick, are contemplating how the rise in these tools may affect occupations as a whole, raising a bigger question of how much AI can be used to perform traditional human functions.

“So it’s the challenge of a system or systems that can produce original work, so it raises a lot of concerns. I think about the future of different occupations or professions. You know, what is it? What does it mean when a computer can do more and more of what a human being has typically been needed to do,” Cullick said. 

But as of now, “AI is really bad at that last mile,” Downing said. What makes AI a possible boon to industries is the efficiency it brings to completing and enhancing tasks. Pairing the power of AI with a specialized human mind puts new frontiers within reach for those who embrace it.

“That’s fine that it does this, but my training in communication is very valuable … so I want students to feel empowered and know they didn’t just skirt through college,” Darnell said. “They use those tools to be able to help them, sure, but they also got training on how to think and have something to offer their future employers.”

But the gradual emergence of AI in everyday devices and environments isn’t the first time our society has encountered revolutionary technology that shifts how people perform jobs and access information.

“I think that we do have to remember that our concerns about this new technology are certainly not the first time we’ve been concerned. I can still remember back when the personal computer came into being, and I can still remember when I started graduate school, and I was working in a writing center and the writing center brought in this new invention. It was called a mouse, a computer mouse,” Cullick said.

Cullick further explained his hopes for the development of these tools, saying he hopes they will be utilized like other technology in the classroom. At the end of the two-part workshop, he filled out a survey asking for feedback on the session.

“At the end of that workshop I attended, I filled out a survey and it said, ‘Do you want more workshops for faculty to learn how to use this in the classroom?’ And I said, yes, that we need because the technology is always ahead of wherever human beings are. It’s just always a step above us. I’d like to know because if the technology exists, I want to bring it into my classroom the way I bring in other technologies,” Cullick said.

*Editor’s Note— This story was originally published in the February print edition and has since been updated online for accuracy.
