Iowa universities drill down on ChatGPT technology
Professors weigh pros and cons, set student expectations
The future is now when it comes to higher-order artificial intelligence — with tools like ChatGPT emerging as the latest technology to disrupt the labor, health care, social, and educational landscapes.
Acknowledging the reality that students, workers, employers, and corporations will use AI and chatbot technology — and, in many cases, already are — Iowa’s public universities are issuing guidance, forming committees, crafting assignments and churning out research on how to use it, how to benefit from it and how to guard against its limitations and potential harms.
“Iowa students are likely to engage with these tools, and it is clear that instructors will need to consider carefully how to adapt,” according to recent guidance issued by the University of Iowa’s Office of Teaching, Learning, and Technology.
“These are new technologies, and the future is unknown; the availability and reliability of these tools are developing.”
Of course, generative artificial intelligence — machines that create new content — isn’t new. Apple’s “Siri” or Amazon’s “Alexa” for years have been answering questions and taking orders, like playing a song or setting a timer or reading the weather on command.
What it is
But ChatGPT — released in November by research company OpenAI as a “language processing tool” capable of generating text and computer code — “is much, much more powerful,” said Patrick Fan, UI Tippie College of Business chair, professor in business analytics and faculty director of the Tippie Analytics Cooperative.
“Basically, it's a huge model that mimics the human being,” Fan said. “But it can have much, much higher cognitive capability than a human being.”
Anyone can use ChatGPT for free by entering a prompt — like asking it to summarize a historical topic or provide both sides of an argument — and it will, within seconds, churn out hundreds of words in well-written prose.
In just the few months since its release, tech giants like Microsoft, Google and Apple have started using it or advancing their own versions.
It’s only a matter of time before ChatGPT applications — GPT stands for Generative Pre-trained Transformer — become commonplace and widespread, Fan said.
“People are going to continue to use this,” he said. “And they’re going to use this for a while.”
Pros and cons
But fears have emerged around the explosion of AI — and ChatGPT specifically — such as the ease with which students can use it to take shortcuts on writing assignments or research; its inability to soften tone in discussing sensitive subjects; its potential for rogue behavior or dangerous suggestions; the odds it will spread misinformation; and its potential for replacing human workers.
But, Fan said, benefits abound — like its ability to help students get started on an assignment, to help businesses work more efficiently and to help companies improve customer service and wait times. And while artificial intelligence might eliminate some jobs, it will create new ones, too, he said.
“I do believe that these tools will be very, very fascinating,” Fan said. “They’re going to have a huge impact on all walks of life, and we have to learn how to live with it. So my attitude is we need to embrace it. We need to know how to use it. We need to leverage it. We can’t just ignore it, otherwise, you could fall behind.”
Lingyao “Ivy” Yuan, Iowa State University assistant professor of information systems and business analytics, has made artificial intelligence and “digital humans” central to her research over the last seven years.
And she recently published an article for Harvard Business Review spelling out the case for and against them — including that digital humans can work 24/7, don’t ask for a raise and can help with sharing sensitive information.
They also can cost a lot and raise ethical questions — which the University of Northern Iowa hints at in its recent response to ChatGPT.
“A group of campus representatives are assembling to attempt an institutional response to these challenges,” according to the UNI Center for Excellence in Teaching and Learning.
Pamela Gibbs Bourjaily, associate professor of instruction at the UI Tippie College of Business, said she wants her students to learn how to leverage the AI tools.
This semester, she deployed a ChatGPT assignment in her business communication and protocol class.
“I am very interested in ChatGPT, and I wanted to get on it now — in terms of getting a sense of students and how are they already using it. Are they already using it? And what are their experiences?” Bourjaily said.
“It is something that is going to be part of their business communication, their business writing, when they are in the workforce. And so teaching students how to produce effective business communication is going to have to include how to work with AI-generated language.”
Her assignment asks students to evaluate and revise a ChatGPT-generated response to the prompt, “How do employee monitoring systems help or worsen equity issues within the workplace among employees of different genders, races and abilities?”
When Bourjaily asked ChatGPT that question, it churned out a 249-word response that — while coherent and reasonably well-written — didn’t necessarily follow the business-writing structure students had been taught, including front-loading information in a skimmable document with effective claim statements.
The goal of the assignment was to teach students how to recognize artificially generated language, address its deficiencies, make revisions and customize it for specific audiences and expectations.
“Students can either elect to use ChatGPT … or they cannot use it at all and do their own writing, or they can do a mixture of both — a hybrid — it honestly doesn't matter,” Bourjaily said, reiterating the goal of producing effective business writing. “If they can get machine learning to produce effective communication, that's great.”
But not all classes are amenable to AI-generated assignment submissions or even assistance — and Iowa’s universities recently issued guidance for professors wanting to set student expectations at their course’s outset.
In a class syllabus — per an example given by the UI Office of Teaching, Learning and Technology — instructors wanting to ban the use of artificial intelligence could list it among violations of academic integrity or include the following statement:
“Since writing, analytical, and critical thinking skills are part of the learning outcomes of this course, all writing assignments should be prepared by the student. … Therefore, AI-generated submissions are not permitted and will be treated as plagiarism.”
Iowa State issued guidance, too, suggesting optional language for professors who are OK with it on a conditional basis.
“It's allowed with appropriate attribution,” according to one of ISU’s suggestions. “It's allowed in limited instances,” according to another. Or “AI can be used to prepare for assignments by brainstorming, but students must show how it helped them reach the result.”
“It is important to be open and set expectations with students ahead of time,” said Matt Carver, ISU Center for Excellence in Learning and Teaching enterprise instructional technology senior manager.
Acknowledging some universities have banned ChatGPT altogether, Carver said students who use it with instructor permission remain responsible for submitting accurate information. One of the app’s limitations, he said, is that its training data only goes through 2021 — making it unaware of current events.
Tools exist to detect AI-generated text — like those used to detect plagiarism — but many lag behind the updated ChatGPT technology.
“It's important to remember that these emerging technologies are likely to be an imperfect solution,” according to the UI guidance, warning against widespread use of AI-detection tools by quoting University of Mississippi instructor Marc Watkins.
“We should be proactive in our response and not approach our teaching out of panic and mistrust of our students,” Watkins said.
“What message would we send our students by using AI-powered detectors to curb their suspected use of an AI writing assistant, when future employers will likely want them to have a range of AI-related skills and competencies?”
Vanessa Miller covers higher education for The Gazette.
Comments: (319) 339-3158; email@example.com