
By NAYELI CARRILLO/Staff Writer
While keyboard clatter and students on their phones are nothing new at universities, unlike years past, AI in the form of large language models is now a fingertip's reach away.
OpenAI, an organization at the forefront of developing artificial general intelligence, released ChatGPT in November 2022. OpenAI was founded in 2015 as a nonprofit with the goal of building safe and beneficial AGI. Although the organization now runs a for-profit subsidiary, its work introduced the new technology and made it widely accessible.
At the start of the fall semester, students were not only given class syllabi but also AI use policy statements. Educators could choose a class AI use policy from three guideline options: one that encouraged AI use, one that permitted it on specific assignments and one that did not allow it.
Steve Myers, with UT Tyler’s Office of Digital Learning, wrote the option statements. He said educators could then further modify the policy for their own classes.
“A syllabus really is a contract between the student and their faculty member,” Myers said. “So, they can put anything they want in there. Somebody might have a policy about devices, whether you can use phones or not. Whereas another faculty may not. It was just handled as one of those other things that you can have a policy on, or you can’t. It’s up to you.”
Myers said because most faculty members were concerned about the plagiarism that might arise with AI, the university decided to give the guidelines a try.
“They (administrators) decided because of the nature of AI and because of the nature of how it’s being used, they would provide three options for faculty to go and select which option that works in their classroom,” Myers said.
Myers said his job was to help make it easier for professors to implement AI policies.
“One of the tasks the university has put on me as an instructional designer on campus is to make recommendations to faculty, to level up their assignments to not make them so easy to just dump into an LLM and then copy and paste the resulting stuff out as the assignment,” Myers said.
Students, like sophomore computer science major Alan Mondragon, noticed changes in assignments.
Mondragon said most assignments in his major can’t be entirely completed with AI because of their complexity, but he uses ChatGPT as a study tool.
LEARNING TOOL
“It’s like a learning tool that you can ask anything, and it’ll really help you out,” Mondragon said. “I was using it the other day for math and whatnot, and it’ll explain it to you.”
However, Mondragon said there is a fine line between using AI for help and abusing it by fully relying on it.
“I feel like it’s like math. You could cheat on it, but it can end up catching up to you at some point. Like if you cheated on your multiplication tables, later with higher levels of math, it’s harder for you to really just solve problems,” Mondragon said. “In a sense it goes hand in hand with coding. Like, I can use AI to help me solve certain situations or help me understand. But if I haven’t grasped a concept by myself, then it does end up catching up to you, which I feel like later in the future makes everything much harder.”
Luz Aparicio, a sophomore IT and finance major, also has mixed feelings about using AI in class but believes AI is something that must be embraced.
“They do state to not plagiarize and to not copy and paste. But in one of my classes my professor really encourages AI use because he knows that we’re going to be using it a lot further on in our careers,” Aparicio said.
The recommendations Myers gave professors to “level up assignments” included making them more detail-oriented, reflective and specific, and having students apply course concepts.
“For, say, a class like mine, that’s maybe more project based, that you’re creating pieces of media. Using AI in that space is not hurting, in my mind, the result because you’re not going to be able to just generate the complexity of work that I’m asking,” Myers said. “But then you go over here to a freshman, maybe composition class, where literally the point of the class is learning how to write. We probably shouldn’t be using AI so directly.”
Giselle Gaona, a UT Tyler nursing student, rarely uses ChatGPT because most of her coursework doesn’t involve writing. When it comes to studying, she said students use it as a tool.
Gaona, Aparicio and Mondragon all acknowledged that AI raises concerns but agreed that using it is inevitable.
“It’s really hard to not use it, to be honest, because even TikTok filters, that’s AI,” Aparicio said. “When you use Google, that’s AI itself now. So, it’s really inevitable to not use it. You got to embrace it and not be afraid.”
Myers said that with how quickly AI technology is progressing, it is difficult to create policy that keeps up with it. Instead, the university is continuing to use the guidelines that debuted in the fall while working to minimize faculty fears about AI.
UTILIZING AI
Myers and the Office of Digital Learning are developing workshops to get faculty members open to the idea of new AI technology and to using it as a teaching resource.
“We’ve been trying to minimize the fear in faculty by doing AI playgrounds over the summer. So, each week we would bring an AI tool, and not immediately think ‘how are we using this in the classroom,’ but just playing with it,” Myers said. “Putting prompts in there and having a little bit of fun to try to minimize the fear, because I think what’s happening is a lot of faculty, they just didn’t even want to bother with it. They wanted to put their head in the sand and hope it blows over, and then we can go back to normal. That’s obviously not going to happen.”
The Teaching Resource Hub, also in development, will serve as a blog where faculty can write about pedagogy, since attending events or workshops can be difficult for instructors with busy schedules.
“We’re trying to beef that up and to get some collaborators in there to make it a good resource for faculty moving forward to, yes, hopefully quell the fear, but also improve their teaching ability,” Myers said.
According to a university statement, UT Tyler is committed to exploring and using AI tools and to encouraging discussion of their ethical, societal and philosophical implications. With that in mind, a new senior-level course called Communication Processes and Artificial Intelligence is being developed.
The undergraduate section of the course would cover how AI was created, its history and applications, and how it affects human communication.
“Students will develop a foundational understanding of the inner workings of AI technologies and learn to employ AI technologies for various professional and practical purposes,” according to information about the upcoming course.
Myers said the language about AI use found in syllabi has received positive reactions not only at UT Tyler but at other higher-education institutions.
“Everybody’s just trying to figure out, ‘What do we do? How do we move forward?’” Myers said. “So that was a pretty well-respected move to give that faculty member authority still in their own classroom to use it how they see fit.”