AI has made some things in life, like solving math problems, easier. However, when a student uses AI only to do the work for them, the line between learning information and blatantly cheating becomes clearer.
Students are aware of this toxic relationship with AI. A Harvard report examining teen and young adult perspectives on generative AI found that 41% of students see it having both positive and negative impacts on their lives.
Since COVID-19, we have been seeing the toxic side of our relationship with AI. I will fully admit that during quarantine I used generative AI alongside Google to look for an answer or two. I believe this overreliance on technology is only getting worse as AI advances. Students who take online classes at Fresno City College now have far easier ways to cheat and find answers more quickly than they ever did during quarantine.
Some students are turning to AI for the answers to everything they do. Will AI be able to keep up with the complexities of certain human job fields? FCC student and architecture major Sam Griffin voiced his concerns when asked whether there should be regulations on the use of AI in certain fields of work.
“Very important jobs, where you have hundreds of lives on the line if the building doesn’t burn correctly because of a fire, for example, I wouldn’t want it to be in charge of that. Because at least humans can get it right. And maybe it’ll be better with AI? But until we know that, we don’t know that,” said Griffin.
Griffin also believes that the consulting side of the architecture industry could benefit from AI by streamlining certain processes.
For now, it seems that AI cannot comprehend the human element or work ethic prevalent throughout many industries. More generally, how can AI account for humans and their personal lived experiences?
FCC Professor of English Michelle Patton says that while AI can get the grammar right, it strips away human emotion and lived experience, specifically when it comes to writing.
“It’s missing all that personality, all that unique perspective that my students bring. So the grammar and all that stuff we can learn, but that other thing is irreplaceable. You have to bring yourself to the table,” said Patton.
Patton said there is an ongoing discussion among instructors on campus about the use of AI. Some are split on the issue, while others, like Patton, think AI could cheat students out of an education.
“I think one of the points of disagreement is none of us want students to be accused falsely of using AI, so we want to make sure we’re not doing that. But then, on the other hand, we don’t want to cheat our students of a real education, which would include writing and critical thinking on your own,” Patton said.
Another FCC instructor, Professor Michael Ragsdale, has an interesting way of teaching about both the negative and positive aspects of AI: he demonstrates the use of AI with his students in class.
“Instead of writing a five-page paper on this, have ChatGPT do that and then read and give some sort of analysis of ChatGPT’s work. We can’t run and be afraid of AI, we have to use it,” Ragsdale said.