
College Students, Professors Make Their Own Rules for AI. They don’t always agree

“It’s not fair to them,” Cryer said.

More than three years after ChatGPT launched, generative AI has become part of everyday life, yet professors and students are still trying to figure out how, or whether, they should use it, especially in the humanities.

A recent survey suggests that most students are already on board: According to a survey conducted last July by Inside Higher Ed and Generation Lab, about 85% of undergraduate students were using AI for coursework, including brainstorming ideas, drafting papers and studying for exams. About 19% of students also reported using AI to write full essays.

More than half of the students who use AI in their studies have mixed feelings about it, reporting that it sometimes helps them but also leads them to think less.

Aysa Tarana, a recent college graduate, was in her first year at the University of Minnesota Twin Cities when ChatGPT was released. She says she started using the chatbot for small tasks, such as suggesting research topics.

But Tarana says she eventually stopped using AI because it made her feel like “I was taking my thinking out, and that felt really weird.”

That’s exactly what Cryer is worried about.

After spending a sabbatical studying AI, he reached a conclusion: Cryer believes professors should use AI tools as little as possible in their teaching.

“One of the main purposes of these tools seems to be to prevent you from thinking too hard,” he says.

Cryer says he now spends more time persuading his students of the importance of putting in the work to become better writers. He says he explains to them that the purpose of their education is the process, not the product, because society doesn't need a lot of college essays. "What we need is for students to go through the process of writing research papers to become better thinkers, to be able to put together a strong argument, to be able to distinguish between a good source and a bad source," said Cryer.

And if students rely on AI to do their work for them, Cryer says, it could end up cheating them out of the education they signed up for.

A professor who sees the value of generative AI

In Charlotte, NC, Leslie Clement says she has come to view generative AI as a powerful collaborator that can improve student learning.

"We encourage [students] to use it because we know they're going to use it, but to use it the right way," said Clement, a professor of English, Spanish and African studies at Johnson C. Smith University, a historically Black university.

Clement says she allows students to use AI to create outlines for their papers, get feedback on ideas and compare different sources of information.

Clement also created a course called “African Diaspora and AI” that examines how AI affects people of African descent around the world, including the dangerous mining of cobalt, an important component of AI technology, in the Democratic Republic of Congo. The course also covers the future benefits of AI, as well as the contributions of Black researchers and scientists.

“We are looking at Afrofuturism, how students can use these tools to rethink their future,” said Clement.

She says her goal has always been to encourage critical, honest and inclusive thinking, and she wants her students to apply those skills to their use of AI tools.

“I want students not only to use the tools effectively but also to investigate them,” said Clement.

An AI learning companion

A few hours northeast of Clement, in Durham, NC, pre-med student Anjali Tatini has found what she considers a productive use for AI. Tatini is double majoring in global health and neuroscience and says AI tools have helped her better understand some of the more complex subjects she's been studying.

Take last semester, when Tatini, a 19-year-old sophomore at Duke University, says she was confused by some concepts in a biology class. She turned to Gemini, Google's AI chatbot, for help.

"I'd say, 'This is a concept — can you explain what it means?'" Tatini recalls. "And it would just answer me. And if the explanation was too complicated, I could ask it to simplify things, which helped a lot."

In some classes, such as chemistry, Tatini says she has used AI to create practice problems to help her prepare for exams; in a marketing class, she used it to brainstorm ideas; in a math class, she has used it to help write lines of code for data analysis.

It helps to have a tutor available whenever she needs one, Tatini says, because she can't always meet with her professors in person.

"I have jobs, I have other classes, I have clubs. I don't have time to go to all these office hours," she said. "So it's nice to have something that works on my schedule and can answer me the way a human being would."

Tatini draws the line at letting AI write for her. She says she will use these tools to help express and organize her ideas, but the actual writing is her own.

“When I put something out, I want it to be something that I’m proud to call mine. So I would never use AI to write something because it wouldn’t sound like me.”

"What you produce is like a fingerprint on the world"

Nearby, in Chapel Hill, Hannah Elder, a 21-year-old junior at the University of North Carolina, also takes pride in her writing assignments.

She says: "I'm a big believer in developing your own thoughts and being able to express them."

The junior is a pre-law student and takes a variety of courses, including public policy and philosophy classes. She says she uses generative AI to assess her work and evaluate it against course rubrics.

But Elder says she will never use it to write or generate ideas for her.

Learning how to form her own ideas and beliefs and communicate them through writing has been one of the most important parts of her college experience, Elder said. She worries that if students rely on AI to do that for them, they won't learn to think for themselves.

"I still use notebook paper [for] all my notes, because I strongly believe that what you write down and what you produce is like a fingerprint on the world. And I think in a way that's getting lost," Elder said.

Still, Elder doesn't think the solution is to block AI altogether.

"We cannot deny that it will be a part of it [the college experience]," she said.

She wants professors to integrate AI education into the curriculum so that students learn to see the line between beneficial and harmful uses.

She says: "If professors use it properly for academics, I think it won't be seen as a cheat code and more like, 'Here's the truth of this, here's how I can use it properly, here's how it helped me.'"
