November 7, 2024

AI-inspired illustration of a WSU Vancouver student created by VanCougar layout intern Kevin Lennon.

AI Challenges Academic Integrity at WSU Vancouver

This story was originally published in Vol. 33, Issue 8 (April 2023)

Artificial intelligence – not just in the form of ChatGPT – has raised concerns at Washington State University, posing an ethical quandary for faculty and professors: does AI have a role to play as a tool for students, or should it be banned from academic environments entirely as outright cheating? AI is the newest threat to academic integrity.

And it all starts with homework.

“There have been some cases where it’s like, is this AI or is this someone just going way above and beyond?” said Paul Bonamy, professor of computer science. “So, how we’re going to deal with that is definitely a thing we’re looking into. And I think a lot of it will be … figuring out obvious identifiers. … But also, I think some of it will just be structuring assignments that are not harder for students to do but that are harder for machines to do.”

This is something Bonamy is looking into, although he has yet to detect submissions of AI-generated text in his classes. While such submissions are likely to become more common as AI advances, suspected cases must be thoroughly investigated rather than automatically assumed to be plagiarism, as academic integrity complaints are very serious. Bonamy hopes to prevent such situations in the first place by adjusting his approach to coursework.

His goal is to “trick” the AI so that it will not recognize assignment prompts or produce quality answers, while keeping the difficulty the same for students. An example would be an assignment to implement a very standard data structure or search algorithm under a different name. The assignment would be straightforward for a student who understood the material, but the AI would not recognize the new name, preventing it from extensively assisting the student. Bonamy added that there are other possible methods to explore, such as adjusting how grading works.
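As a hypothetical illustration of the kind of renamed assignment Bonamy describes (the name “PlateHolder” and its methods are invented for this sketch, not taken from any actual coursework), a standard stack could be disguised under unfamiliar vocabulary:

```python
# Hypothetical sketch: a standard stack (last-in, first-out) presented
# under the invented name "PlateHolder" so the prompt does not match
# common textbook phrasing.

class PlateHolder:
    """Holds plates: plates can only be added to or taken from the top."""

    def __init__(self):
        self._plates = []

    def add_plate(self, plate):
        # Equivalent to a stack's push operation.
        self._plates.append(plate)

    def take_plate(self):
        # Equivalent to a stack's pop operation.
        if not self._plates:
            raise IndexError("no plates to take")
        return self._plates.pop()

    def count(self):
        return len(self._plates)


holder = PlateHolder()
holder.add_plate("red")
holder.add_plate("blue")
print(holder.take_plate())  # prints "blue": the most recent plate comes off first
```

A student who recognizes the last-in, first-out behavior can implement this easily, while a prompt written only in the invented vocabulary gives an AI model less familiar text to pattern-match against.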

“If your tests are super important or a super big component of your grade in your course, then if you can figure out how to cheat on a test, you get all those points for free,” he said.

ChatGPT in particular is an AI language model that generates text that can be mistaken for human writing. Since its release in November 2022, it has exploded in popularity among students of all ages and education levels for its ease of use in generating essays, answering general inquiries and performing basic research on a variety of topics. By January 2023 the platform had 100 million monthly active users, setting a record for the fastest-growing platform in the world, according to Reuters (using data gathered by Similarweb), beating the likes of TikTok, Facebook, YouTube and WhatsApp.

Like Bonamy, Computer Science Department Coordinator Scott Wallace has also not detected the direct use of AI by his students but said it is very likely to be happening in some courses and assignments. Wallace believes that this is just the latest evolution in a long list of methods for outsourcing thinking, and another iteration of something that has been a problem for a long time in the field of computer science.

“The internet provides a quick answer to many common questions — and while leveraging the internet can help clarify confusion on some issues and thus benefit learning — it can also undermine the work involved in developing critical thinking and the mental models needed to understand how things work in computer science or other fields,” said Wallace.

He added that there is no simple rule to ensure students benefit from the new technology without undermining their own learning. Faculty are holding ongoing discussions about how to evaluate students’ knowledge and skills, and how best to develop their critical thinking and understanding of the material.

“A simple policy of ‘don’t use the internet at all in this class’ is unlikely to lead to an optimal learning environment,” Wallace said. “At the end of the day, I think that ChatGPT’s availability, like the availability of services that promise to help out with homework for a fee, puts additional responsibility on students to determine what is appropriate, ethical and in the best interests of their learning … Here again, there’s no one size fits all solution; this has been, and will remain, a constantly shifting landscape where any approach needs refinement and revision.”

Understanding the nuances of AI and the tools built on it will be critical to fostering a comfortable, non-hostile learning environment for students. It is also important to ensure that students are not blatantly abusing these tools to cheat, as doing so hurts not only the students but also the reputation of the school and the value of its degrees.

Computer science isn’t the only academic field being affected by the evolution of technology. Math department professor Paul Krouss explained that these tools and their influence in class are only the next step beyond what is already being used in the classroom.

“There have actually been a lot of algebra systems and solvers out there, so we may have already seen something similar,” Krouss said. “For instance, there is an app called Photo Math where you can take a picture of the problem and then it can solve it. This can be helpful but also students can rely on it too much. So this is something we have already been thinking about in the math world.”

Krouss doesn’t feel that the rise of these smart technologies threatens the classroom, noting that many resources are already available in the math department, including tools even better than the free ones easily at hand. As for proofs, Krouss mentioned how fellow math department professor Rocío Sotomayor asked ChatGPT to do a proof that is often assigned in classes.

“ChatGPT doesn’t do a lot of logic,” said Krouss. “It actually missed a crucial part of the proof, though it did look well written. I think it’s because it pulls from multiple sources from the web, and it doesn’t know the logic. It actually would have done better if it took from one source. From what I know — it’s just looking at patterns and writing, trying to mimic them.”

In the English department, professor and Director of Composition and Writing Assessment Wendy Olson said she sees potential for the program to be used as a tool in her classrooms rather than something to be feared or banned.

“We did our own playing around with it to see what it generates, and what I’ve recognized is that yes, it can answer a basic question like a rhetorical assignment but when you ask it for more specific details, it’s still very surface level. It isn’t doing any critical thinking, just showing you a very basic answer,” said Olson. “It can build a little bit of a claim, it can connect some evidence, but the analysis is barely there and sometimes there is a gap between the claim and evidence it’s providing.”

WSU Vancouver professor Scott Wallace shares his thoughts on ChatGPT within the academic system and its impact on students. (Josalyn Ortiz/The VanCougar)

Olson also noted ChatGPT’s pitfalls when it comes to producing work for review: mimicking patterns and styles isn’t enough, revealing its inability to write naturally or avoid biases.

“It will mimic academic writing on a surface level,” she said. “You get the style and syntax, but sometimes the syntax is just too perfect so it doesn’t look like a person has written it because there’s no error. It is also not looking out for biases — in fact, it includes biases because of how it’s generated. It’s not always correct. So it might sound really nice, but the actual content is not going to work.”

With professors taking many different approaches to these unprecedented technologies in the student learning environment, clarity comes from central administration.

According to Vice Chancellor of Academic Affairs Renny Christopher, there will likely not be a systemwide policy implemented specifically for ChatGPT and AI technology beyond the current academic integrity standards of the school. Christopher believes in a more nuanced approach to the use of this technology, and that there is a distinction to be made between simply submitting AI-generated text and the use of AI as a tool to assist students.

“For me, it’s important to remember that ChatGPT is a tool, and tools can be used or misused,” said Christopher. “Any use of ChatGPT in which someone, student or faculty, asks ChatGPT to write an assignment or an article for them, and then turns it in claiming that it was their own work, would be committing plagiarism.”

Christopher compares the discourse surrounding ChatGPT to the initial suspicion around the use of Wikipedia. Christopher recalls how faculty banned students from using Wikipedia in any way. However, like Wikipedia, ChatGPT can be a handy tool, so long as students are not plagiarizing information from either source.

Forbidding Wikipedia would have been a hindrance for students and academics alike. Instead of being abused for papers, it has served as an invaluable tool for billions of people across the world to learn and gain basic knowledge about almost anyone or anything they can think of. Christopher added that this is not to say AI tools are perfect or can be relied on as much as Wikipedia for basic knowledge, as they can often give wrong information. There is also an ethical and moral dilemma for the developers of ChatGPT and similar technologies to consider, given concerns about the text it learns from the internet, as well as about the people maintaining and moderating the technology.

“People have asked it, for example, for a biography of a scientist who never actually existed, and ChatGPT provided an entire biography of this fictional person,” said Christopher. “There are many such examples that people have reported regarding ChatGPT telling lies or making things up. There’s also a much greater danger with ChatGPT, which is that it is an ‘open text’ algorithm, which means it gets its information from the web, and therefore it has access to all the racist, sexist horrors that are out there on the web. I recently learned that there isn’t any machine learning involved in guarding against this; there are actually low-paid workers in Kenya who are constantly monitoring ChatGPT.”

ChatGPT has many uses as a tool: it can assist in research and provide templates for a variety of writing assignments, from research papers and basic programming to creative writing, analytical papers and news columns. This very article could, in theory, have been written by ChatGPT, although it was not; in practice, the free research preview’s distinct writing style prevents it from producing similar articles.

It is important to understand that ChatGPT and similar tools have the potential to aid in the learning process and improve the efficiency of students and academics in their research, and equally just as important to understand the concerns regarding academic integrity and the credibility of the work being produced by students.

The faculty at WSU Vancouver have a difficult task ahead in evaluating AI models and their use in classes. Academic integrity standards must be maintained to preserve the institution’s credibility and the value of degrees, but this must be balanced with the inevitable use of ChatGPT and other AI technologies, which are certainly not going anywhere and are bound to grow smarter and more popular. Course-specific policies and guidelines around the technology will be necessary to ensure that students feel safe and comfortable taking courses without fear of being unfairly accused of academic dishonesty or plagiarism, while also making sure that they understand and engage with course material and develop critical thinking skills.


Sean Juego, a staff writer of The VanCougar, contributed to this report.
