Thursday, Dec. 26, 2024
The Observer


University classifies Grammarly as generative AI

In August, the Office of Academic Standards issued a policy update to the Undergraduate Academic Code of Honor that classified AI-powered editing tools, including Grammarly, as generative artificial intelligence.

According to the update, “This means that if your instructor prohibits the use of gen AI on an assignment or in a class, this prohibition includes the use of editing tools, unless explicitly stated otherwise.”

The clarification was first communicated to faculty on Aug. 22. However, according to the director of the Office of Academic Standards, Ardea Russo, no mass communication has been sent to students informing them of the policy.

“It needs to be communicated to students in a more official way,” she added.

Following the interview, Russo emailed The Observer to say that a message would be sent to students in coordination with the Division of Student Affairs.

The policy, according to Russo, was formulated after a spate of alleged honor code violations last finals season. Ten cases related to the use of Grammarly were handled in May 2024 alone, though they were all resolved as “educational outcomes,” meaning no disciplinary action was taken. However, students were cautioned to exercise care in future work and check with instructors before using such tools.

Russo said professors raised concerns when reviewing student writing that did not match the style of their earlier work. Editing tool use is often suspected when writing seems “bland,” “formulaic,” “robotic” or “devoid of personality,” Russo said.

“The student would insist that it was their work and that they didn't use AI, but then it would come up later in the conversation that they had run their paper through Grammarly or through a similar tool, and that's what was catching the professor's attention,” Russo said.

Russo strongly encourages students to ask instructors about specific use cases for AI-powered editing tools and recommends that professors specify which features are and are not acceptable in their courses.

Part of the complexity in formulating a policy for Grammarly and other AI-powered editing tools is the wide range of features they offer. In addition to advanced spelling and grammar checking, Grammarly now includes “full-paragraph writing,” “brainstorming” and “sentence rewriter” tools, according to its website.

University Writing Program professors Damian Zurro and Nathaniel Myers worry the policy update does not sufficiently capture this nuance. 

“I’m concerned that this policy is starting to paint a bright line that puts Grammarly on the wrong side of that line,” Zurro said. 

Myers and Zurro both raised concerns about the newer, more generative features Grammarly introduced as ChatGPT and similar tools became available, but they said Grammarly's more traditional features can have valuable use cases.

Specifically, Zurro said Grammarly can help inexperienced writers gain access to professional rhetorical contexts, such as formal research papers.

“I have a concern that students are going to be producing papers that faculty will now go back to spending a lot of time penalizing students for not conforming to certain grammar standards,” Zurro said. “Grammarly can mitigate that and allow the student to converse with their professor on a higher level of ideas and content of the paper.”

The University Writing Program, which comprises both the Writing Center and the Writing and Rhetoric courses, has provided input on Notre Dame’s approach to AI. Myers said the program has spoken to faculty and created resources for them, yet he wishes the program could do more. 

“It would help the University for us to have more outreach,” Myers said.

He specifically suggested adding a “writing across the curriculum” (WAC) faculty member whose sole focus would be to work with other programs and departments on writing assessment, assignment design and the integration of AI.

“We do not currently have [a WAC faculty member] … but I do genuinely think that is something we wish we could do more of,” Myers said.

Zurro also said there is more to be done.

“It feels like everything has changed, and nothing has changed,” Zurro said.

“Everything has changed in that now, since ChatGPT emerged, every single class has an AI policy that you can find in the syllabus,” Zurro said. “But nothing has changed in the sense that professors are largely, and there are some notable exceptions, but largely lumping [all AI tools] together … and essentially not doing a lot to change their assignments.”

Zurro said he hopes to see a move beyond blanket bans on AI use.

“It ends up just becoming this kind of prohibition that’s not nuanced, that doesn’t show any curiosity,” Zurro said.

However, Myers and Zurro did point to areas where they believed the policy accomplished important goals.

“I think the general intention of [the policy] is simply to protect students,” Myers said. 

Zurro pointed to the policy’s mention of proper attribution when using AI as a promising sign.

“My hope is that we get to a place where transparency and open citation of all resources just become the guiding principle,” Zurro said. 

Yet some professors say even Grammarly's more traditional features, available years before the recent boom in AI technology, go too far.

Patrick Griffin is the Madden-Hannebery Family Professor of History and teaches a university seminar called “The Irish in New York.” He worries that using Grammarly allows students to “go on autopilot when it comes to writing.”

“It’s very important, I think, that students struggle a little bit,” Griffin said. “If you take the struggle away, the student ultimately learns nothing.”

Griffin also said he does not want to constantly police students.

“Trust has to be at the heart of any kind of intellectual community,” Griffin said.

Gerard Powers, a professor of the practice at the Kroc Institute for International Peace Studies who teaches a university seminar called “Catholics, Conflict and Peace,” also takes a hard line against the use of generative AI in his classes, including AI-powered editing tools.

“If Grammarly is editing it for you, then that’s a misuse of AI. It’s plagiarism, period,” Powers said.

Powers also emphasized the importance of not relying on technology for writing skills.

“If you don’t learn the difference between good and poor grammar … then you’re not going to be a good writer. You just can’t. You’ve got to learn it yourself,” Powers said.

He also noted that AI-powered editing tools can make errors that students might not notice.

“You can’t trust the algorithms to get it right,” Powers said. “I just think there’s no substitute for doing work from start to finish. That includes copy editing.”