Ella Stapleton expected a premium education at Northeastern University—one that would justify the hefty ₹6.8 lakh ($8,000) she paid in tuition. What she didn’t anticipate was discovering her professor using ChatGPT to craft course content, even as students were discouraged from doing the same. What followed was a formal complaint, a digital paper trail, and a sharp debate about AI in academia.
According to The New York Times, the controversy began when Stapleton spotted several glaring red flags in the lecture materials: a suspicious “ChatGPT” citation tucked into the bibliography, numerous typos, and even bizarre AI-generated images where human figures had extra limbs. Her gut feeling screamed something was off. A quick message to a classmate confirmed the suspicion.
“Did you see the notes he put on Canvas? He made it with ChatGPT,” Stapleton texted. The stunned reply came instantly: “OMG Stop. What the hell?”
Professor vs. Policy
The professor in question, Rick Arrowood, later admitted to using a trio of AI tools—ChatGPT, the Perplexity AI search engine, and Gamma, an AI-based presentation maker—to prepare course materials. While not against any law, his use of AI raised questions about transparency and academic integrity, particularly since he had discouraged students from using similar tools on their own assignments.
“He's telling us not to use it, and then he's using it himself,” Stapleton said, calling the double standard unacceptable at a university of Northeastern’s standing.
The university’s AI policy is clear: any faculty member or student using AI-generated content must properly attribute its use, especially when it's part of a scholarly submission. The lack of such attribution, coupled with what Stapleton saw as subpar and automated instruction, led her to demand a full tuition refund.
Complaint Dismissed, Lesson Still Echoes
After rounds of meetings, Northeastern University rejected Stapleton’s refund request. Professor Arrowood expressed regret, admitting, “In hindsight… I wish I would have looked at it more closely. If my experience can be something people can learn from, then OK, that’s my happy spot.”
Still, the case has opened up a broader conversation: where should the line be drawn when it comes to educators using AI tools in the classroom?
The Ironic Twist of AI Adoption
ChatGPT, launched in late 2022, rapidly became a household name—especially among students who embraced it for everything from essays to study guides. Ironically, as universities raced to restrict or regulate student use of AI, educators have been slower to publicly navigate their own ethical boundaries.
This incident at Northeastern reflects a new dilemma in the digital age: if AI can empower students and educators alike, can it also redefine the very value of a college education? For Ella Stapleton, the answer was crystal clear—and cost exactly $8,000.