Key Takeaways
- A Northeastern University senior requested a tuition refund after discovering her professor used AI for lecture notes.
- The professor acknowledged using multiple AI tools and stressed the importance of transparency.
- The university ultimately denied the student’s refund request.
- The incident highlights a shift, with students now scrutinizing professors’ AI use, not just educators worrying about student AI use.
- Many universities are grappling with AI policies, often requiring disclosure and verification of AI-generated content.
A student at Northeastern University recently brought attention to the evolving role of artificial intelligence in higher education, but not in the way you might expect. Ella Stapleton, a senior, filed a formal complaint and asked for a tuition refund after she came to suspect that her business professor was using AI to generate course notes.
Stapleton’s suspicions grew when she noticed telltale signs in the lecture materials. These included an accidental “ChatGPT” citation, recurring typos often seen in machine-generated text, and even images with anatomical oddities like extra limbs, according to Yahoo News, which cited The New York Times. “He’s telling us not to use it, and then he’s using it himself,” Stapleton remarked in an interview with The New York Times.
She lodged a complaint with Northeastern’s business school over the professor’s undisclosed AI use and other concerns about his teaching, demanding a refund of more than $8,000 for the course. After a series of meetings, however, Northeastern University rejected her claim.
The professor, Rick Arrowood, confirmed to The New York Times that he had used several AI tools, including ChatGPT, Perplexity AI, and an AI presentation generator called Gamma. “In hindsight…I wish I would have looked at it more closely,” he admitted, adding that he believes professors should be thoughtful and transparent about how they integrate AI.
This situation flips the script on common AI concerns in academia. Initially, professors worried about students using AI to cheat. Now, students are increasingly voicing discontent when they believe educators are over-relying on AI, sometimes complaining on platforms like Rate My Professors.
Students argue that hefty tuition fees pay for human expertise and instruction, not for content generated by AI tools they could access for free themselves. This sentiment underscores a growing debate about the value and ethics of AI in the classroom.
Northeastern University’s own AI policy mandates that faculty and students “provide appropriate attribution when using an AI System to generate content” and “Regularly check the AI System’s output for accuracy and appropriateness.” Renata Nyul, Vice President for Communications at Northeastern, told Fortune that the university “embraces the use of artificial intelligence to enhance all aspects of its teaching, research, and operations” and provides resources for its appropriate use.
While many schools are still developing their AI guidelines, with some imposing restrictions or outright bans, the conversation around AI in education is clearly expanding, now encompassing how educators, not just students, use this powerful technology.