AI Overuse Alarms East Lansing Teachers, Prompts Policy Overhaul
Students at East Lansing High School are using artificial intelligence so much for assignments that teachers worry their classes aren’t learning the subjects, let alone how to write and think critically.
AI use has grown alarmingly, especially in English classes, teachers and students say. There, essays are often riddled with abnormalities such as overused words, excessive use of dashes or fabricated sources, which teachers have learned to catch.
Timothy Akers, an AP language and composition teacher at ELHS, said he’s seen an alarming increase in AI use this year. On one routine essay assignment, he found that about half the submissions were fully AI-generated.
AI platforms like ChatGPT can be accessed online for free. Students can paste a prompt into a text box and have an entire essay generated within seconds, then give the program further prompts to refine “their essay.”
Students can also paste math equations or science questions into AI platforms. While teachers can sometimes identify essay or short answer responses written by AI, it can be harder to tell if a student is using AI to cheat in subjects like math and science that often have more uniform solutions.
Akers worries that reliance on AI programs will negatively impact critical thinking in students, causing them to struggle academically in the future.
“It hinders their ability to engage deeply with topics,” Akers said. “Problem solving and analytical thinking go out the window.”
One of Akers’ students, Julia Valla, agrees. She said students who use AI “will suffer in the long term because they won’t know how to write or complete tasks on their own, which will hurt them in college and the workforce.”
For years East Lansing has embraced technology in classrooms. According to Christian Palasty, the district’s director of technology and media services, $84,000 will be put toward Chromebooks, a type of laptop, in the 2025-2026 school year.
But Chromebooks also give students access to vast amounts of information, more and more of which comes from AI.
Because AI use was so widespread on the essay assignment, Akers decided not to grade the essays, instead marking them only for completion, since he couldn’t tell how much the students had written themselves.
“I have kids who genuinely want to become good writers,” Akers said. “How can I justify giving them a grade worse than a kid who generated [their essay] on a computer? It’s not fair.”
Valla said she wrote her essays without AI and felt disappointed about not getting graded feedback for her work.
“I worked hard on the essays and I cared about how I did and ways to improve. The whole situation was frustrating,” Valla said.
Akers devised a method to identify students who were cheating with AI, using information from the high school’s information technology department and browser extensions. But while that produced evidence that AI was improperly used, the district had not yet developed consequences for students who cheated with it.
“We’ve been slow and the technology develops quickly,” Akers said. “I caught these kids dead to rights, and they didn’t know what penalties to apply.”
To curb such misuse of AI, the school began drafting a plan for the 2025-26 school year. District Media Specialist Kathy Kowalski proposed an AI workgroup for teachers’ professional development sessions, which was approved by administration.
“Our main focus was to come up with some commonalities where we — as in all faculty and students — could talk about AI and be on the same page,” Kowalski said.
The workgroup acknowledged that not all AI use is harmful to students, and that integrating AI into some assignments can even be helpful, as knowing how to use the new technology can be important in the professional world.
As a result, members of the group, such as library paraprofessional and Michigan Virtual coordinator Katelyn Smith, proposed alterations to the school’s Acceptable Use Policy and syllabus language to address AI. The group also devised a “key” teachers can use to determine how much AI use is acceptable. These proposed changes are still awaiting approval.
“That system was presented to the staff at large for feedback and is undergoing revision at an administrative level,” Smith said. “Then the teachers in the group started working on adapting current lesson plans to the new system to demonstrate how it works.”
Part of the new plan involves school use of an AI program, MagicSchoolAI. English teacher Kristin Day said her classes used the tool to help brainstorm ideas and edit assignments.
“I had it helping refine arguments at different steps along the way, but made sure that it wasn’t doing work for my students [entirely],” she said.
The results were distinctly positive, according to Day.
“The students seemed to find it incredibly helpful,” Day said. “The papers I got were much cleaner and more well written. [Due to this], I got papers that were far more obviously written by the students, while some of the ones I got earlier in the year were quite obviously not.”
The goal is for the proposed system to be in place by fall. Though it was designed with the high school in mind, the hope is that its resources will inspire other schools in the district to adopt similar policies to curb AI cheating.
“We’re not trying to deprive students of what is going to be an essential tool for them, but we’re trying to teach them to use it responsibly,” Day said. “It’s a balancing act which we’re just trying to navigate.”
