AI and Education
What is the appropriate role of AI in the classroom? How do we prevent students from becoming ChatGPT-regurgitating zombies?
22 May 2025
A new topic has been frequenting my conversations and regularly-scheduled podcasts: what are teachers supposed to do about AI? How do we prevent students from becoming zombies, regurgitating the outputs of ChatGPT, without actually learning anything?
"AI" is the wrong word.
Before I divulge my thoughts, I want to set some things straight. As a math/computer science person, perhaps I'm pickier about this than I need to be, but fostering intelligent conversation about this nuanced topic requires defining what we're even talking about. AI is not new, at least not within the last couple of years. Artificial intelligence is, broadly, any computer system capable of doing work that would otherwise require human intelligence, and the field has been around for some 75 years, beginning with the Turing Test and early “feedback loops” in the 1950s. Why is AI a hot topic now, then? While “AI” is not new, what is new is generative AI and massive LLMs (large language models). Today, people use the term “AI” mostly when referring to these massive generative models like ChatGPT, Claude, and Gemini. For the first time, AI models like ChatGPT are catered directly to the general public, so, for the first time, the term “AI” is in everyday speech. Like the internet, their target audience is… everyone. So, throughout this blog post, if I use the term “AI”, I am only adopting the colloquial habit of referring to these LLMs by their blanket term, while also acknowledging that these models are only a subset of what constitutes artificial intelligence. Just had to make that clear.
The two extremes and why they don't work
Now let's talk. The people are worried. What are we supposed to do about AI in schools? Educators have taken wildly different approaches. On one extreme, Tyler Cowen, economist and educator at George Mason University, requires all of his students to use AI on their assignments, emphasizing high expectations for the end result and taking weight off the process. This approach works much better in the workplace, where the output is more valuable than the process. In the classroom, however, it is not a controversial opinion that the process is valuable. Unless Cowen has rewritten his entire curriculum in some magical way that ensures learning alongside the use of AI, it's likely that his students are not learning as much.
On the other extreme is the more common approach of completely banning AI in classrooms. While I do believe this is a fear-based response, I understand where it comes from. Curriculum today is not equipped to deal with the sudden accessibility of all the world's knowledge, freely and immediately available. In my undergrad and master's programs, most of my professors banned the use of AI in completing assignments entirely, stitching addendums onto their 20-year-old plagiarism guidelines. However, this creates an incentive for the dishonest and a punishment for the honest. Cheating has always been a problem, but AI has inflated it greatly, to a point where educators don't know how to respond. Like Prohibition in 1920s America, prohibiting a behavior does not necessarily eliminate it. In the setting of a classroom, this prohibition will, in effect, punish the honest. In addition, a complete ban of AI is not a sustainable approach. Imagine if a complete ban of the internet in classrooms had been ordained during the early 90s. It would only have delayed education's catching up with the progression of technology. So, while I understand the avoidant response of educators to AI, I would discourage this extreme.
Then what is the right balance? How do we ensure that students are learning? My mom, a math professor at the community college near our house, teaches calculus and linear algebra: entry-level college math courses that attract a wide variety of students. She has noted a decline in exam scores over the past few years, which she attributes to an over-reliance on AI in completing assignments. What do we do to prevent this?
We're right between the past and the future
I've thought a lot about this issue. Like most things, I believe the solution to AI and schools is somewhere in between the two extremes of mandating and prohibiting AI use in learning. Moderation, baby! I don't have concrete answers, and I can't prescribe a fix-all answer. What I do know is we are in a gray area.
The Past. Before these huge LLMs became publicly and freely available, academic dishonesty was a smaller battle, but a battle nonetheless. Educators fought the use of the internet in assignments, but I would argue that only deeply uninspired assignments fell victim to Googled answers.
The Future. I predict we will see a similar adaptation. Curriculum and assignments will adapt to the rise of generative AI, and only uninspired assignments will fall victim to chatbot-generated answers. We will learn to value creativity and the learning process over exactness and final products. I don't know exactly what education will look like in 20 years, but I am confident in the innovation and inspiration of our educators to design solutions that foster learning without a fear-driven rejection of AI.
The Present. So today, we are in quite an interesting spot. These generative AI chatbots have become available so quickly that the curriculum has not caught up. Because of this, we are in a sort of technological Purge as the ethics and moral implications of AI utilization are still being ironed out. These next few years will be gray, but we cannot afford to simply throw out the education of this decade's students. Like I said, I don't have all the answers, but let's discuss some of the things that have been rolling around in my head:
Where do students use AI?
Students use AI to complete homework assignments. These may be essays, worksheets, problem sets, etc. Great. Now we can narrow our focus to this subset of education. This should also ease some anxieties: AI is not a threat to all of education. Lectures, class activities, and exams have remained relatively untouched, at least from the student-participation end. We just need to home in on how to create a healthy discussion about AI on homework assignments.
Why would a student use AI on these assignments?
Simply put, I've used AI on assignments that I didn't feel were necessary to complete on my own, assignments I didn't feel were necessary to my education or my learning. This happens for one of two reasons:
- The assignment truly isn't necessary. Educators, eliminate those assignments immediately. Easy peasy.
- The assignment is necessary, but the student doesn't understand why. Educators, explain why the assignment is important, even if it's as simple as: “you'll be tested on this.” Spend more time being intentional with assignments. The rise of AI is giving us a good opportunity to re-evaluate curriculum, especially the approach we take to teach concepts.
Students may also use AI on assignments they don't know how to do. This creates a cycle of hurt for the student, one that can go undiagnosed for a long time.
What do we do about it?
Now that we know where the root of the problem lies, we can prescribe a solution more easily. Students use AI on assignments when they do not grasp the assignment's importance. Here are some things I would do if I were an educator in 2025.
- Learn how to look out for AI-generated solutions, and open an honest discussion about what use is appropriate. If a student uses AI in a setting where they were instructed not to, this is a form of cheating and should be treated as such. However, AI is not inherently bad. In an art history class, I used AI to check the grammar of a paper I wrote. My professor noticed I had used AI and sent an email asking how I had used it. I explained I had used it to check grammar, find synonyms, and do other editing. I did not feel “caught” because of the open and honest discussion this professor had fostered. The important thing is that I was able to write a better essay while still meeting all the learning goals of the assignment. In my opinion, that is the optimal union of AI and learning. In another class, a student asked what AI use was appropriate. The professor turned around and asked us the same question. With this discussion, he forced us to evaluate what parts of our education were important to us. He forced us to be intentional with our use of AI. He forced us to think ethically about the intersection of AI and education.
- Create assignments that focus on the process of learning, not just the end result. Because my background is in math, I think of this in the context of the math results. Create assignments where students have to step through the problem and show their work.
- Discuss the importance of each assignment. “This will be on the exam.”, “This will be necessary to understand as it is foundational for the next unit.”, “This is a core part of the class and is necessary to pass.”
- Make in-person time more interactive. Worst-case scenario: every student uses AI on their homework and learns nothing from out-of-classroom assignments. This is obviously incredibly unlikely, but it would motivate an educator to optimize their in-person class time. Instead of using class time to lecture to the void for an hour, create activities that get students actively engaged in their learning. Get students out of their seats. Foster class discussions. Create hands-on activities. Active learning. With the use of AI, homework may become more passive, so make class time more active. We discussed earlier that students turn to AI when they don't know how to do an assignment. When a lecture is dry, a student may tune out because they know they can just use AI on the assignment later. Create lectures that are difficult to tune out. The students may hate it at first, but they will be grateful before the first midterm.
While thinking of this problem, I'm reminded of a poem by Joseph Fasano:
For a Student Who Used AI to Write a Paper
Now I let it fall back
in the grasses.
I hear you. I know
this life is hard now.
I know your days are precious
on this earth.
But what are you trying
to be free of?
The living? The miraculous
task of it?
Love is for the ones who love the work.
A philosophical discussion
With the answers to nearly every question at our fingertips, we, as students, are faced with a new opportunity: the opportunity to be more intentional than we've ever had to be with our learning. Let's get philosophical, shall we? Why do we even care about learning? If the answer to anything you can learn in school is freely available, why prioritize learning things yourself anyway?
BYU's motto is “enter to learn, go forth to serve.” We do not learn things in school to regurgitate them in the workplace. We learn things in school to create solutions, solve problems, and connect dots in our future endeavors, whether that be in the workplace or in the home. Our role as humans on this earth is not, and should not be, robotic. We learn so we can teach the next generation more than what the previous generation could teach us.
AI is not a deity. It is not supernatural. It is technology created by humans. It is not the pot of gold at the end of the rainbow of human innovation, providing the answers to the rest of humanity's problems. It is a marker of human ingenuity, a milestone (a big milestone, yes) in a timeline of human invention that extends far past 2025. My favorite quote of all time, by David Deutsch:
Base metals can be transmuted into gold by stars, and by intelligent beings who understand the processes that power stars, but by nothing else in the universe.
AI is not a star we discovered, it is the gold we created. We cannot revere it as a replacement for human creativity when it is itself a product of human creativity. It is an incredible stride in technology and a tool to do research and increase efficiency in mundane tasks, but it is not the grand finale of human invention. There is much more coming. It is simply our nature. We produce, optimize, expedite, and create.
AI was created by intelligent humans. The learning that led to the creation of these incredible models began in a classroom. Intentional education, now more than ever, is necessary to continue this momentum. Ethical discussions, now more than ever, are necessary to continue this momentum. Re-evaluating the purpose of education and learning, now more than ever, is necessary to continue this momentum.