Four things an educational psychologist wants you to know about AI in the classroom

By David Williamson Shaffer


It’s back-to-school time, and as the school year begins, some parents and educators have been wondering about how to handle artificial intelligence in the classroom.

As a former classroom teacher, a parent, and someone who studies technology and learning at the University of Wisconsin–Madison’s School of Education, I’d like to share a few suggestions — and some assurances — about what to expect from AI in the near future.

1. It’s not as difficult as you might think to outfox AI.

Yes, AI can produce passable, seemingly knowledgeable written content on a lot of topics, but it also has clear limitations. If you know what those limitations are, you can create assignments on which AI will either not be able to perform well or not be able to perform at all.

This is because the new generation of large language models like ChatGPT don’t actually understand what they’re saying. They simply use patterns learned from vast amounts of text scraped from the internet to make their best guess at which words are most likely to look like an answer to your question — whether the words are right or not.

Large language models have no critical thinking skills, no “real world” context, and no personal experiences.
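For readers who want a concrete picture of that word-guessing idea, here is a toy sketch in Python. This is not how ChatGPT works internally (real models are vastly more sophisticated), but it shows the same underlying principle: predicting the next word purely from which words tended to follow which in the training text, with no understanding of meaning.

```python
from collections import Counter, defaultdict

# A tiny "training corpus." A real model trains on billions of words.
training_text = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
).split()

# Count which word follows which word in the training text.
next_word_counts = defaultdict(Counter)
for current, following in zip(training_text, training_text[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Guess the next word: whichever word most often followed `word`."""
    return next_word_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — the most frequent follower of "the"
print(predict_next("sat"))  # "on"
```

The model here will cheerfully produce fluent-looking word sequences, but it has no idea what a cat or a mat is — which is exactly the limitation teachers can design assignments around.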

So, if teachers give fewer assignments that focus on things AI can do — like repeating definitions or summarizing well-known texts — and more assignments that require things AI can’t do — like thinking critically, reflecting on personal history or experience, or writing about things that are not well-known — students will have to think for themselves and reserve AI for things like summarizing information or correcting bad prose.

2. We can use AI for good — in the world and in classrooms.

Shifting lessons away from what AI can do has an additional benefit: The lessons themselves can become more thoughtful and engaging.

Recently, a professor of world religions at Northern Michigan University realized ChatGPT can write passable essays about the morality of burqa bans. Rather than ban ChatGPT or make students write about something else, the professor asked students to think critically about how the chatbot responds to questions about religion and ethics.

In other words, the current generation of AI tools makes it possible for students to do less busywork and focus more on the things that will matter in the world AI is shaping. Large language models can provide something to critique, solving the problem of staring at a blank page. AI can help students answer questions that a teacher doesn’t have time to address (although it’s best to double-check the answer!). It can help sharpen students’ prose. It can help them find resources.

Innovators around the world have already harnessed artificial intelligence to do great things. One group, AI for Good, is using the latest technology to predict missile attacks on Ukrainian homes and create poems about self-determination for Afghan women in Pashto, Farsi, and Uzbek. Teen entrepreneur Christine Zhao uses ChatGPT in an app she created to help people with alexithymia (difficulty identifying and describing one’s own emotions) develop emotional awareness and build interpersonal relationships.

3. It’s actually important that students learn to use AI well.

The American philosopher and educator Mortimer Adler said, “Deep thinking is not just about finding answers; it’s about asking the right questions.”

It turns out that getting a large language model like ChatGPT to say something sensible about a complex topic is actually quite difficult. The new field of “prompt engineering” pays people six-figure salaries to create the requests that will get good responses from AI tools.

I’m not suggesting that we develop whole classes about how to get useful output from ChatGPT. In the early days of the internet, there was concern that people needed to learn how to “write good search queries.” That didn’t turn out to be such a big problem.

But learning how to work with AI in ways that are effective and ethical will be an important life skill in the decades to come.

As educators (and parents and students) we should ask ourselves: What are professionals — and other people in the “real world” — doing with the latest AI tools? Are they using the tools productively and responsibly? And what do we, as educators, need to empower students to use AI to solve the complex social, economic, scientific, moral, and environmental issues they will face in the coming decades?

4. Try it before you buy it.

In my time as a teacher, education researcher, and professor, there have been other innovations that people wrongly thought would turn students into mindless zombies.

When I started teaching, my peers said students needed to learn to do arithmetic quickly because they “won’t always have a calculator in their pocket” or they might need to “make change for a customer if the cash register isn’t working.”

Well, it turned out that we all have calculators on our cell phones now, no one uses cash anymore, and if the register goes offline, they close the store.

The right question is not, “How can we stop these new tools from ruining children’s education?” Rather, we should ask, “How can we use these new tools to make education better?”

In the end, it will be thoughtful educators and parents who figure out the best ways to do that. But you can’t use a technology for good if you don’t know how it works.

So, if you haven’t played with ChatGPT, or Bing, or Perplexity, or Bard, or LLaMA, or any of the growing chorus of strangely named AI tools, try one and see what you can do with it.

Try asking it to write a lesson plan. You might be pleasantly surprised both by how helpful it can be and by the limitations of what it produces. Try giving AI one of your assignments and see what comes back. What grade would you give the AI’s response — and how much work would it take to help it earn an A?

In the end, to use AI well in the classroom, we have to start by using it badly outside the classroom.

As a profession, we’ve done that with other innovations. Now it’s time to do it again with the latest new technology.

David Williamson Shaffer is the Sears Bascom Professor of Learning Analytics and the Vilas Distinguished Achievement Professor of Learning Sciences at the University of Wisconsin–Madison. This article originally appeared at

Interested in learning more? Join David Williamson Shaffer for a webinar on November 8, 2023.

Modeling Learning in the Age of ChatGPT

in partnership with the Society for Learning Analytics Research

Wednesday, November 8th at 12pm CST

With David Williamson Shaffer, PhD

ChatGPT is the newest and most well-known AI tool that can whip up an essay, a poem, a bit of advertising copy — and a steady boil of hype and worry about what this will mean for education. This talk looks at what ChatGPT and AI models are really doing, what that means for the future of education — and how we can model, study, and assess learning in the world that ChatGPT is helping us create. Join us for a Feature Webinar, in partnership with the Society for Learning Analytics Research (SoLAR), as David Williamson Shaffer discusses the implications of ChatGPT and its potential uses in education.

