AI is No Fun. That's Fantastic.
Two examples of AI use in a PD session show where it fits in education, and where it doesn't.
When a teacher sneaks ChatGPT into a professional development activity, you know things are changing. When she laughs at the realization that it can only get her so far, you know some things are not. It’s a push and pull with AI in education.
Last week, I co-facilitated a professional development session with 75 people in the room. The session was not about AI but highlighted the crux of the issues with integrating AI into schools. AI was used twice in the day (as far as I know), once by a participant surreptitiously and once by us strategically. Both instances demonstrated the opportunities and drawbacks of AI in education. More importantly, they demonstrated why schools are transformational places.
The Hollow Lines of AI
One of the activities required individual work that then led to small-group collaboration. Each member of the group had to interpret a different dataset related to education, ranging from reading scores to student engagement trends. They each then had to share with the group how they interpreted their respective data (Here’s what I think this says). Finally, each small group had to collaborate to summarize everyone’s interpretation into one newspaper headline to share with the others (Together, our group thinks this shows…). Objective data and subjective interpretation. Individual work and community-building.
As I walked around the room, I caught eyes with an educator who had that look—the I’m definitely not doing something I shouldn’t be doing so I’ll smile and hope whatever you see me doing is okay since it’s totally okay and we’re all adults here, right?
Interesting.
Before I could even figure out what she was silently asking permission to do, she sheepishly showed me ChatGPT on her phone and said she used it to interpret her data because data analysis is hard for her. Totally fair! It’s hard for this former English teacher as well. I hadn’t even figured out how I felt about the unexpected visit from ChatGPT when she quickly followed up with the best part of it all:
She still needed to work to understand the data, herself, because in a few minutes she had to share her interpretation with the group, and she A) didn’t know if ChatGPT was right and B) needed to be able to speak to it confidently regardless.
On the road to understanding, she took a U-turn back to her own interpretation, limited as she felt it was, because she didn’t want to lead the group astray. More than that, she wanted to participate fully in the exercise, and reciting an AI output felt hollow. Leaning on her own interpretation, unsure as she was about it, felt more authentic than reciting lines from the ChatGPT script. She understood the deeper purpose of the community-building activity.
I’m pretty sure I clapped at that point. At least in my heart.

The exercise required her to use a skill she wasn’t confident in, and she needed help. So, she asked ChatGPT. The problem was that it could only take her so far. The rest of the journey required leaning into discomfort, trusting herself as much as she could, and working with other people. Those are entirely human experiences that come with entirely human feelings, some of which are fun and some of which are not. All of them, though, are authentic and make for meaningful experiences.
The goal of the activity was not data; it was collaboration. Just as important, the activity was a fun exercise meant to create a sense of community.
Though ChatGPT can type out an answer, it doesn’t know how to have fun or build connections with other people. And, schools are people places. There’s a time and place for quick synthesis of information, but teaching and learning is about relationships. The real work is in teaching students what those feelings feel like—why trusting your own insecure self makes for a more meaningful life than reciting lines from an AI output.
A Shortcut to Group Work
Later in the day, a separate activity required everyone in the room to share the three skills or mindsets they saw as necessary for students to succeed in and out of the classroom (e.g., empathy, collaboration, critical thinking). We were aiming to land on six before the day was out, and with 75 people in the room, that is a tall order.
At least, it used to be.
After participants submitted their answers digitally, we took a five-minute break (The actual exercise was more collaborative and involved, but we don’t need to recreate everything in this Substack, ya know?). In those five minutes, we had Claude identify the top 10 responses, prompting it to group them according to common themes or overlapping concepts. A quick double-check showed that AI did a good job doing what we wanted it to do. AI doesn’t know what fun is, but it does excel at objective sorting and listing.
The list of 10 was not worded exactly right, but in less than a minute, AI was able to do what would have taken the three of us at least half an hour and many frustrating mistakes. In a professional development session, that is extremely valuable time. Repurposing that time into more meaningful work is a big win.

When they came back five minutes later, the group ranked their favorite six of the ten skills. They then spent a good chunk of time again in small groups delving more deeply into what those six skills actually meant to each person, how they manifested in and out of the classroom and why they were important for students to develop. That was the real marrow of the day—different people with different perspectives discussing and productively disagreeing on what education looks like. The room was filled with laughter, with second thoughts, with realizations, with curiosities, with connection, and it was all in the service of improving education for students.
Educators were grappling with what “problem solver” means in and out of the classroom, and they debated what made that different from “critical thinker.” It was conversation. It was discourse. It was fun. It was frustrating. Most importantly, it was meaningful work.
AI enabled us to get to the heart of the day more quickly. It catalyzed our ability to get a group of 75 people working together to discuss, disagree, and explore what skills and mindsets are important to the community and what that looks like in and out of the classroom.
We used AI to help, but only insofar as it got us to a meaningful exercise more quickly. Everything else—which is to say most of the day—was about people. It was about a community doing the hard work of collaboration in order to move the needle of education. Buying back that half an hour was invaluable in that respect. More importantly, it didn’t come at the cost of building connection, but rather, it enabled us to spend more time on it.
The Best Part
Education isn’t quick and easy prompting and sorting. Education is collaboration. It’s building relationships and understanding, and it’s doing that individually and collectively. Schools are places where students learn how to build connections with ideas, within themselves, and with other people. That is damn hard. They are places where adults spend countless hours worrying about making those connections stronger, and they are places filled with different opinions about how to do so. That is also damn hard.
Schools are complex places filled with complex people working to help students become more prepared for a complex world. The primary way they do that is through relationships. AI can help in the process, but it can only get you so far. The rest of the journey requires you and the people around you, imperfect as we all are.
That’s the best part.