Recently, I was talking to an EdTech founder about the viability of chatbots in K12 schools, and it made me think of a road trip I did with a good friend of mine ten years ago. Naturally.
On the way from California to Colorado, we were approaching our campsite between Zion and Bryce Canyon in Utah when the rain started falling. We needed to set up our tent before it really started coming down, so we moved quickly, motivated as much by beating the weather as by getting something to eat. Tent up. Rainfly secured. Dinner time.
Pooling puddles woke us up in the middle of the night. Unfortunately, they were inside the tent rather than outside. Big bummer. We had set up the rainfly inside out. And by “we” I mean me. We fixed the problem (actual “we”), quarantined the wet gear as best we could, and I stewed in my annoyance with myself for the rest of the night, since there was not much sleep to be had at that point.
A decade later, it’s one of the stories we tell most often. We don’t talk about the gorgeous Angels Landing hike. We don’t talk about the alien landscape of Bryce Canyon. We talk about that time I didn’t know how tents work. It’s pretty great.
Education is inherently Type II Fun: whatever we teach students here and now is meant to be applied later. Third grade math has a direct connection to middle school algebra. Writing a cogent argument in sophomore year requires an understanding of sentence structure and grammar before evidence and commentary. Even the need to socialize that is driving phone bans has a bigger purpose. School as a whole is built to prepare students for their lives after they leave. It’s more about the after than it is about the now. Education is the camping, but learning is the story.
So, when we were chatting about chatbots, I couldn’t help but think about the disconnect between real-life pedagogy and EdTech expectations (EdTech-tations?) of efficiency and implementation. Schools are notoriously slow to change, sure. Some of that is a challenge that needs addressing, but some of that is purposeful. It’s important to give students experiences that require time, have no clear answer, and force synapses to build almost-connections. Productive struggle is as integral to learning concepts as it is to learning life. Failing at bad-weather tenting is one of the most important ways to make sure it never happens again (it hasn’t, please know).
Of course, not all struggle is productive. Not all of education’s slowness is worth keeping. That is where AI is bringing some of its greatest value to the education road trip: it is finally forcing schools to question content and pedagogy instead of saving that work for next year. Questioning what we teach and how we teach it is integral to moving education forward. Education has long been riddled with busywork for students, teachers, and administrators. As AI catalyzes a faster educational evolution than has been seen before, some of that work is being replaced by more purposeful, efficient learning experiences. Nonetheless, efficiency and productivity need to be balanced by slow, methodical learning.

In K12 education, we play the long game with intrinsic motivation. Although we talk to students about embracing mistakes as opportunities, about seeing the long-term value in that assessment gone awry, most K12 educators are realists. They know that grades matter to students. They know that mistakes land as tight-lipped embarrassment more often than as embraced opportunities for growth. Still, they walk students through those life lessons not for the here-and-now but for sometime in the future, likely when the students are long gone. They trust that, somewhere down the line, maybe in another class or in a work meeting years later, that student will understand some aspect of what they heard, what they experienced.
So when it comes to chatbots, I can just imagine answer-focused early adolescents vigorously typing “What’s the answer?” over and over until that gets boring. Students don’t naturally perform the skill of patient learning. They need adults around them, adults whom they know and trust (like is a different story), to guide them through the process. For the majority of their educational career, K12 students are concrete thinkers. Only once they have developed enough content skill, patience, and the ability to reflect does conversing with an AI tool make sense.
Of course, there is nuance. SchoolAI has the ability to incorporate guardrails into the student-robot conversation that keep students within the campsite boundary of productive interaction (check out Holly Clark’s classroom example in Edutopia). Playlab is also doing good work in helping schools create tools that meaningfully work for them and their context. Getting to that point, though, requires time, energy, and bandwidth, all of which are in short supply in schools.
I do see potential value in a range of AI tools that students interface with. I see potential value in tools that can catalyze third grade math skills that pay off in seventh grade down the line. I see value in tools that can do the same for the grammar and sentence structure behind a cogent essay. The real value in those tools, though, will never be in replacing the struggle. It will be in getting to the struggle more quickly.
This article captures an essential truth about K12 education: we play the long game when it comes to fostering intrinsic motivation. Self-Determination Theory (SDT) explains the continuum of motivation, from external rewards like grades to the intrinsic satisfaction of learning for its own sake. Most K12 students, as concrete thinkers, are still in the early stages of this continuum. They rely on trusted adults to help them navigate the messy, uncomfortable process of learning from mistakes. AI tools, while valuable, must complement this developmental journey by supporting—not bypassing—the struggle. As you pointed out, the true potential of AI is in accelerating students to the point of productive struggle, where growth and understanding take root.
The idea of guardrails for AI in schools also strikes an important chord. For AI to serve as a meaningful tool, it must account for the diverse needs of students and their varying stages of motivation. Tools that focus on targeted skill-building, like math fluency or grammar, can have an immense impact when they align with the broader goals of education: autonomy, competence, and relatedness. However, as you noted, implementing these tools effectively requires time, energy, and bandwidth—resources often in short supply. Thoughtful integration and teacher involvement remain crucial to ensuring these tools enhance learning without eroding the human connection that underpins trust, curiosity, and long-term engagement.
Am I on the right track?
Instantly reminded me of my first introduction to the Fun Scale. On the REI blog, of all places:
https://www.rei.com/blog/climb/fun-scale
Most of my formative experiences have been of the Type II variety, including in education.
Even a few Type IIIs have done some heavy lifting, now that I think about it…