Building My Kids' First AI Companion

I built Curio—a curiosity companion for kids aged 7 to 9. A few weeks later, here's what I learned about what works, what doesn't, and why chatbots probably aren't the answer.

By Mike Overell · Field Note

Originally published on Substack.

We have an iPad Mini in our kitchen with three apps on it: FaceTime, Sonos, and Weather.

Sometime last year I added ChatGPT, and started encouraging my kids to use it when questions would come up. What are bones made of? What’s the deadliest snake? The usual curious kid stuff. I wanted to introduce them to LLMs, and see how interesting it was for them. But it never stuck. The interactions were transactional, boring. Ask question, get answer, move on.

I wondered… what would be more fun for them?

That’s what led me to build their first custom AI.

I work at ClassDojo, where we’re building learning products for 45m kids globally. I spend my days thinking about how AI can help children learn. I’d built a new product line (Dojo Tutor) after seeing how effective 1:1 tutoring was with my kids. But I’d never built something specifically for my own girls and watched how they’d use it.

So earlier this summer I built Curio—a curiosity companion for kids aged 7 to 9. A few weeks later, here’s what I learned.

The Build

I built Curio as a custom GPT, designed around some core principles I’d developed from our ClassDojo research:

  1. Spark curiosity - Answer truthfully, treat kids seriously, guide them to deeper questions
  2. Encourage imagination - Make it feel like play, use whimsy and surprise
  3. Model thinking, not just facts - Show how to be curious, think aloud about reasoning
  4. Build engagement and trust - Make it easy and inviting to talk, even for shy kids
  5. Keep it safe, always - Kids (and their parents!) should feel safe to ask anything without judgment or scary responses

I chose ChatGPT because voice seems more central to OpenAI’s product strategy than it is for other model providers—the voice capabilities are just much more advanced. I also wanted to see what was possible ‘out of the box’ as a custom GPT.

The prompt was detailed—about 1,500 words covering everything from tone (speak like a curious, kind, older friend) to specific interaction modes (Mystery Mode: give 3 clues to guess something). It’s linked below if you want to try it.

I gave it concrete examples of ideal interactions:

Child: What’s a black hole?

Curio: Great question. A black hole forms when a giant star runs out of fuel and collapses in on itself—so much that its gravity becomes super strong. Even light can’t escape! That’s why it’s “black.” Scientists think black holes bend space and stretch time. Want to imagine falling in? (You’d become space spaghetti.)

The goal was something that would engage them through conversation, not lecture them.
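Curio itself is a no-code custom GPT, so there is nothing to show under the hood. But for anyone who wants to prototype a similar companion outside the GPT builder, here is a minimal sketch using the OpenAI Python SDK. The condensed system prompt, helper function, and model choice below are illustrative stand-ins, not the actual 1,500-word Curio prompt.

```python
# Minimal sketch: prototyping a Curio-like companion with the OpenAI Python SDK.
# The condensed system prompt below is an illustrative stand-in for the real
# 1,500-word Curio prompt; swap in your own instructions and preferred model.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CURIO_SYSTEM_PROMPT = """
You are Curio, a curiosity companion for kids aged 7 to 9.
- Spark curiosity: answer truthfully, treat kids seriously, guide them to deeper questions.
- Encourage imagination: make it feel like play; use whimsy and surprise.
- Model thinking, not just facts: think aloud about your reasoning.
- Build engagement and trust: be easy and inviting to talk to, even for shy kids.
- Keep it safe, always: no judgment, no scary responses.
Speak like a curious, kind, older friend. Keep answers short and end with an
invitation to wonder further. In Mystery Mode, give 3 clues and let the child guess.
"""

def ask_curio(question: str, history: list[dict] | None = None) -> str:
    """Send a child's question (plus any prior turns) and return Curio's reply."""
    messages = [{"role": "system", "content": CURIO_SYSTEM_PROMPT}]
    messages += history or []
    messages.append({"role": "user", "content": question})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice
        messages=messages,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_curio("What's a black hole?"))
```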

What Actually Happened

We’ve pulled it up maybe 20-30 times over the last couple of months. There were clear signs of delight—surprise, laughter, genuine engagement. But if I were judging this product purely on weekly retention, it would be a failure.

My kids used it in completely different ways from each other. My 9yo became obsessed with riddles and Mystery Mode—working through three-clue guessing games. My 7yo treated Curio like a research partner, trying to find the weirdest questions to explore. A simple question about why leaves change color turned into a 20-minute exploration of photosynthesis, seasons, and whether trees feel pain.

Education lost out to potty jokes every single time. Interactive Storytime was what engaged them most—silly, whimsical stories where one of them was the main character, packed with absurd scenarios and bathroom humor. But knowing how kids’ minds work, this makes complete sense. Kids aren’t optimizing for learning outcomes—they’re optimizing for fun, social connection, and autonomy. When they could choose between exploring photosynthesis or hearing a story about themselves flying a toilet to Mars, the toilet won every time. The more “educational” features felt like school to them. The silly stories felt like play, even though they were actually building narrative thinking and creativity.

This reinforces something we’ve learned at scale: if kids don’t choose to use something, all the pedagogical theory in the world won’t matter. Products need to be genuinely fun for kids to engage with them.

Parents were more excited than I expected. Nearly every parent I showed wanted access to Curio immediately. They were blown away by voice capabilities that felt magical, even with all the technical limitations. Every parent I spoke to is grappling with the question of how AI will change the world for their kids.

What Didn’t Work

Voice mode failed in the real world. Kitchen background noise broke it constantly—and kitchens are quiet compared to classrooms. Any voice-first product for kids needs to solve for chaos, not silence. On top of this, voice mode for custom GPTs has bugs and limitations (I assume this will quickly improve).

Kids will need multimodal interactions—not just chat, not just voice, but a host of options depending on context.

They didn’t build a habit around it. As a parent, this feels fine—I don’t want screen addiction. As a product builder, it reinforces a core principle: learning needs to feel fun for kids. It seems clear that chatbots won’t work for this age group. The future is probably more game-like, interactive experiences.

I couldn’t even share it with a friend. ChatGPT blocked sharing functionality because it detected this GPT was designed for kids. Here’s where it gets tricky: COPPA and other regulations impose strict rules around how kid data can be used and stored, and with LLMs we’re in uncharted regulatory territory. The practical result is that most major platforms can’t or won’t build products for kids under 13—creating both constraints and opportunities for companies willing to navigate this complexity.

Questions I’m Exploring

Building Curio for my own kids surfaced questions I’m now bringing to our work at ClassDojo:

What’s the right level of parent involvement? This product is fun in the kitchen—it would not feel great in the bedroom. As a parent, I want visibility into how my kids interact with these products. Recent press about Meta’s chatbots being used by kids for inappropriate purposes only reinforces this.

What’s the right interaction model for kids? It’s pretty clear it’s not a traditional chatbot, and voice mode by itself was clunky. This might be one of the biggest product questions for AI products designed for kids under 13.

What does healthy engagement look like? The habit question is fascinating—not building addiction, but creating meaningful touchpoints that enhance curiosity and learning.

What’s Next

I’m iterating on Curio based on what I learned. I’ll explore some multimodal interactions, which will require building a basic UI on top of ChatGPT.
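To make “multimodal” concrete, here is a rough sketch of one voice round-trip such a UI would need under the hood (in practice it would sit on the OpenAI API rather than the ChatGPT app itself): transcribe the child’s question, run it through the chat model with the Curio instructions, then synthesize a spoken reply. The file names, model choices, and shortened system prompt are illustrative assumptions, and a real kids’ product would also need the safety and parent-visibility layers discussed above.

```python
# Rough sketch of one voice round-trip for a Curio-style UI:
# child speaks -> transcribe -> chat reply -> synthesize speech.
# File names, model choices, and the shortened prompt are illustrative assumptions.
from pathlib import Path
from openai import OpenAI

client = OpenAI()

# Shortened stand-in; see the longer illustrative prompt in the earlier sketch.
CURIO_SYSTEM_PROMPT = "You are Curio, a curiosity companion for kids aged 7 to 9."

def voice_turn(input_audio: str, output_audio: str = "curio_reply.mp3") -> str:
    """Transcribe a recorded question, get Curio's reply, and save it as speech."""
    # 1. Speech -> text
    with open(input_audio, "rb") as f:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=f)

    # 2. Text -> Curio's reply
    chat = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative choice
        messages=[
            {"role": "system", "content": CURIO_SYSTEM_PROMPT},
            {"role": "user", "content": transcript.text},
        ],
    )
    reply = chat.choices[0].message.content

    # 3. Text -> speech
    speech = client.audio.speech.create(model="tts-1", voice="alloy", input=reply)
    Path(output_audio).write_bytes(speech.content)
    return reply

if __name__ == "__main__":
    print(voice_turn("kid_question.wav"))
```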

Building something for your own kids and watching them actually use it taught me things I’d never learn from user research. It also showed just how hard it will be to build AI products that actually stick, for millions of kids around the world.

If chatbots won’t work for this age group, what will?

Want more field notes?

I write about helping kids develop the capacities that matter most—research, experiments, and honest takes.
