Helix: Exploring conversational AI design
Self-directed concept project | 2020. 5- and 6-year-olds' emotions change on a whim. Meet Helix, the math bot designed to teach, challenge, and guide preschoolers through those emotional swings.
"I am a 5-year-old preschooler and a visual learner. I need a learning tool for math that is easy to understand and makes learning fun."
— Dylan, a kindergarten student
Plot twist
This was 2020 (before AI got cool)
Fun fact: I designed Helix two years before ChatGPT made everyone suddenly care about AI. Back then, chatbots were basically fancy FAQs with personality disorders.
What made Helix different? He wasn't just answering questions; he was feeling things. Shake your device in frustration? Helix noticed. Take too long on a problem? He'd pop up with a silly emoji to lighten the mood. Long pause? Time for encouragement.
This was "agentic AI" before that became a buzzword. (And yes, I'm claiming I did it first. Fight me.)
The stack: Dialogflow (RIP API.AI), Sketch, lots of sticky notes mapping 5-year-old emotions.
Why this still matters in 2026: The tech changed (hello LLMs!), but the thinking didn't. Whether I'm designing for Dialogflow or GPT-5, the question is the same: How do we build AI that gets humans, not just commands?
Spoiler: That question took me from Helix → Citibot → AWS Smart Assistant → Amazon Q. But we're getting ahead of ourselves. Let's rewind to 2020.
Initial discovery phase
The framework
How might we make learning new math concepts fun for a preschooler? The math bot, a.k.a. the math app's personal assistant, seemed like an obvious answer. Yet Helix did not come to life until later in the project.
Looking into the competitive landscape for preschool math games, here is a sampling of the in-game cues I found: 1) audio instruction explaining directions and how the game works, 2) a cartoon hand illustrating gameplay, 3) pulsating and swishing to indicate an actionable item, 4) objects moving automatically to indicate motion.
What is our game-development opportunity for a real differentiator? Enter the chatbot, our app's role model and personal assistant. He is a parent's companion for teaching their preschooler math concepts. Kids sometimes tune out those closest to them, so a new teacher is much welcomed.
Building A Chatbot
Developing Motor Skills
80% of children between ages 2 and 4 are mobile device users and proficient in all 7 gestures: pinch, tap, flick, scroll, slide, drag & drop, and spread.
Content is Not King
Keep text instructions to a minimum to avoid cognitive overload. Language should be simple and geared to users' reading mastery at this age level.
Tuning Into Emotions
Learning to regulate emotions is more difficult for some children than others. They are more emotionally reactive and sometimes harder to calm down. Emotions can quickly swing from one end of the spectrum to the other.
Workflow Sequences
The Gameplan
Interesting fact: In ancient mathematics, space was a geometric abstraction of the three-dimensional space observed in everyday life.
Dream it.
How can we build an experience for a child with varying developmental milestones? I map gameplay sequences before developing any low-fidelity mockups, to really understand the flow of the game. At this stage of the UX process, a chatbot solution is not even on the radar. Functional areas like pulsating game cues and shaking motions to clear (or start) a problem are explored. Preliminary brainstorming also had VR in the equation, for a 360-degree virtual classroom.
Build it.
The math app is named SPACE. The 360 canvas would be the space where a preschooler slides numbered circular shapes on top of each other to form a planet and solve the math equation. The more problems they complete, the bigger their universe becomes.
Aha Moment
What's the one thing that is predictable with preschoolers? It's how their emotions can swing from one extreme to another instantaneously. Here's where Helix originates and the emotional journey map begins to evolve. The horizontal axis represents steps through time; the vertical axis shows themes for analysis. Virtual post-it notes start filling in everything from pain points to happy moments. It's all about understanding our user to create a meaningful experience. This will be the ultimate learning tool, built on player empathy.
One, Two, Map My Chat
NATURAL LANGUAGE CONVERSATION FLOW
Diving into the emotional journey map led to the next phase: conversation and gameplay workflows. It is important here to understand what types of system or user actions elicit certain conversation workflows. At what point in the game will our preschooler get stuck? How can we redirect with the math bot? Where do we need to insert in-game guidance and help? Inline prompts and auto-complete can keep the experience seamless between preschooler and math bot. It is important to take into consideration the different reading and comprehension levels at this age.
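As a rough illustration of the inline-prompt idea, here is a minimal sketch; the phrase list and the prefix-matching rule are my assumptions for illustration, not the shipped design:

```python
# Hypothetical inline-prompt helper: surface canned phrases a young reader
# can tap instead of typing. The prompt list is illustrative only.
PROMPTS = ["help me", "show me how", "try again", "new problem"]

def suggest(typed: str, limit: int = 3) -> list[str]:
    """Return up to `limit` canned prompts that start with what the child typed."""
    typed = typed.strip().lower()
    return [p for p in PROMPTS if p.startswith(typed)][:limit]

print(suggest("he"))  # prompts beginning with "he"
print(suggest(""))    # nothing typed yet: show the first few suggestions
```

Keeping the suggestions short and tappable sidesteps the reading-level problem: the child recognizes a phrase rather than composing one.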
Helix, Got Game
SPEAKING IN EMOJI
The one thing we need to give our math bot is a personality. He can't be overly intrusive or totally robotic. It makes sense for him to be a robot character in our SPACE math app. His personality is fun and on the level of a 5- to 6-year-old. Emojis make it easy to communicate and hold a preschooler's short attention span. Here is where we start developing the preliminary language before building conversations in Dialogflow. Now is the time to start thinking about copy, style, voice, and language.
Helix's personality type is agreeableness on the Big Five personality traits. This personality works best with a preschooler whose emotions swing from happy to frustrated to temper tantrum on a whim. Helix can talk on a preschooler's level and redirect. He has a soft heart and is playful and entertaining enough to keep a 4- to 5-year-old engaged.
Helix, to Ground Control
CONVERSATIONAL USER INTERFACE (CUI)
The intent of a chatbot is not just to strike up a friendly conversation. Human conversations have an informal flow and may not always have a purpose (i.e., small talk). Chatbots are functional and task oriented: their job is to be the user's personal assistant or guide through tasks and events, and to make everything easy. Wait, this doesn't mean our chatbot can't have a personality. This is where the magic starts, and considerations for a preschooler's emotions come into play as we build conversation samples.
There is a syntax to UI conversation scripts. I started by building the simple responses for each user utterance and the bot's replies. Here is where the variances in conversation and different scenarios are created, and AI training starts.
The piloting phase of building the natural language processing (NLP) is shown in the examples below.
Wait, how does this work in 2026?
The glow up: In 2020, I mapped every conversation path in Dialogflow. In 2026, I write a system prompt and let the LLM improvise. Same UX thinking, different machinery.
Then 2020:
Rule-based chatbots
Platform: Dialogflow (formerly API.AI)
How it worked: If user says X → Bot responds with Y. Every. Single. Path. Mapped.
Training: Manually write every conversation variant
Smart factor: Pattern matching + supervised learning
Limitation: Unexpected input = existential crisis
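A minimal sketch of that rule-based pattern, with made-up intent names and responses (not the production agent):

```python
# 2020-style rule bot: every path mapped by hand, and anything unmapped
# falls through to a canned fallback. Rules here are illustrative.
RULES = {
    "frustrated_user": {
        "triggers": {"device_shake", "long_pause"},
        "response": "Hey buddy! Want to take a break?",
    },
    "confused_user": {
        "triggers": {"help_tap", "repeat_wrong_answer"},
        "response": "Let's watch a quick show-and-tell together!",
    },
}

def respond(event: str) -> str:
    for intent in RULES.values():
        if event in intent["triggers"]:
            return intent["response"]
    # The classic rule-bot limitation: unexpected input = existential crisis.
    return "Hmm, I didn't get that. Can you try again?"

print(respond("device_shake"))  # mapped trigger
print(respond("cosmic_ray"))    # unmapped input hits the fallback
```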
Now 2026:
LLM-powered agents
Platform: GPT-5, Claude Sonnet 4.5, Gemini 3 Pro
How it works: System prompt + context → AI reasons about response
Training: Foundation models already understand language
Smart factor: Contextual reasoning + generative responses
Limitation: Unexpected input? AI improvises (sometimes too creatively; hello, hallucinations)
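A minimal sketch of the 2026-style request assembly, stopping short of any real API call. The prompt wording, signal names, and message shape are my assumptions, modeled on the common system/user chat format:

```python
# Instead of enumerating every path, assemble a system prompt plus
# device-signal context and hand it to an LLM to reason about.
def build_messages(signals: dict) -> list[dict]:
    """Build a chat-style request from device signals (illustrative shape)."""
    system = (
        "You are Helix, a playful math robot for 5-year-olds. "
        "You detect user frustration through device signals. "
        "Keep replies short, kind, and encouraging."
    )
    context = ", ".join(f"{k}={v}" for k, v in signals.items())
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": f"[device signals: {context}]"},
    ]

msgs = build_messages({"shake": True, "pause_sec": 45})
print(msgs[0]["role"], "->", msgs[1]["content"])
```

The UX work moves from enumerating paths into writing the system prompt and deciding which signals count as context.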
But Here's the Secret...
The UX thinking is literally the same.
2020 (Dialogflow)
Intent: frustrated_user
Triggers: device shake, long pause
Response: "Hey buddy! Want to take a break?"
Helix 2026 (if I rebuilt with LLMs)
System: You detect user frustration through device signals
Context: User shook device, 45-sec pause
Response: [AI generates empathetic response]
See? Same thinking, different machinery. This is why my 2020 work on Helix looked exactly like my 2024 work on Gen AI experiences for AWS products: the paradigm shifted, but the human-centered design stayed put.
AI tech changes every 6 months. Human emotions? Still the same since forever. Design for humans, not for hype cycles.
Let's Go
Here's where I start creating game actions and filling in conversations for each trigger in API.AI. Onboarding is where we educate users and introduce how messaging with Helix works in gameplay. The anticipated emotional state of the user at game start is indifferent. The more our user plays the game and interacts with the bot, the better Helix can learn their preferences and tweak the experience over time.
Helix can be voice-activated or summoned with in-app messaging. System triggers can also start the conversation with the user.
I wanted to focus on the microcopy so Helix possesses his own unique personality. When preschoolers are upset they throw tantrums and can hurl stuff, too. If the device is shaken or some abrupt movement happens, this could be a system trigger for Helix to message something funny, send a silly emoji, or simply ask, "Are you ok?"
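A sketch of what that shake trigger might look like, assuming a simple accelerometer-magnitude threshold; the threshold, readings, and check-in copy are all illustrative:

```python
import math

# Hypothetical shake detector: treat a spike in acceleration magnitude as
# the "frustration" system trigger. Threshold chosen for illustration only.
SHAKE_THRESHOLD = 2.5  # magnitude (in g) treated as an abrupt movement

def is_shake(ax: float, ay: float, az: float) -> bool:
    return math.sqrt(ax**2 + ay**2 + az**2) > SHAKE_THRESHOLD

def on_motion(ax: float, ay: float, az: float):
    if is_shake(ax, ay, az):
        return "Whoa! Are you ok?"  # Helix checks in with something silly
    return None  # normal handling: no interruption

print(on_motion(3.0, 0.5, 0.2))  # abrupt shake triggers a check-in
print(on_motion(0.0, 0.1, 1.0))  # gentle motion stays quiet
```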
Good Intent-ions
The gameplay conversation scenarios are built into the API, and the different conversation variables are developed.
EMOTION: NEUTRAL
Personalized greeting.
During onboarding, the parent answers a quiz to inform the algorithm about math aptitude, plus gives insight into temperament and personality.
EMOTION: CONFUSED
A text or voice cue for help (or a similar NLP match), or emoji symbols, starts the conversation.
System actions or events, plus user actions, trigger the tutorial video screen to load for show-and-tell.
The math bot shows tutorials with microinteractions for ways to merge shapes and solve the part/part/whole concept.
EMOTION: FRUSTRATION
The math bot pops up to give positive reinforcement and redirect.
Shaking the mobile device or a long pause signals the user may be frustrated or stuck in gameplay.
EMOTION: HAPPY
The scoring and leaderboard are private; only the parent and player can see them.
The player can type a message or enable dictation to start a conversation.
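The emotion scenarios above can be summarized as a small dispatch table; the state names and action names are my paraphrase of the case study, not actual identifiers from the build:

```python
# Emotion -> bot-action dispatch implied by the gameplay scenarios.
ACTIONS = {
    "neutral": "personalized_greeting",
    "confused": "load_tutorial_video",
    "frustrated": "positive_reinforcement",
    "happy": "celebrate_and_update_private_leaderboard",
}

def helix_action(emotion: str) -> str:
    # Gentle fallback for any emotion the design didn't anticipate.
    return ACTIONS.get(emotion, "ask_how_are_you")

print(helix_action("confused"))  # mapped state
print(helix_action("sleepy"))    # unmapped state checks in instead
```

Framing the scenarios as a table makes the emotion-first design explicit: features hang off emotional states, not the other way around.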
What Helix taught me
Five years later, these three insights from Helix still shape everything I design:
Personality = Psychology
Helix's "agreeableness" wasn't decoration; it was deliberate emotional design. That same thinking shaped CitiBot's trust frameworks and Amazon Q's developer interactions. AI personality should match emotional context, not just task requirements.
Emotion-first beats feature-first
I mapped emotions before features. Understanding how users feel tells you what they need. Whether it's a frustrated preschooler or a stressed DevOps engineer at 2am, emotion drives behavior.
Vulnerable users teach you everything
Designing for preschoolers forced me to strip interactions to their essence. No jargon. No assumptions. That constraint makes me better at designing enterprise AI. If a preschooler couldn't understand it, I'm overcomplicating it.