Developing applications for children, particularly ones powered by artificial intelligence (AI), poses unique challenges. This is especially true when it comes to defining “age-appropriate” content and building systems that reliably enforce this. To explore these challenges in-depth, I will share the experience of developing Gramms, an AI-powered bedtime story app.
Gramms is an innovative app that generates personalized bedtime stories told in the cloned voice of a grandparent. The concept appears straightforward, but behind the scenes, designing the safety architecture was a complex undertaking. This article shares that hard-won insight with other developers creating AI-powered products for children, particularly children between the ages of 3 and 10.
Understanding Age Groups
One of the most important design decisions we made when developing Gramms was to abandon the idea of a single “child-safe” content layer. Children of different ages have different needs and interests, and a story that captivates a 9-year-old might terrify a 3-year-old. Conversely, a 9-year-old will quickly lose interest in content designed for a 4-year-old.
To address this, Gramms uses three developmental age groups, each with its own prompt-engineering strategy. For example, stories for ages 3-5 use simple vocabulary and narratives, while stories for ages 9-10 introduce chapter-like structures, multi-stage plots, and ambiguous situations with nuanced resolutions. These age bands directly shape the system prompt sent to the language model, influencing every element of generation from word choice to narrative stakes.
Compliance with COPPA
The Children’s Online Privacy Protection Act (COPPA) requires verifiable parental consent before collecting personal information from children under 13. In the case of AI apps, this includes data like the child’s name, age, and interests. Apple underscores this requirement in its App Store Review Guidelines, requiring explicit disclosure of which third-party AI services process user data.
Gramms addresses this requirement at the time of parental consent. Before a profile is saved, the parent sees a screen that names OpenAI and Cartesia as our AI providers, explains what data each receives, and requires active confirmation. This is not a simple Terms of Service checkbox, but a crucial gatekeeping mechanism.
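One way to enforce that gate in code is to make profile persistence impossible without recorded provider acknowledgments. The provider names below come from the article; the data shapes and function names are hypothetical, sketched in Python for clarity.

```python
# Consent-gated profile creation: the profile can only be saved once the
# parent has actively confirmed every AI provider that will receive the
# child's data -- no pre-ticked boxes.

from dataclasses import dataclass, field

REQUIRED_PROVIDERS = {"OpenAI", "Cartesia"}

@dataclass
class ChildProfile:
    name: str
    age: int
    interests: list[str] = field(default_factory=list)

class MissingConsentError(Exception):
    pass

def save_profile(profile: ChildProfile, acknowledged_providers: set[str]) -> ChildProfile:
    """Persist a profile only after full parental consent is on record."""
    missing = REQUIRED_PROVIDERS - acknowledged_providers
    if missing:
        raise MissingConsentError(f"Parent has not acknowledged: {sorted(missing)}")
    return profile  # in the real app this would write to storage
```

Structuring it this way means a future UI change cannot accidentally bypass consent: the storage layer itself refuses profiles that lack it.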
Automated Content Moderation
Despite careful engineering, language models are probabilistic systems, and even a carefully constructed prompt can still produce inappropriate output. To address this, Gramms implements a two-pass system: the first pass constrains generation at the prompt level, explicitly prohibiting certain content, while the second pass runs every generated story through an automated moderation check. If a story fails moderation, the user simply sees a message that a new story is being created, so no inappropriate content ever reaches the child or parent.
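The retry loop at the heart of that second pass can be sketched like this. It is a minimal sketch in Python: the generator and moderation check are injected as callables, where the real app would call the model provider and a moderation endpoint; the names and retry limit are assumptions.

```python
# Two-pass safety flow: the prompt already encodes content rules (pass one);
# each finished story must then clear a moderation check (pass two), with
# automatic regeneration on failure.

from typing import Callable

class ModerationExhaustedError(Exception):
    pass

def generate_safe_story(
    generate: Callable[[], str],
    passes_moderation: Callable[[str], bool],
    max_attempts: int = 3,
) -> tuple[str, int]:
    """Return (story, attempts_used). A failed check stays invisible to the
    child: the app just shows that a new story is being created and retries."""
    for attempt in range(1, max_attempts + 1):
        story = generate()
        if passes_moderation(story):
            return story, attempt
    raise ModerationExhaustedError("no acceptable story after retries")
```

Capping the attempts matters: without a limit, a systematically failing prompt would loop forever, and hitting the cap is a useful signal that the prompt itself needs fixing.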
Apple App Store Review Process
Apple’s review process for apps in the Kids category is stricter than for general apps, and AI features receive additional scrutiny. It’s important to name each AI provider explicitly, confirm that your contracts with AI providers prohibit the use of child-related data for model training, and separate child profiles from adult accounts at the data layer. It’s also advisable to test the app with real children before submission, as their reactions provide more accurate feedback on age appropriateness than internal review alone.
The Bigger Picture
Developing AI products for children isn’t necessarily more difficult than for adults, but it’s challenging in unique ways. The design empathy required is significant, and each decision you make will shape the child’s experience. However, by carefully considering security architecture, naming vendors, building robust moderation systems, and designing thoughtful parental consent processes, you can build trust with families and create a product that is both safe and engaging.
As the founder of Gramms AI, I hope that these insights can inform and assist other developers in their journey to create AI-powered products for children. It’s a challenging but rewarding field, and with careful planning and consideration, it’s possible to create products that both children and parents will love.
Robin Singhvi, Gramms AI
Robin Singhvi is the sole founder of Gramms AI, an iOS app that generates personalized bedtime stories told in the cloned voice of a grandparent. Gramms is available in the Apple App Store. You can reach Robin at robin@gramms.ai.

