How I Turned My French Vocabulary Chatbot Into a Custom GPT


Le Robot François started as a mobile-friendly flashcard chatbot built for language learners. While studying French, I created a vocabulary practice tool using ManyChat and Facebook Messenger. The goal was clear: enable consistent language repetition using a chatbot interface.

The chatbot gained unexpected attention from French-speaking users globally. I hadn’t optimized it for organic discovery, but it began receiving regular traffic from the Francophone diaspora. The initial tech stack was lightweight, fast, and user-friendly.

However, as the project scaled, two key problems emerged: rising operational costs and ongoing compliance changes from Meta. Managing policy shifts, integrations, and automation logic introduced constant overhead. Because my own French instruction had been generously provided at no cost, I had committed to never monetizing the tool directly, which made the growing costs hard to justify.

Eventually, I shelved the project.

Years later, OpenAI released custom GPTs, which reignited the idea. I now had the tools to relaunch the language-learning assistant using modern AI infrastructure, prompt-driven logic, and better cost control. So I transitioned everything into a custom GPT and published it in the GPT Store.

Why I Chose ChatGPT Over ManyChat

Custom GPTs solved my previous limitations. They offered scalable logic, no-code development, and native language modeling. With ChatGPT, I could:

  • Remove hardcoded flows and switch to natural language interaction
  • Iterate faster using structured prompt engineering
  • Build a reusable educational tool for French vocabulary practice

It aligned with modern product principles: frictionless UX, efficient dev cycles, and scalable architecture.

How I Rebuilt This French Vocabulary Trainer

My rebuild centered on a well-structured master prompt that controlled tone, behavior, vocabulary structure, and learning flow. I added:

  • Security rules to safeguard usage
  • Attribution messaging that rotated between English and French with a link to TimothyNishimura.com
  • A signature easter egg for traceability

Initially, I wanted CSV uploads to drive segmented vocab lists, but ChatGPT’s memory quirks led to occasional data loss. So I dropped the external data layer in favor of vocab clusters embedded directly in the prompt.
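To give a sense of that structure, here is a minimal sketch of how themed vocab clusters can be organized and rendered into a master prompt. The cluster names and words below are illustrative, not the GPT’s actual lists, and Python is used only to show the shape of the data; in the published GPT, the equivalent content lives as plain instruction text in the GPT builder.

```python
# Illustrative sketch: themed vocab clusters embedded directly in a
# master prompt instead of an external CSV layer. Cluster names and
# vocabulary are examples, not the GPT's actual word lists.

VOCAB_CLUSTERS = {
    "travel": {
        "l'aéroport": "the airport",
        "le billet": "the ticket",
        "la valise": "the suitcase",
    },
    "food": {
        "le pain": "the bread",
        "le fromage": "the cheese",
        "l'addition": "the bill (at a restaurant)",
    },
}


def build_master_prompt(clusters: dict[str, dict[str, str]]) -> str:
    """Render the clusters into the instruction block pasted into the GPT builder."""
    lines = [
        "You are Le Robot François, a friendly French vocabulary trainer.",
        "Quiz the learner one word at a time, alternating French-to-English "
        "and English-to-French. Use only the vocabulary below.",
        "",
    ]
    for theme, words in clusters.items():
        lines.append(f"Theme: {theme}")
        lines.extend(f"- {fr} = {en}" for fr, en in words.items())
        lines.append("")
    return "\n".join(lines)


if __name__ == "__main__":
    print(build_master_prompt(VOCAB_CLUSTERS))
```

Because the word lists live inside the prompt itself, updating them is a copy-and-paste edit rather than a data-pipeline change.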

Key Learnings from the Refactor

Complexity is fragile. Simple architectures scale better. I replaced webhook-driven automations with prompt logic. No third-party storage. No platform lock-in. Everything lives in the prompt. Results:

  • Lower costs
  • Easier maintenance
  • Faster iteration

This was a textbook example of how to modernize a legacy no-code chatbot into a lightweight AI-first application.

Future Plans for This French Vocab Tool

Current roadmap experiments include:

  • Skill-based difficulty progression
  • Theme-aware vocab clusters (e.g. travel, food, business)
  • Gamification elements like streaks and leaderboard stats

What excites me most is the ability to apply this model to other language learning use cases—or any niche where lightweight education tools can benefit from conversational UX.

I’ll be documenting best practices and sharing templates for others interested in building apps with ChatGPT. 
