
Case study: ExamFlow — AI-powered study platform built from scratch
How we built ExamFlow, a platform that turns notes into AI-generated exams, summaries, and flashcards. Architecture, AI pipeline, technical challenges, and results.
From an ambitious idea to a real product in production
When the ExamFlow team reached out, they had a clear vision: a platform where students upload their notes and AI generates personalized exams, summaries, and flashcards. Not a generic chatbot that answers questions from the internet — a tool that studies your material and tests you on your topics.
The challenge was massive: nothing on the market combined OCR, natural language processing, AI content generation, and a smooth user experience in a single product. We had to build everything from scratch.
What we built
ExamFlow is a complete AI-powered study platform covering everything from elementary school to professional certification exams. Here's what it does:
- Smart digitization: upload a PDF, Word doc, or photo of handwritten notes. Google Vision OCR converts it to text with high accuracy — even handwriting.
- Automatic topic detection: AI identifies chapters, indexes, and section headings, then organizes the content automatically; users can adjust it manually if needed.
- Study material generation: summaries, executive summaries, hierarchical outlines, flashcards with spaced repetition (SM-2 algorithm).
- AI-generated exams: multiple choice, true/false, short answer, and essay questions. All generated from the student's actual material, not generic sources.
- Adaptive learning: the system remembers which questions you got wrong and repeats them in future exams.
- Oral recitation: present a topic out loud and the AI evaluates your delivery — ideal for certification exams.
- Gamification: badges, achievements, and motivational messages adapted to the student's age.
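The spaced repetition mentioned above follows the classic SM-2 scheduling algorithm. Here is a minimal sketch of an SM-2 review update (illustrative names and shapes of our own, not ExamFlow's actual code):

```typescript
// One flashcard's scheduling state. `quality` below is the recall grade (0-5).
interface CardState {
  repetitions: number;   // consecutive successful reviews
  easeFactor: number;    // how "easy" the card is (SM-2 clamps this at 1.3)
  intervalDays: number;  // days until the next review
}

function sm2Review(card: CardState, quality: number): CardState {
  if (quality < 3) {
    // Failed recall: reset the streak and show the card again tomorrow.
    return { ...card, repetitions: 0, intervalDays: 1 };
  }
  const repetitions = card.repetitions + 1;
  // Fixed intervals for the first two reviews, then grow by the ease factor.
  const intervalDays =
    repetitions === 1 ? 1 :
    repetitions === 2 ? 6 :
    Math.round(card.intervalDays * card.easeFactor);
  // Standard SM-2 ease-factor update, clamped at 1.3.
  const easeFactor = Math.max(
    1.3,
    card.easeFactor + (0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
  );
  return { repetitions, easeFactor, intervalDays };
}
```

Cards answered correctly drift toward long intervals; a single failed recall pulls them back to the next day, which is what makes the "repeat what you got wrong" behavior fall out naturally.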
The technical complexity under the hood
This isn't a typical "pretty website" project. The architecture has several layers of complexity worth diving into.
Document processing pipeline
When a user uploads a document, here's what happens behind the scenes:
- PDF conversion — Word docs and photos are first normalized to PDF
- OCR with Google Vision — extracts text with intelligent error correction
- Automatic topic detection — AI analyzes text looking for index patterns or chapter headings
- Smart chunking — text is divided into semantically coherent fragments
- Vector embeddings — each chunk is converted to a vector with OpenAI Embeddings and stored in pgvector
All of this runs asynchronously with BullMQ (Redis-backed job queues), so users don't have to stare at a progress bar for minutes.
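The five stages above can be sketched as one orchestration function with injected stage implementations (all names here are hypothetical; in production each stage would run inside a BullMQ job with its own retries):

```typescript
// A semantically coherent fragment tied to a detected topic.
interface Chunk { topic: string; text: string }

// The pipeline stages, injected so each can be swapped or mocked.
interface PipelineStages {
  normalizeToPdf: (fileId: string) => Promise<Uint8Array>;    // Word/photo → PDF
  ocr: (pdf: Uint8Array) => Promise<string>;                  // Google Vision OCR
  detectTopics: (text: string) => Promise<string[]>;          // chapters/headings
  chunk: (text: string, topics: string[]) => Chunk[];         // smart chunking
  embedAndStore: (chunks: Chunk[]) => Promise<number>;        // → pgvector; count stored
}

// Runs the stages in order and reports how many vectors were stored.
async function ingestDocument(fileId: string, s: PipelineStages): Promise<number> {
  const pdf = await s.normalizeToPdf(fileId);
  const text = await s.ocr(pdf);
  const topics = await s.detectTopics(text);
  const chunks = s.chunk(text, topics);
  return s.embedAndStore(chunks);
}
```

Keeping the orchestration pure like this also makes the pipeline easy to cover with fakes in unit tests, which matters when the suite runs on every deploy.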
RAG for exam generation
Exams aren't generated "from nothing." They use Retrieval-Augmented Generation (RAG):
- The student selects which topics to be tested on
- The most relevant chunks are retrieved from the vector database (pgvector)
- Those chunks are sent as context to Claude (Anthropic)
- The AI generates questions based exclusively on the student's actual material
This keeps questions relevant and accurate, grounded in the student's own material rather than in generic sources or hallucinated content.
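The retrieval step boils down to ranking the student's chunks by similarity to the query embedding. A minimal in-memory sketch (hypothetical shapes; in production pgvector does this ranking in SQL and the top chunks go to Claude as context):

```typescript
// A stored chunk with its OpenAI embedding vector.
interface EmbeddedChunk { text: string; embedding: number[] }

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank all chunks against the query embedding and keep the k most relevant.
function retrieveTopK(query: number[], chunks: EmbeddedChunk[], k: number): EmbeddedChunk[] {
  return [...chunks]
    .sort((a, b) =>
      cosineSimilarity(query, b.embedding) - cosineSimilarity(query, a.embedding))
    .slice(0, k);
}
```

Only these top-k chunks are sent in the prompt, which is what constrains the model to the student's actual material instead of its general training data.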
Tech stack
The project is a pnpm monorepo that includes:
- Frontend: Next.js 15 with App Router, TypeScript, Tailwind CSS, Zustand, TanStack Query, next-intl (bilingual ES/EN)
- Backend: Node.js + Fastify + Prisma + BullMQ
- Database: PostgreSQL with pgvector (Supabase) + Redis (Upstash)
- AI: Claude (Anthropic) for content generation, OpenAI for embeddings, Google Vision for OCR
- Storage: Supabase Storage for original documents
- Auth: Supabase Auth with JWT sessions
- Deploy: Railway (web + API) + Cloudflare CDN
- Testing: Vitest with 749+ tests (unit + integration)
Adaptation by education level
One of the most interesting challenges was adapting the experience based on the student's age:
- Elementary: simple language, strong visual gamification, straightforward questions
- University/Professional exams: academic rigor, long-form essay questions, oral recitation with AI evaluation
The interface adapts colors, typography, motivational message tone, and question difficulty — all driven by the selected education level.
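One clean way to drive this is a single level-keyed profile that the UI and the generation prompts both read from. A sketch with invented values (ExamFlow's actual profiles and field names are not public):

```typescript
type EducationLevel = "elementary" | "highschool" | "university" | "professional";

// Presentation and generation knobs controlled by the education level.
interface LevelProfile {
  tone: "playful" | "neutral" | "academic";  // motivational message register
  maxQuestionWords: number;                  // caps question length for younger students
  gamification: boolean;                     // badges and achievements on/off
}

const LEVEL_PROFILES: Record<EducationLevel, LevelProfile> = {
  elementary:   { tone: "playful",  maxQuestionWords: 20,  gamification: true },
  highschool:   { tone: "neutral",  maxQuestionWords: 40,  gamification: true },
  university:   { tone: "academic", maxQuestionWords: 80,  gamification: false },
  professional: { tone: "academic", maxQuestionWords: 120, gamification: false },
};

function profileFor(level: EducationLevel): LevelProfile {
  return LEVEL_PROFILES[level];
}
```

Centralizing the knobs in one record means adding a new level or tweaking a tone touches exactly one place, instead of scattered conditionals across components and prompts.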
Results
ExamFlow is live in production at examflow.app with real users. Some key metrics:
- 749+ automated tests passing on every deploy
- Processing pipeline that handles documents of 200+ pages without issues
- Exam generation time: under 10 seconds
- SEO optimized: structured data, dynamic sitemap, SSG with ISR, hreflang ES/EN
- Core Web Vitals: green scores across all metrics
What we learned
Projects like ExamFlow prove that AI isn't just "slapping a chatbot on a website." It's designing robust processing pipelines, managing async job queues, handling external service failures gracefully, and above all — making complex technology feel simple for the user.
The student uploads a PDF and gets an exam. That simple. The fact that there's OCR, vectorization, RAG, and language models running underneath is our problem, not theirs.
Have an ambitious project that combines web + AI? Tell us about your idea — we build things others don't dare to attempt.