Builds and Breakthroughs #3
From Prompt to Done: How AI apps are collapsing entire workflows into one click
This week’s Builds & Breakthroughs is a snapshot of where “AI products” are really headed: from single-shot generators to full-blown workflow engines.
Three very different apps, Chronicle (slides), Opennote (tutoring), and Saffie (meal-planning), share the same playbook:
One prompt, whole job. Each product collapses a multi-step chore into a single action:
Chronicle → “Paste → Remix → Deck”
Opennote → “Ask → Whiteboard + Graph + Quiz”
Saffie → “TikTok link → Recipe → Instacart cart”
Loop-closing integrations. The agent doesn’t stop at insight; it finishes the job.
Chronicle’s multiplayer decks kill .pptx chains.
Saffie pushes the grocery list straight into Instacart.
Opennote issues real-time study nudges based on stored progress.
Multimodal in one shot. GPT-4o, Claude 4, and Gemini 2.5 all take text, images, and even video frames in a single context window. Skip chunking: pass the raw page or TikTok clip straight into the model (a minimal sketch follows this list).
Personalization that compounds. Memory profiles, dietary prefs, slide-style remixes—the more you use the product, the sharper it gets. Store a few KB of user data, query it on every call, and watch retention tick up.
Model swappability for cost control. Each team keeps a pluggable model layer so they can ride the price/performance curve (Claude today, GPT-4 tomorrow, LLaMA the day after). Wrap LLM calls behind a router so you can chase cost/quality curves without a refactor.
Speed & guardrails still hurt. Deck images misfire, tool chains drop context, and recipes take ~10 seconds to generate. Budget sprints for latency tuning and edge-case evals; nobody's solved this yet.
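Here's a minimal sketch of the one-shot multimodal pattern, using the Vercel AI SDK (already in Chronicle's stack); the model choice, prompt, and function name are illustrative, not any of these teams' production code:

```ts
// Sketch: one multimodal call instead of an OCR/chunking pipeline (Vercel AI SDK).
// Model and prompt are illustrative only.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function summarizeFrame(frameUrl: string) {
  const { text } = await generateText({
    model: openai("gpt-4o"),
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: "List the key steps shown in this frame as short bullets." },
          { type: "image", image: new URL(frameUrl) }, // the raw frame, no preprocessing
        ],
      },
    ],
  });
  return text;
}
```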
Dive in for the builds, the stacks, and the pain points.
Build: Chronicle – Cursor for decks
Builders: Mayuresh Patole, Tejas Gawande
Chronicle is the fastest way to go from idea to presentation. Paste a URL, add a few bullets, or drop in rough notes — and Chronicle turns it into a beautifully designed, fully editable deck. Think of it as Cursor for slides: where AI handles the heavy lifting, and you focus on the story.
Breakthroughs
Technical or conceptual advances
Remix: Chronicle’s core AI rewrite engine turns raw thoughts into slide-ready content in a single pass.
“The fundamental AI move in Chronicle ('Remix') is enabled by the ability to use LLMs to rewrite content.”
Web-to-deck: Paste any URL. Chronicle distills the entire page into a structured, scannable presentation.
Figma-style collaboration: Built on Liveblocks, Chronicle supports real-time multiplayer with edits and comments synced automatically.
Layout-as-output: Chronicle doesn’t just write text — it generates structured, spatial layouts using design heuristics baked into the model prompts (sketched after this list). “AI writes for the canvas, not just for the page.”
Interactivity-aware content: Chronicle auto-generates not just slides, but interactive structure — embedding links, attention guides, and dynamic widgets as part of the default output.
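Chronicle has not published its Remix prompts, but the layout-as-output idea can be sketched as schema-constrained generation. A minimal, hypothetical example with the Vercel AI SDK and zod; the slide schema, model, and prompt are assumptions, not Chronicle's internals:

```ts
// Sketch: "layout-as-output" via schema-constrained generation (Vercel AI SDK + zod).
// The slide schema and prompt are hypothetical, not Chronicle's real Remix prompts.
import { generateObject } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

const SlideSchema = z.object({
  title: z.string(),
  layout: z.enum(["title", "two-column", "image-left", "quote"]), // spatial hint, not just prose
  blocks: z.array(
    z.object({
      kind: z.enum(["heading", "bullets", "callout", "image-placeholder"]),
      content: z.string(),
    })
  ),
});

export async function remix(rawNotes: string) {
  const { object } = await generateObject({
    model: anthropic("claude-3-7-sonnet-latest"),
    schema: z.object({ slides: z.array(SlideSchema) }),
    prompt:
      "Rewrite these rough notes as a concise deck. Write for the canvas: " +
      "pick a layout per slide and keep bullets under 10 words.\n\n" + rawNotes,
  });
  return object.slides; // structured layout, ready to render on the canvas
}
```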
Unlocks
What those breakthroughs make possible
One-click Remix: Turn any rough draft into a polished visual narrative — instantly.
URL-to-deck: Summarize long web content into a shareable, presentation-friendly format.
Multiplayer by default: No versioning headaches. No .pptx email chains.
Model-swap safety net: Chronicle’s model-agnostic stack means quality and cost are tunable over time.
Stack
What’s under the hood
Models: Claude 3.7 today — easily swappable with GPT-4, Gemini, or future systems.
Frontend: Next.js + shadcn/ui.
Collaboration layer: Liveblocks for realtime editing and comments.
AI routing: Vercel AI SDK for flexible, future-proof inference.
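Because every provider behind the Vercel AI SDK exposes the same model interface, the model swap described above is roughly a config change. A hedged sketch; the routing criterion (an env flag) is illustrative only, not Chronicle's logic:

```ts
// Sketch: provider-agnostic model selection with the Vercel AI SDK.
// The env-flag routing rule is illustrative, not Chronicle's routing logic.
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { openai } from "@ai-sdk/openai";

// Swap providers without touching any call site.
function pickModel() {
  return process.env.DECK_MODEL === "openai"
    ? openai("gpt-4o")
    : anthropic("claude-3-7-sonnet-latest");
}

export async function rewriteSlideCopy(draft: string) {
  const { text } = await generateText({
    model: pickModel(),
    prompt: `Rewrite as tight slide copy:\n${draft}`,
  });
  return text;
}
```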
Still hard
What the team is working on
Fast, tasteful image generation: Chronicle is layering in OpenAI’s image-generation APIs to create relevant, on-brand visuals on the fly (a sketch follows below).
Taste guardrails: Fonts, colors, and spacing are 90% there — but ensuring nothing ever looks bad remains the real design challenge.
“It’s easy to make slides fast. It’s harder to make them beautiful. But it’s extremely hard to ensure they’re never off-brand, awkward, or gimmicky.”
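For the image side, here is a hedged sketch of what calling OpenAI's image API for slide visuals could look like from Node; the model, size, and style prompt are assumptions, not Chronicle's settings:

```ts
// Sketch: generating an on-brand slide visual with OpenAI's image API (Node SDK).
// Model, size, and style prompt are assumptions, not Chronicle's production values.
import OpenAI from "openai";

const client = new OpenAI();

export async function slideVisual(concept: string) {
  const result = await client.images.generate({
    model: "gpt-image-1",
    prompt: `Minimal, flat illustration for a slide about: ${concept}. Muted palette, no text.`,
    size: "1024x1024",
  });
  // gpt-image-1 returns base64 image data rather than a hosted URL.
  return result.data?.[0]?.b64_json;
}
```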
Why it matters
The big picture shifts
Slides move from hours of drag-and-drop to minutes of prompt-and-polish.
“Taste” becomes a design API, not a fixed template.
Model-agnostic design keeps quality flexible and cost-controllable.
Collaboration is built-in, not bolted on — multiplayer comes standard.
Try it
Your story, your slides — no design skills required.
Build your first Chronicle deck
Build: Opennote – An AI-tutor that learns you
Builders: Abhi Arya, Rishi Srihari, Vedant Vyas
An interactive AI workspace that teaches and learns you. Feynman, the agentic tutor, draws on whiteboards, plots graphs, generates flashcards, and fires practice drills, all tuned by a growing memory of each student’s work.
“We’re building an interactive AI tutor that truly knows you and the tools that adapt to students over time to create a proactive, customized learning environment.”
Breakthroughs
Technical or conceptual advances
Backend → tool router: Core lesson logic moved server-side; Feynman now chains whiteboard, graph, video and quiz tools with far fewer failures.
Dedicated memory service: Per-user vector store lets Feynman pull past answers and push real-time, personalized nudges.
Cheap context → in-prompt history: Long-context models like LLaMA 4 and Claude 4 let Feynman keep long stretches of student history right in the prompt.
A tutor turn runs 300–1,200 tokens; when learners upload files, they trim context to only the most relevant passages to cut token costs.
A full session is typically ≈10k tokens. If it ever pushes beyond 128k, they prune older context based on the memory store so the chat can keep running—no external retrieval layer needed in most cases.
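The pruning idea is simple enough to sketch: keep the prompt under a token budget and drop the oldest, least-relevant turns first. Opennote's actual service is Python/FastAPI; this TypeScript sketch (names and thresholds hypothetical) just shows the shape of it:

```ts
// Sketch of context pruning: stay under a token budget by dropping the oldest,
// lowest-relevance turns first. Types, names, and budgets are hypothetical.
type Turn = { role: "user" | "assistant"; text: string; relevance: number; tokens: number };

const BUDGET = 128_000; // context-window ceiling
const RESERVED = 8_000; // headroom for the next answer

export function pruneHistory(turns: Turn[]): Turn[] {
  let total = turns.reduce((sum, t) => sum + t.tokens, 0);
  if (total + RESERVED <= BUDGET) return turns;

  // Candidates sorted so low-relevance, older turns get dropped first;
  // memory-flagged (high-relevance) turns survive longest.
  const droppable = [...turns.keys()].sort(
    (a, b) => turns[a].relevance - turns[b].relevance || a - b
  );
  const dropped = new Set<number>();
  for (const i of droppable) {
    if (total + RESERVED <= BUDGET) break;
    dropped.add(i);
    total -= turns[i].tokens;
  }
  return turns.filter((_, i) => !dropped.has(i));
}
```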
Stack
What’s under the hood
Models: LLaMA 4 (reasoning), LLaMA 3.1 (fine-tuning), Claude 4 (the agent behind DeepTutor, their interactive deep-research assistant)
Frontend: Next.js + Vercel
Backend: Python/FastAPI on AWS
Memory: Custom store (vector + metadata) injected per session
Still hard
What could be better
Tool-call reliability – Agents sometimes fumble when chaining whiteboard, graph, or quiz tools, especially under unusual query styles.
Model guardrails – Handling the wide variety of student phrasing still needs stronger safeguards and context-chaining logic to keep answers on track.
Unlocks
What those breakthroughs make possible
Memory-first tutoring – Students start with just a few KB of memory after onboarding; as they complete lessons, that profile grows to hundreds of MB, letting Feynman tailor prompts and nudges to each learner.
Agent-driven multimodal flows – The backend router chains whiteboard, graph, video, and quiz tools in one plan, so Feynman can draw, plot, question, and explain without manual glue.
Fast feature turns – Because tool routing lives server-side, the team can drop a new teaching tool into the router with almost no front-end changes.
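A minimal sketch of that server-side router pattern: the model emits a plan of tool calls, the backend owns execution and ordering, and adding a tool is one new handler entry. Tool names, arguments, and stub handlers below are hypothetical, not Opennote's internals:

```ts
// Sketch of a server-side tool router. The model plans tool calls; the backend
// executes them in order, so chained steps keep shared state and new tools are
// one handler away. All names and stubs are hypothetical.
type ToolCall = { name: "whiteboard" | "graph" | "quiz"; args: Record<string, unknown> };

const handlers: Record<ToolCall["name"], (args: Record<string, unknown>) => Promise<string>> = {
  whiteboard: async (args) => `whiteboard:${JSON.stringify(args)}`, // would render strokes
  graph: async (args) => `graph:${JSON.stringify(args)}`,           // would plot a function
  quiz: async (args) => `quiz:${JSON.stringify(args)}`,             // would emit questions
};

// Run the model's plan sequentially, collecting each tool result so a later step
// (e.g. a quiz about the diagram) can build on an earlier one.
export async function runPlan(plan: ToolCall[]): Promise<string[]> {
  const results: string[] = [];
  for (const call of plan) {
    results.push(await handlers[call.name](call.args));
  }
  return results;
}
```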
Try it
Start learning here
Build: Saffie – Your AI meal planning assistant
Builder: Vlad Barash
Saffie generates fully personalized meal plans, extracts structured recipes straight from TikTok or Instagram videos, turns everything into a smart grocery list, and sends the order to Instacart in one tap.
Breakthroughs
Technical or conceptual advances
Agent-orchestration SDKs: The recently released OpenAI Agents SDK lets Saffie chain recipe generation, media scraping, grocery-list creation, and Instacart checkout in a single flow.
Stronger multimodal models: GPT-4.1, Gemini 2.5 Pro, Claude 4, and Flux 1.1 Pro handle text + video frames for reliable “recipe from social clip” extraction.
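The clip-to-recipe step can be sketched as a single schema-constrained multimodal call. Shown here with the Vercel AI SDK for consistency with the earlier snippets (Saffie's own pipeline runs on the OpenAI Agents SDK); the schema, prompt, and frame sampling are assumptions:

```ts
// Sketch: structured "recipe from social clip" extraction in one multimodal call.
// Schema, prompt, model, and frame handling are hypothetical, not Saffie's pipeline.
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const RecipeSchema = z.object({
  title: z.string(),
  servings: z.number(),
  ingredients: z.array(z.object({ item: z.string(), quantity: z.string() })),
  steps: z.array(z.string()),
});

export async function recipeFromClip(frameUrls: string[], caption: string) {
  const { object } = await generateObject({
    model: openai("gpt-4.1"),
    schema: RecipeSchema,
    messages: [
      {
        role: "user",
        content: [
          { type: "text", text: `Extract the full recipe from these video frames. Caption: ${caption}` },
          ...frameUrls.map((url) => ({ type: "image" as const, image: new URL(url) })),
        ],
      },
    ],
  });
  return object; // structured recipe, ready for the grocery-list and Instacart steps downstream
}
```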
Stack
What’s under the hood
Backend: OpenAI Agents SDK
LLMs: GPT-4.1, Gemini 2.5 Pro, Claude 4
Image gen: Flux 1.1 Pro
Integration: Instacart API for grocery ordering
Still hard
What could be better
Recipe + image generation averages ~10 s—faster model latency would tighten the loop.
Chat works, but the team is exploring more intuitive, non-chat UX better suited for meal planning.
Unlocks
What those breakthroughs make possible
Social-to-recipe pipeline – Any TikTok or Instagram food video can become a structured recipe in one step.
Instant grocery cart – The grocery list pipes straight into Instacart, so everything is ready to check out.
Truly personal menus – Recipes adapt to individual tastes and dietary preferences, not generic meal plans.
End-to-end agent flow – One agent orchestrates recipe generation, media extraction, list building, and ordering—no manual hops in between.
Try it
Download the app here. The first 25 people who email hello@saffie.ai get a free month in exchange for a 15-minute feedback call!