OVERVIEW
Upload a photo and press IS IT WOKE? to kick off a playful, AI‑assisted reading of the image. The app hides the preview behind a pixelation veil while it thinks, then reveals a tight, two‑sentence verdict that finds an esoteric connection to Latter‑day Saint history—no login required.
USER EXPERIENCE
- Upload & Spin: Drag‑and‑drop or select a photo, then tap the spinner CTA to begin analysis; the preview is pixelated while processing.
- Multi‑Step Reveal: After analysis, a staged flow gates the result: first “WHY IS IT WOKE?”, then “HOW DO I ‘DO THE WORK?’”—a light, comedic rhythm that guides the user.
- Concise Verdict: The AI’s response is constrained to exactly two sentences, keeping outcomes crisp, quotable, and on‑brand.
- No‑Friction Sessions: Visitors are tracked with a localStorage session ID for a seamless, anonymous experience (a minimal sketch follows this list).
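A minimal sketch of how the anonymous session could be established on first visit; the storage key and helper name are illustrative, not the app's actual identifiers:

```ts
// Returns a stable anonymous session ID, creating one on first visit.
// SESSION_KEY and getSessionId are illustrative names.
const SESSION_KEY = "wokeSessionId";

export function getSessionId(): string {
  const existing = localStorage.getItem(SESSION_KEY);
  if (existing) return existing;

  // crypto.randomUUID() is available in all modern browsers.
  const id = crypto.randomUUID();
  localStorage.setItem(SESSION_KEY, id);
  return id;
}
```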
HOW IT WORKS
- FRONTEND: React 19 + TypeScript + Vite, styled with Tailwind. UI primitives include a pixelation overlay and spinner‑based CTA.
- BACKEND: Convex handles file storage and data access. On upload, a record is created and a scheduled action runs ~2s later to perform the analysis (see the scheduler sketch after this list).
- AI: OpenAI Vision (via gpt‑4o chat‑completions) receives a base64 data URI of the image plus a strict prompt that demands exactly two sentences, with the response capped at about 70 tokens (see the request sketch after this list).
- OBSERVABILITY: A small structured logging helper emits lifecycle events (upload, schedule, OpenAI request, save, and error) for traceability (sketched after this list).
- DATA MODEL: In addition to photos, the backend includes curated tables (music, films, TV, fiction, non‑fiction, podcasts, architecture, visual art) that can power playful, mad‑lib style suggestions (schema sketched after this list).
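The upload‑to‑analysis handoff in Convex looks roughly like this; the table, field, and function names are assumptions for illustration, not the project's actual code:

```ts
// convex/photos.ts (illustrative names throughout)
import { mutation } from "./_generated/server";
import { internal } from "./_generated/api";
import { v } from "convex/values";

export const createPhoto = mutation({
  args: { storageId: v.id("_storage"), sessionId: v.string() },
  handler: async (ctx, args) => {
    // Persist a record pointing at the uploaded file.
    const photoId = await ctx.db.insert("photos", {
      storageId: args.storageId,
      sessionId: args.sessionId,
      status: "pending",
    });

    // Schedule the vision analysis to run ~2 seconds later.
    await ctx.scheduler.runAfter(2000, internal.analyze.run, { photoId });

    return photoId;
  },
});
```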
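The vision request itself is a standard gpt‑4o chat‑completions call with an image part and a low token cap; the prompt text here is a stand‑in for the real one:

```ts
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// dataUri: the uploaded photo as a base64 data URI, e.g. "data:image/jpeg;base64,..."
export async function analyzePhoto(dataUri: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    max_tokens: 70, // keeps the verdict tight
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            // Stand-in prompt; the real one enforces the two-sentence rule
            // and the Latter-day Saint history angle.
            text: "In exactly two sentences, deliver a verdict on this image.",
          },
          { type: "image_url", image_url: { url: dataUri } },
        ],
      },
    ],
  });

  return completion.choices[0].message.content ?? "";
}
```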
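The logging helper is deliberately tiny: one JSON line per lifecycle event. The event names below are assumed for illustration:

```ts
// Illustrative structured logger; event names are assumptions.
type LifecycleEvent =
  | "photo.uploaded"
  | "analysis.scheduled"
  | "openai.request"
  | "verdict.saved"
  | "analysis.error";

export function logEvent(
  event: LifecycleEvent,
  fields: Record<string, unknown> = {}
): void {
  console.log(JSON.stringify({ event, ts: Date.now(), ...fields }));
}

// Example: logEvent("analysis.scheduled", { photoId: "...", delayMs: 2000 });
```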
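The curated tables sit alongside photos in the Convex schema; the field shapes below are guesses meant only to show the idea:

```ts
// convex/schema.ts (field names are illustrative)
import { defineSchema, defineTable } from "convex/server";
import { v } from "convex/values";

export default defineSchema({
  photos: defineTable({
    storageId: v.id("_storage"),
    sessionId: v.string(),
    status: v.string(),
    verdict: v.optional(v.string()),
  }),
  // Curated culture tables that can feed mad-lib suggestions.
  music: defineTable({ title: v.string(), creator: v.string() }),
  films: defineTable({ title: v.string(), creator: v.string() }),
  podcasts: defineTable({ title: v.string(), creator: v.string() }),
  // ...plus TV, fiction, non-fiction, architecture, and visual art.
});
```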
WHAT MAKES IT DIFFERENT
- Constraint‑as‑Style: A hard two‑sentence rule turns LLM output into punchy, repeatable micro‑criticism—less rambling, more delight.
- Staged Dramaturgy: The pixelated suspense and two‑step reveal (“WHY?” then “HOW?”) make the punchline land reliably.
- Zero‑Friction Onboarding: No auth, instant sessions—users are in the experience within seconds.
- Serverless Orchestration: Convex scheduler sequences storage → vision analysis → persistence with robust error mapping for timeouts, rate limits, and quota exhaustion (see the sketch after this list).
- Cultural Texture: A dedicated Mormon culture dataset foundation supports optional comedic “media homework” riffs.
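A sketch of the error‑mapping idea: the 429 status and insufficient_quota code are what the OpenAI SDK actually surfaces, while the helper name and user‑facing copy are invented here:

```ts
import OpenAI from "openai";

// Maps OpenAI failures to user-facing messages (illustrative copy).
export function mapOpenAIError(err: unknown): string {
  if (err instanceof OpenAI.APIConnectionTimeoutError) {
    return "The analysis took too long. Please retry.";
  }
  if (err instanceof OpenAI.APIError && err.status === 429) {
    // 429 covers both rate limiting and an exhausted quota.
    return err.code === "insufficient_quota"
      ? "The analysis budget is spent for now. Check back later."
      : "The critics are overwhelmed. Try again in a moment.";
  }
  return "Something went wrong while judging your photo.";
}
```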
MY DESIGN APPROACH
- Concept, interaction design, tone, and micro‑copy for the reveal flow
- Frontend engineering (React), pixelation and spinner UX, staged UI
- Backend functions, real‑time data model, scheduled actions, and prompt engineering with strict constraints
- Observability & reliability: structured logs, defensive error handling, and safe image handling
OUTCOME
A shareable, tongue‑in‑cheek computer‑vision experiment that’s fast, anonymous, and production‑friendly—deployed to wokeornotwoke.org and wokeornotwoke.com, with room for extensions (mad‑lib suggestions, galleries, analytics).
TECH STACK
- React 19, TypeScript, Vite, Tailwind CSS
- Convex (Real‑Time DB / Storage / Serverless Functions / Scheduler / Anonymous Sessions)
- OpenAI Vision via gpt‑4o chat‑completions (image + text)
- Cloudflare Pages (frontend deployment)