Case Study

GainPath

This case study reflects the current GainPath system: a mobile-first AI health coach with unified tracking surfaces, deterministic health logic, local-first performance layers, and bounded AI applied where it adds the most value.

The production app is intentionally gated. This page is the public overview, with screenshots and architectural context. If you would like walkthrough access, contact me and I can share a guided demo.

Preview

Because the live product is gated, these screenshots give a small window into the interface while keeping this page focused on architecture decisions and system behavior.

GainPath dashboard showing daily calorie and macro gauges.


What It Does

GainPath unifies logging, workouts, reporting, profile memory, and coaching into one health platform rather than separate tools.

What users get

  • Unified product surfaces: Tracker, Workout, and Profile, plus Reporting, Day detail, and legacy helper routes.
  • A single health system where completed workouts write bridged activity events into shared daily energy totals.
  • A mobile-first experience with client-side food parsing, optimistic updates, and graceful fallback when AI is unavailable or skipped.
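The workout-to-energy bridge mentioned above can be sketched as follows. The record shapes, field names, and the `bridge_workout` helper are illustrative assumptions, not GainPath's production schema:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative records; the real GainPath schema is not public.
@dataclass
class Workout:
    day: date
    calories_burned: int

@dataclass
class ActivityEvent:
    day: date
    source: str
    calories: int

def bridge_workout(workout: Workout) -> ActivityEvent:
    """Translate a completed workout into a bridged activity event
    so it contributes to the shared daily energy totals."""
    return ActivityEvent(day=workout.day, source="workout",
                         calories=workout.calories_burned)

def daily_energy_balance(intake_kcal: int, events: list[ActivityEvent]) -> int:
    """Net daily energy = calories eaten minus bridged activity calories."""
    return intake_kcal - sum(e.calories for e in events)

event = bridge_workout(Workout(day=date(2024, 5, 1), calories_burned=400))
print(daily_energy_balance(2200, [event]))  # 1800
```

The point of the bridge is that workout history stays its own record type while still flowing into the single shared energy model.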

Core behavior

  • Chat orchestration persists structured records, recalculates totals, compares goals, and returns coaching plus missing-data prompts.
  • Deterministic macro and calorie gauge services evaluate pace, sweet-spot status, daily energy balance, and rolling 7-day patterns.
  • Workout and food histories remain intentionally separate while feeding one coherent daily energy model and reporting layer.
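A deterministic gauge of the kind described above might look like this minimal sketch. The linear pacing rule, the 10% sweet-spot band, and the function name are hypothetical assumptions, not GainPath's actual thresholds:

```python
def calorie_gauge(consumed: int, goal: int, hour: int, band: float = 0.1) -> dict:
    """Deterministic gauge: compare intake to the pace implied by the
    time of day, and classify intake against a sweet-spot band."""
    expected = goal * min(hour, 24) / 24          # assumed linear pacing
    pace = "ahead" if consumed > expected else "on-track-or-behind"
    lo, hi = goal * (1 - band), goal * (1 + band)  # assumed 10% band
    if consumed < lo:
        status = "under"
    elif consumed <= hi:
        status = "sweet-spot"
    else:
        status = "over"
    return {"expected": round(expected), "pace": pace, "status": status}

print(calorie_gauge(consumed=1500, goal=2000, hour=12))
# {'expected': 1000, 'pace': 'ahead', 'status': 'under'}
```

Because the gauge is pure arithmetic over persisted records, its output is testable and auditable without any model call.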

AI augmentation

  • One-call coach insight runs after deterministic recomputation and returns concise structured guidance.
  • Food parsing starts with a client catalog bundle and deterministic matching before user-triggered, tier-aware AI lookup is offered.
  • The database remains the source of truth, with AI supporting interpretation and language rather than replacing domain logic.
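The catalog-first parsing order can be illustrated as below. The `CATALOG` contents and the exact-match rule are simplified stand-ins for the bundled client catalog and its deterministic matcher:

```python
# Simplified stand-in for the bundled client food catalog.
CATALOG = {
    "banana": {"kcal": 105, "protein_g": 1.3},
    "oatmeal": {"kcal": 150, "protein_g": 5.0},
}

def parse_food(text: str) -> dict:
    """Deterministic catalog match first; AI lookup is only *offered*
    (never auto-run) when the catalog has no match."""
    key = text.strip().lower()
    if key in CATALOG:
        return {"source": "catalog", "food": key, **CATALOG[key]}
    return {"source": "unresolved", "food": key, "offer_ai_lookup": True}

print(parse_food("Banana"))    # resolved locally from the catalog
print(parse_food("pad thai"))  # unresolved; user may trigger AI lookup
```

Keeping the miss path as an explicit offer, rather than an automatic fallback, is what makes the AI usage user-triggered and cost-bounded.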

Operating model

  • Vercel + Cloud Run + Neon gives a practical production stack with low operational overhead.
  • Docker Compose supports full local iteration and reproducible environment setup.
  • Import endpoints and docs exposure are explicitly gated by environment to reduce production risk.

Why It Is Built This Way

These choices prioritize reliability, bounded risk, and long-term iteration speed over novelty.

Local-first product memory

The architecture uses Neon Postgres with local-first storage paths so the product can preserve user state reliably while staying responsive across web and future mobile clients. This design keeps durable memory at the center and avoids fragile, UI-only state patterns.

Feature isolation for faster iteration

The frontend is organized into route groups and feature-owned modules so changes stay localized instead of triggering broad rebuild risk. This reduces coupling across food, workout, reporting, and profile surfaces and keeps velocity high as the product grows.

Deterministic core with bounded AI

Totals, gauge status, and daily health logic run through deterministic services first. AI is used to interpret flexible input and phrase coaching output after structured calculations are complete, so essential behavior remains explainable, testable, and cost-aware.

Security and control boundaries

The app-facing boundary uses BFF patterns so backend credentials and verification stay server-side. Non-production import routes, environment-gated docs, and explicit auth boundaries keep operational risk lower while preserving development flexibility.

Guardrails for AI lookup

Food lookup is user-triggered, tier-aware, and staged before promotion into canonical food memory. Per-user rate limiting and anti-loop controls protect reliability and cost while still allowing AI-assisted enrichment when users choose it.

How The System Works

The request path is intentionally layered so access, orchestration, deterministic logic, data memory, and AI augmentation each have clear responsibilities.

1. Access and session control

User access is managed at the app boundary, with session handling separated from backend credentials and protected API concerns.

2. App-facing API boundary

The Next.js layer acts as the BFF boundary for browser traffic and backend communication, keeping token and contract handling server-side.

3. Backend orchestration

FastAPI classifies input, extracts structured facts, persists events, recalculates totals, compares goals, and prepares coaching responses in a consistent orchestration loop.
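The orchestration loop above can be sketched as one function that calls each step in order. The `deps` object, step names, and the fake services wired in below are illustrative assumptions made to show the sequencing, not the real backend services:

```python
from types import SimpleNamespace

def handle_message(text: str, deps) -> dict:
    """One pass of the orchestration loop: classify, extract, persist,
    recalculate, compare goals, then coach. Step names are illustrative."""
    kind = deps.classify(text)               # e.g. food log, workout, question
    facts = deps.extract(kind, text)         # structured facts from free text
    deps.persist(facts)                      # durable event written first
    totals = deps.recalculate(facts["day"])  # deterministic recomputation
    gaps = deps.compare_goals(totals)        # goal deltas + missing data
    return {"kind": kind, "totals": totals,
            "coaching": deps.coach(totals, gaps)}

# Minimal fake services to show the call order; real ones are backend-owned.
deps = SimpleNamespace(
    classify=lambda t: "food_log",
    extract=lambda k, t: {"day": "2024-05-01", "kcal": 500},
    persist=lambda f: None,
    recalculate=lambda day: {"kcal": 500},
    compare_goals=lambda totals: ["protein target"],
    coach=lambda totals, gaps: f"Logged. Still missing: {gaps[0]}",
)
print(handle_message("ate a burrito", deps)["coaching"])
```

The ordering is the point: persistence and deterministic recomputation complete before any coaching text is generated.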

4. Deterministic domain services

Macro gauges, calorie pacing, and workout-energy bridging are computed deterministically so key outcomes remain reliable and auditable.

5. Persistence and memory

Neon Postgres stores durable health records and agent memory, while IndexedDB-backed local-first storage improves responsiveness today and maps cleanly to planned mobile SQLite paths.

6. Optional AI assistance

AI is invoked as an augmentation layer for coaching phrasing and unresolved food estimation, with staged writes and per-user rate limits on lookup-heavy paths.
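The staged-write side of this can be sketched as two explicit steps; the dict-backed tables and function names below are illustrative stand-ins for the staging and canonical food stores:

```python
# Staged-write pattern: AI-estimated foods land in a staging store and are
# only promoted into canonical food memory by a separate, explicit step.
staging: dict[str, dict] = {}
canonical: dict[str, dict] = {}

def stage_ai_result(name: str, estimate: dict) -> None:
    """AI output never writes to the canonical store directly."""
    staging[name] = estimate

def promote(name: str) -> None:
    """Promotion is its own step (e.g. after user confirmation)."""
    canonical[name] = staging.pop(name)

stage_ai_result("pad thai", {"kcal": 620})
promote("pad thai")
print("pad thai" in canonical, "pad thai" in staging)  # True False
```

Separating the two writes keeps the database the source of truth: an unreviewed AI estimate can be discarded without ever having touched canonical state.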

Architecture Stack

Each layer was selected to keep the product understandable, testable, and practical to operate.

System architecture (diagram):

  • Frontend: Next.js App Router. Client-side food parser, session handling, app-facing boundary.
  • Backend API: FastAPI on Cloud Run. Orchestration, validation, deterministic services.
  • Data model: Neon Postgres + IndexedDB. Durable health records, local-first performance layer, source of truth.
  • Auth boundary: session isolation; backend credentials stay private.
  • AI boundary: augmentation layer; coaching and interpretation only.
  • Entitlement boundary: tier-aware control; future product packaging.
  • Key patterns: BFF (Backend for Frontend) isolates credentials; deterministic services run first and AI augments interpretation; the database is the source of truth for all state.

Frontend

Next.js App Router delivers the user experience, runs the client-side food parser against a catalog bundle, and acts as the app-facing boundary so UI evolution stays decoupled from backend domain logic.

Backend API

FastAPI on Cloud Run owns orchestration, validation, deterministic services, and API contracts for current and future clients.

Data model

Neon Postgres stores food events, activity events, profiles, goals, workouts, and daily summaries as the source of truth for health state, with local-first storage serving as a performance layer.

Auth boundary

Session and backend access concerns are isolated behind the app boundary so browser-facing code has a narrower security role.

AI boundary

AI enriches coaching output and unresolved food lookup, while deterministic services and persisted records remain the authority for totals, rules, and continuity.

Entitlement boundary

Tier-aware AI flows and staged promotion patterns support future product packaging without forcing a redesign of core domain services.

Design Outcomes

A concise summary of the current architecture's outcomes: why it is shaped this way, what it delivers, and how it behaves.

  • Why: clear boundaries reduce risk, improve maintainability, and keep AI behavior governable.
  • What: tracker, workout, reporting, profile, and day-detail flows operate as one coherent health system.
  • How: client-side parsing and deterministic services run first, then constrained AI layers add interpretation and coaching language.
  • Local-first performance layers and durable database memory improve resilience without sacrificing product velocity.
  • Feature-isolated frontend structure keeps iteration fast as capabilities expand.