Case Study

GainPath

This is the latest architecture story from the HealthLog docs, organized around why the design choices were made, what the product currently delivers, and how each request flows through deterministic services and bounded AI.

The production app is intentionally gated. This page is the public overview, with screenshots and architectural context. If you would like walkthrough access, contact me and I can share a guided demo.

Preview

Because the live product is gated, these screenshots give a small window into the interface while keeping this page focused on architecture decisions and system behavior.

GainPath dashboard showing daily calorie and macro gauges.


What It Does

GainPath unifies logging, workouts, reporting, profile memory, and coaching into one health platform rather than separate tools.

What users get

  • Unified product surfaces: Tracker, Workout, Profile, plus Food, Reporting, and Day views.
  • A single health system where workout completion writes bridged activity events into shared energy totals.
  • A mobile-first experience that still works when AI paths are unavailable or intentionally skipped.

Core behavior

  • Chat orchestration persists structured records, recalculates totals, compares goals, and returns coaching plus missing-data prompts.
  • Deterministic macro and calorie gauge services evaluate pace, daily status, and rolling 7-day patterns.
  • Workout and food histories remain intentionally separate while feeding one coherent daily energy model.
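The deterministic gauge behavior above can be illustrated with a minimal sketch. All names and thresholds here are hypothetical, not the production code; the point is that pace status and rolling patterns are plain arithmetic, not AI calls:

```python
from statistics import mean

def daily_status(consumed_kcal: float, goal_kcal: float) -> str:
    """Classify today's intake against the goal; thresholds are illustrative."""
    ratio = consumed_kcal / goal_kcal
    if ratio < 0.9:
        return "under"
    if ratio <= 1.1:
        return "on_track"
    return "over"

def rolling_week_average(daily_totals: list[float]) -> float:
    """Average over the most recent 7 days (or fewer if history is short)."""
    window = daily_totals[-7:]
    return mean(window) if window else 0.0
```

Because these are pure functions of persisted records, their outputs are testable and auditable independently of any AI layer.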

AI augmentation

  • One-call coach insight runs after deterministic recomputation and returns concise structured guidance.
  • Food parsing supports item-level classification and tier-aware follow-up without forcing automatic lookup behavior.
  • The database remains the source of truth, with AI supporting interpretation and language rather than replacing domain logic.

Operating model

  • Vercel + Cloud Run + Neon gives a practical production stack with low operational overhead.
  • Docker Compose supports full local iteration and reproducible environment setup.
  • Import endpoints and docs exposure are explicitly gated by environment to reduce production risk.

Why It Is Built This Way

These choices prioritize reliability, bounded risk, and long-term iteration speed over novelty.

Local-first product memory

The architecture uses Neon Postgres with local-first storage paths so the product can preserve user state reliably while staying responsive across web and future mobile clients. This design keeps durable memory at the center and avoids fragile, UI-only state patterns.

Feature isolation for faster iteration

The frontend is organized into route groups and feature-owned modules so changes stay localized instead of triggering broad rebuild risk. This reduces coupling across food, workout, reporting, and profile surfaces and keeps velocity high as the product grows.

Deterministic core with bounded AI

Totals, gauge status, and daily health logic run through deterministic services first. AI is used to interpret flexible input and phrase coaching output after structured calculations are complete, so essential behavior remains explainable, testable, and cost-aware.
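A sketch of that ordering, with hypothetical names and the AI call stubbed as an injectable function: deterministic services produce a structured summary first, the coaching layer only phrases it, and a plain-text fallback keeps the product working when AI is unavailable or skipped:

```python
from typing import Callable, Optional

def coach_message(totals: dict, goal: dict,
                  phrase: Optional[Callable[[dict], str]] = None) -> str:
    """Deterministic comparison first; AI (if provided) only phrases the summary."""
    summary = {
        "remaining_kcal": goal["kcal"] - totals["kcal"],
        "protein_gap_g": max(0, goal["protein_g"] - totals["protein_g"]),
    }
    if phrase is not None:
        return phrase(summary)  # bounded AI: sees only the structured summary
    # Fallback path: behavior stays usable with no AI call at all.
    return (f"{summary['remaining_kcal']} kcal remaining; "
            f"{summary['protein_gap_g']} g protein to go.")
```

Because the AI sees only the computed summary, it cannot alter totals or goal logic, only the language around them.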

Security and control boundaries

The app-facing boundary uses BFF patterns so backend credentials and verification stay server-side. Non-production import routes, environment-gated docs, and explicit auth boundaries keep operational risk lower while preserving development flexibility.

Guardrails for AI lookup

Food lookup is user-triggered, tier-aware, and staged before promotion into canonical food memory. Per-user rate limiting and anti-loop controls protect reliability and cost while still allowing AI-assisted enrichment when users choose it.

How The System Works

The request path is intentionally layered so access, orchestration, deterministic logic, data memory, and AI augmentation each have clear responsibilities.

1. Access and session control

User access is managed at the app boundary, with session handling separated from backend credentials and protected API concerns.

2. App-facing API boundary

The Next.js layer acts as the BFF boundary for browser traffic and backend communication, keeping token and contract handling server-side.

3. Backend orchestration

FastAPI classifies input, extracts structured facts, persists events, recalculates totals, compares goals, and prepares coaching responses in a consistent orchestration loop.
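That loop can be summarized as a sequence of steps. This is a simplified sketch with in-memory state and hypothetical names, not the FastAPI implementation; real parsing is AI-assisted rather than a string split:

```python
def handle_chat_turn(message: str, store: dict, goal_kcal: float) -> dict:
    """Classify -> extract -> persist -> recalculate -> compare -> respond."""
    # 1. Classify the input and extract a structured fact.
    name, kcal = message.rsplit(" ", 1)
    event = {"food": name, "kcal": float(kcal)}
    # 2. Persist the event (stands in for a Postgres write).
    store.setdefault("events", []).append(event)
    # 3. Recalculate totals deterministically from persisted records.
    total = sum(e["kcal"] for e in store["events"])
    # 4. Compare against the goal and prepare the coaching response.
    return {"total_kcal": total, "remaining_kcal": goal_kcal - total}
```

The key property is that totals are always recomputed from persisted records, so the response reflects the database rather than any in-flight state.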

4. Deterministic domain services

Macro gauges, calorie pacing, and workout-energy bridging are computed deterministically so key outcomes remain reliable and auditable.
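Workout-energy bridging, for example, can be expressed as pure functions: completing a workout emits a bridged activity event that folds into the shared daily energy total. Field names here are hypothetical:

```python
def bridge_workout(workout: dict) -> dict:
    """Translate a completed workout into a bridged activity event."""
    return {"kind": "activity",
            "kcal_burned": workout["kcal_burned"],
            "source": f"workout:{workout['id']}"}

def daily_energy(food_kcal: float, activity_events: list[dict]) -> float:
    """Net daily energy: intake minus bridged workout burn."""
    return food_kcal - sum(e["kcal_burned"] for e in activity_events)
```

Tagging each bridged event with its source workout keeps the histories separate while still feeding one daily energy model.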

5. Persistence and memory

Neon Postgres stores durable health records and agent memory, while local-first storage paths improve resilience and product responsiveness across clients.

6. Optional AI assistance

AI is invoked as an augmentation layer for interpretation and phrasing, with staged writes and per-user rate limits on lookup-heavy paths.
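The staged-write pattern can be sketched as a two-step commit: AI lookup output lands in a staging area and is promoted into canonical food memory only on user confirmation. The shapes below are hypothetical:

```python
def stage_lookup(staging: dict, food_id: str, ai_result: dict) -> None:
    """AI lookup results are staged, never written directly to canon."""
    staging[food_id] = ai_result

def promote(staging: dict, canon: dict, food_id: str) -> bool:
    """Move a staged entry into canonical food memory on user confirmation."""
    if food_id not in staging:
        return False
    canon[food_id] = staging.pop(food_id)
    return True
```

Separating staging from canon means a bad AI result can be discarded without ever touching the records that totals are computed from.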

Architecture Stack

Each layer was selected to keep the product understandable, testable, and practical to operate.

Frontend

Next.js App Router delivers the user experience and acts as the app-facing boundary so UI evolution stays decoupled from backend domain logic.

Backend API

FastAPI on Cloud Run owns orchestration, validation, deterministic services, and API contracts for current and future clients.

Data model

Neon Postgres stores food events, activity events, profiles, goals, workouts, and daily summaries as the source of truth for health state.
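As a rough sketch of the record shapes involved (field names are illustrative, not the actual schema):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class FoodEvent:
    user_id: str
    day: date
    description: str
    kcal: float

@dataclass
class DailySummary:
    user_id: str
    day: date
    total_kcal: float = 0.0
    activity_kcal: float = 0.0
    events: list = field(default_factory=list)
```

Summaries are derived from the underlying events, so they can always be rebuilt from the source-of-truth records.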

Auth boundary

Session and backend access concerns are isolated behind the app boundary so browser-facing code has a narrower security role.

AI boundary

AI enriches parsing and coaching output, while deterministic services and persisted records remain the authority for totals, rules, and continuity.

Entitlement boundary

Tier-aware AI flows and staged promotion patterns support future product packaging without forcing a redesign of core domain services.

Design Outcomes

A concise summary of the why, what, and how outcomes of the current architecture.

  • Why: clear boundaries reduce risk, improve maintainability, and keep AI behavior governable.
  • What: food, workout, reporting, profile, and daily summary flows operate as one coherent health system.
  • How: deterministic services run first, then constrained AI layers add interpretation and coaching language.
  • Local-first storage and durable database memory improve resilience without sacrificing product velocity.
  • Feature-isolated frontend structure keeps iteration fast as capabilities expand.