
Use GPT-5 with CFG for effect generation

ADR-0032 ACCEPTED · 2025-09-03
Use GPT-5 with CFG for direct effect generation instead of intent extraction

Context

The original architecture planned three layers: natural language → intents (via Apple Foundation Models) → effects (via rule interpreter) → trip modifications.

This failed for two reasons. First, Foundation Models didn't meet our accuracy requirements for intent extraction at the time of evaluation. Second, trip mutations need trip context — "make it two weeks" requires knowing the current duration, "add Prague" requires understanding existing destinations. We needed both reliable language understanding AND context-awareness, which meant switching to a cloud model.

Decision

Use GPT-5 with Context-Free Grammar (CFG) constraints to generate effects directly from natural language, bypassing intent extraction entirely.

  1. Natural language + current trip context → GPT-5 with CFG constraints
  2. GPT-5 generates structured effects guaranteed valid by the grammar
  3. Effects applied deterministically to produce new trip state
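
The ADR doesn't specify the grammar itself. As an illustration only — effect names and fields below are hypothetical, not the actual schema — a constraint grammar over a JSON effect list might look like:

```ebnf
effects      ::= "[" effect ("," effect)* "]"
effect       ::= set_duration | add_destination | remove_destination
set_duration ::= "{" "\"type\":\"set_duration_days\"" "," "\"days\":" integer "}"
add_destination    ::= "{" "\"type\":\"add_destination\"" "," "\"city\":" string "}"
remove_destination ::= "{" "\"type\":\"remove_destination\"" "," "\"city\":" string "}"
```

Because decoding is constrained to this grammar, the model cannot emit an unknown effect type or a malformed field, which is what makes step 3 safe to apply without defensive parsing.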

This works uniformly for both creation ("Vienna birthday weekend") and modification ("make it two weeks").
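Step 3 — deterministic application — can be sketched in a few lines. The trip shape (a dict with `destinations` and `duration_days`) and the two effect types are illustrative assumptions, not the real schema:

```python
# Sketch of the deterministic effect applier (step 3). The trip shape and
# effect names are illustrative assumptions, not the actual schema.

def apply_effects(trip, effects):
    """Apply grammar-validated effects to a trip state, returning a new state."""
    new_trip = {
        "destinations": list(trip.get("destinations", [])),
        "duration_days": trip.get("duration_days"),
    }
    for effect in effects:
        if effect["type"] == "set_duration_days":
            new_trip["duration_days"] = effect["days"]
        elif effect["type"] == "add_destination":
            if effect["city"] not in new_trip["destinations"]:
                new_trip["destinations"].append(effect["city"])
        else:
            # Unreachable if the CFG did its job, but fail loudly anyway.
            raise ValueError(f"unknown effect type: {effect['type']}")
    return new_trip

# Creation: effects applied to an empty trip ("Vienna birthday weekend").
created = apply_effects({}, [
    {"type": "add_destination", "city": "Vienna"},
    {"type": "set_duration_days", "days": 3},
])

# Modification: the same machinery applied to existing state ("make it two weeks").
modified = apply_effects(created, [{"type": "set_duration_days", "days": 14}])
```

The uniformity claim falls out of the design: creation is just effect application against an empty state, so there is no separate code path to maintain.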

Consequences

Context-aware generation — GPT-5 considers the full trip state when interpreting modifications. Simpler architecture — the intent layer and rule interpreter are removed entirely. Guaranteed structure — CFG constraints ensure every generated effect is parseable as valid output.

The costs are an external API dependency (latency and per-request charges) and reduced explainability compared to an explicit intent → rule → effect pipeline. Note also that CFG constraints guarantee only structural validity; semantically wrong effects (e.g. removing a destination the trip doesn't contain) must still be handled at application time.