Use GPT-5 with CFG for effect generation
Context
The original architecture planned a three-stage pipeline: natural language → intents (via Apple Foundation Models) → effects (via rule interpreter) → trip modifications.
This failed for two reasons. First, Foundation Models didn't meet our accuracy requirements for intent extraction at the time of evaluation. Second, trip mutations need trip context — "make it two weeks" requires knowing the current duration, "add Prague" requires understanding existing destinations. We needed both reliable language understanding AND context-awareness, which meant switching to a cloud model.
Decision
Use GPT-5 with Context-Free Grammar (CFG) constraints to generate effects directly from natural language, bypassing intent extraction entirely.
- Natural language + current trip context → GPT-5 with CFG constraints
- GPT-5 generates structured effects guaranteed valid by the grammar
- Effects applied deterministically to produce new trip state
This works uniformly for both creation ("Vienna birthday weekend") and modification ("make it two weeks").
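To make the "guaranteed valid by the grammar" step concrete, here is a minimal sketch of what an effect grammar and its validation might look like. The effect names (`set_duration`, `add_destination`, `remove_destination`) and the line-oriented syntax are hypothetical illustrations, not the actual production grammar; in the real system the grammar constrains the model's decoding on the server side, whereas this sketch only checks the same language client-side.

```python
import re

# Hypothetical effect grammar, written EBNF-style for reference:
#   effects : effect ("\n" effect)*
#   effect  : "set_duration" "(" INT ")"
#           | "add_destination" "(" STRING ")"
#           | "remove_destination" "(" STRING ")"
EFFECT_RE = re.compile(
    r'^(?:set_duration\((\d+)\)'
    r'|add_destination\("([^"]+)"\)'
    r'|remove_destination\("([^"]+)"\))$'
)

def parse_effects(text: str) -> list[tuple[str, object]]:
    """Parse model output into (effect_name, argument) tuples.

    Raises ValueError for any line outside the grammar, which is the
    failure mode CFG-constrained decoding is meant to rule out entirely.
    """
    effects: list[tuple[str, object]] = []
    for line in text.strip().splitlines():
        m = EFFECT_RE.match(line.strip())
        if m is None:
            raise ValueError(f"output violates effect grammar: {line!r}")
        days, added, removed = m.groups()
        if days is not None:
            effects.append(("set_duration", int(days)))
        elif added is not None:
            effects.append(("add_destination", added))
        else:
            effects.append(("remove_destination", removed))
    return effects
```

For example, the modification "make it two weeks and add Prague" would arrive as `set_duration(14)` and `add_destination("Prague")` on separate lines, each checked against the grammar before any trip state is touched.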
Consequences
- Context-aware generation: GPT-5 considers the full trip state when interpreting modifications.
- Simpler architecture: the intent layer and rule interpreter are removed entirely.
- Guaranteed structure: CFG constraints ensure structurally valid output.
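The deterministic application step from the decision above can be sketched as a pure reducer over immutable trip state. The `Trip` shape and effect names here are hypothetical, chosen to match the illustration only; the actual trip model is assumed, not documented here.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Trip:
    """Hypothetical trip state: just enough fields for the sketch."""
    destinations: tuple[str, ...]
    duration_days: int

def apply_effects(trip: Trip, effects: list[tuple[str, object]]) -> Trip:
    """Fold grammar-validated effects into a new trip state.

    Pure and deterministic: the same trip and effects always produce
    the same result, and the input trip is never mutated.
    """
    for name, arg in effects:
        if name == "set_duration":
            trip = replace(trip, duration_days=arg)
        elif name == "add_destination":
            trip = replace(trip, destinations=trip.destinations + (arg,))
        elif name == "remove_destination":
            trip = replace(
                trip,
                destinations=tuple(d for d in trip.destinations if d != arg),
            )
        else:
            raise ValueError(f"unknown effect: {name}")
    return trip
```

Keeping this step pure is what preserves testability after dropping the rule interpreter: the model's only job is to emit effects, and everything after that point is reproducible without a network call.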
The cost is an external API dependency (with its latency and per-request charges) and reduced explainability compared to an explicit intent → rule → effect pipeline.