An AI-powered restaurant finder built with open-source components:

- FastAPI + LangGraph for orchestration
- Chainlit for chat UI
- SearchAPI for restaurant data retrieval
- SQLite for conversation memory
- Docker Compose for deployment
- OpenTelemetry + Jaeger + Prometheus + Grafana for observability

| Component | Technology | Purpose |
|---|---|---|
| API runtime | FastAPI | Streaming `/invocations` + memory snapshot `/memory` endpoints |
| Orchestration | LangGraph | Router + search agent + tool loop |
| Models | OpenAI-compatible endpoint (default Groq) | Router/extraction/response generation |
| Search tool | SearchAPI HTTP | Structured restaurant results |
| Memory | SQLite (local) | Conversation facts/summaries |
| UI | Chainlit | Web chat interface |
| Infra | Docker Compose | Portable self-hosted deployment |
| Observability | OTEL + Jaeger + Prometheus + Grafana | Tracing and monitoring |
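
The orchestration row above (router + search agent + tool loop) can be sketched in plain Python. The names below (`route`, `run_agent`, `search_restaurants`) and the keyword-based routing are illustrative stand-ins for the project's LLM router and LangGraph graph, not its actual code:

```python
# Illustrative sketch of the router -> search agent -> tool loop pattern.
# A real deployment delegates routing and tool-use decisions to an LLM.

def route(message: str) -> str:
    """Pick a branch for the message (stand-in for the LLM router)."""
    keywords = ("restaurant", "eat", "dinner", "lunch", "food")
    return "search" if any(k in message.lower() for k in keywords) else "chat"

def search_restaurants(query: str) -> list[dict]:
    """Stand-in for the SearchAPI tool call."""
    return [{"name": "Demo Bistro", "query": query}]

def run_agent(message: str, max_tool_calls: int = 3) -> dict:
    """Route, then loop calling the search tool until done."""
    branch = route(message)
    if branch != "search":
        return {"branch": branch, "results": []}
    results: list[dict] = []
    for _ in range(max_tool_calls):
        results.extend(search_restaurants(message))
        break  # a real agent lets the LLM decide whether to call again
    return {"branch": branch, "results": results}
```
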
```
.
├── restaurant-finder-api/
│   ├── src/
│   │   ├── application/orchestrator/
│   │   ├── domain/
│   │   ├── evaluation/
│   │   └── infrastructure/
│   ├── tests/
│   ├── Dockerfile
│   ├── Makefile
│   └── pyproject.toml
├── restaurant-finder-ui/
│   ├── app.py
│   ├── .env.example
│   └── pyproject.toml
└── restaurant-finder-infra/
    ├── docker-compose.yml
    ├── prometheus/prometheus.yml
    └── grafana/provisioning/
```
- Python 3.11+
- uv
- Docker + Docker Compose
- Search API key
- OpenAI-compatible model API key (default `GROQ_API_KEY`)
- Configure environment files:

  ```sh
  cp restaurant-finder-api/.env.example restaurant-finder-api/.env
  cp restaurant-finder-ui/.env.example restaurant-finder-ui/.env
  ```

- Set required values in `restaurant-finder-api/.env`:
  - `GROQ_API_KEY`
  - `SEARCH_API_KEY`
- Start the full stack:

  ```sh
  cd restaurant-finder-infra
  docker compose up --build
  ```

- Open services:
  - UI: http://localhost:8000
  - API health: http://localhost:8080/health
  - Jaeger: http://localhost:16686
  - Prometheus: http://localhost:9090
  - Grafana: http://localhost:3000
```sh
cd restaurant-finder-api
uv sync
uv run python -m src.main
```

The API runs on http://localhost:8080.
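
The `/invocations` endpoint streams responses over SSE, with keepalive heartbeats controlled by `API_STREAM_HEARTBEAT_SECONDS`. A minimal sketch of consuming such a stream, assuming standard SSE framing; the payload shape shown is hypothetical, not the API's actual schema:

```python
# Hedged sketch of parsing an SSE stream from the /invocations endpoint.
# Only the framing is standard SSE: "data:" lines carry payloads, and
# comment lines starting with ":" are keepalive heartbeats to skip.

def parse_sse(lines):
    """Yield data payloads from an SSE line stream, skipping
    blank separators and ": heartbeat" comment lines."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith(":"):
            continue  # keepalive or event separator
        if line.startswith("data:"):
            yield line[len("data:"):].strip()

# Sample stream with a heartbeat and two (hypothetical) token events.
stream = [
    ": heartbeat",
    'data: {"token": "Sushi"}',
    "",
    'data: {"token": " Palace"}',
]
tokens = list(parse_sse(stream))
```
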
```sh
cd restaurant-finder-ui
uv sync
chainlit run app.py
```

In the UI, click View Memory to inspect saved preferences, facts, and summaries for the current `customer_name` + `conversation_id` in a side panel.
| Variable | Required | Description |
|---|---|---|
| `GROQ_API_KEY` | Yes | Model API key |
| `GROQ_API_BASE_URL` | No | OpenAI-compatible API base URL |
| `SEARCH_API_KEY` | Yes | SearchAPI key |
| `SEARCH_API_BASE_URL` | No | Search API endpoint |
| `SEARCH_API_TIMEOUT_SECONDS` | No | Search HTTP timeout |
| `SEARCH_MAX_IN_FLIGHT_REQUESTS` | No | Max concurrent outbound search calls |
| `SEARCH_CACHE_ENABLED` | No | Enable in-process search cache |
| `ENABLE_BROWSER_TOOLS` | No | Enable optional browser exploration tools |
| `GUARDRAIL_ENABLED` | No | Enable local guardrails |
| `MEMORY_DB_PATH` | No | SQLite memory DB path |
| `MEMORY_PRELOAD_TOP_K` | No | Memory entries per category injected into context |
| `MEMORY_PRELOAD_MAX_CHARS` | No | Max memory context size per request |
| `AGENT_OBSERVABILITY_ENABLED` | No | Enable OTEL spans |
| `OTEL_EXPORTER_OTLP_ENDPOINT` | No | OTLP endpoint (e.g. Jaeger collector) |
| `AGENT_API_URL` | No | Local API URL for evaluation utilities |
| `API_WORKERS` | No | Number of API worker processes |
| `API_LIMIT_CONCURRENCY` | No | Max concurrent API requests |
| `API_MAX_PROMPT_CHARS` | No | Request prompt length cap |
| `API_STREAM_HEARTBEAT_SECONDS` | No | SSE keepalive heartbeat interval |
| `STARTUP_WARMUP_MODELS` | No | Warm model clients on startup |
| `STARTUP_WARMUP_GRAPH` | No | Precompile orchestration graph on startup |
| `MODEL_TIMEOUT_SECONDS` | No | LLM request timeout |
| `MODEL_MAX_RETRIES` | No | LLM request retry count |
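
Optional variables fall back to defaults when unset. A hedged sketch of how such settings might be read; the default values shown here are assumptions for illustration, not the project's actual defaults:

```python
import os

# Illustrative settings loader: reads the optional variables from the
# table above, falling back to assumed defaults when unset.

# Clear the demo variables so the defaults apply deterministically.
for var in ("SEARCH_API_TIMEOUT_SECONDS", "SEARCH_CACHE_ENABLED", "MODEL_MAX_RETRIES"):
    os.environ.pop(var, None)

def get_int(name: str, default: int) -> int:
    raw = os.environ.get(name)
    return int(raw) if raw else default

def get_bool(name: str, default: bool) -> bool:
    raw = os.environ.get(name)
    return raw.lower() in ("1", "true", "yes") if raw else default

settings = {
    "search_timeout": get_int("SEARCH_API_TIMEOUT_SECONDS", 10),
    "cache_enabled": get_bool("SEARCH_CACHE_ENABLED", False),
    "model_retries": get_int("MODEL_MAX_RETRIES", 2),
}
```
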
| Variable | Required | Description |
|---|---|---|
| `AGENT_API_URL` | No | API endpoint for chat invocations |
`GET /memory`

- Query params:
  - `customer_name` (required)
  - `conversation_id` (required)
  - `limit_per_category` (optional, default 10, max 50)
- Returns category buckets and totals for `preferences`, `facts`, and `summaries` across current-session + same-user scope.
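
A request URL for this endpoint can be built as follows; the clamp to 50 mirrors the documented `limit_per_category` maximum, and the base URL assumes the local API from the quickstart:

```python
from urllib.parse import urlencode

# Sketch of building a GET /memory request URL from the documented
# query params; the base URL is an assumption (local API default).

def memory_url(customer_name: str, conversation_id: str,
               limit_per_category: int = 10,
               base: str = "http://localhost:8080") -> str:
    params = {
        "customer_name": customer_name,
        "conversation_id": conversation_id,
        # clamp to the documented valid range (1..50)
        "limit_per_category": min(max(limit_per_category, 1), 50),
    }
    return f"{base}/memory?{urlencode(params)}"
```
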
The evaluation framework is OSS/local and can run against the local API endpoint.
```sh
cd restaurant-finder-api
make eval
make eval-categories CATEGORIES="basic_search dietary_search"
make eval-safety
```

- `deploy-image.yml`: runs API tests, then builds and publishes the API image to GHCR
- `deploy-infra.yml`: validates Docker Compose and the API image build
- `destroy-infra.yml`: teardown guidance for the local Docker stack
```sh
cd restaurant-finder-infra
docker compose down -v
```