# Docker Integration Test Strategy
This document covers the Docker-backed integration suite for the full run lifecycle inside the containerized stack.
## Objectives
- prove that the API, worker, beat, PostgreSQL, Redis, and the local OpenAI-compatible test gateway cooperate correctly
- validate run creation, execution, scoring, retries, and SSE updates
- keep fixtures deterministic and reusable in CI and local debugging
## Quick Start
The helper:

- creates or reuses `.env.integration`
- starts `postgres`, `redis`, the integration-only local OpenAI-compatible gateway, `celery-worker`, and optional `celery-beat`
- runs migrations
- seeds deterministic provider and dataset fixtures
- executes `pytest -m integration`
- collects logs and coverage under `.artifacts/docker-integration/`
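The orchestration above can be sketched as a helper function that assembles the `docker compose` invocation. This is illustrative only: the actual helper script is not shown in this document, and the `with_beat` flag and exact flag ordering are assumptions; the service and file names come from the sections above.

```python
from typing import List


def compose_up_command(with_beat: bool = False) -> List[str]:
    """Build a `docker compose up` invocation for the integration stack.

    Hypothetical sketch: the real helper script is not reproduced in this
    document. Service names follow the Topology section; `celery-beat` is
    optional, matching the Quick Start list.
    """
    services = ["postgres", "redis", "fake-openai", "celery-worker", "backend"]
    if with_beat:
        services.append("celery-beat")
    return [
        "docker", "compose",
        "-f", "docker-compose.integration.yml",
        "--env-file", ".env.integration",
        "up", "-d", *services,
    ]
```

Running the returned list through `subprocess.run` would bring the stack up detached; migrations, fixture seeding, and `pytest -m integration` follow as separate steps.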
## Topology
- `postgres`
- `redis`
- local OpenAI-compatible test gateway from `docker-compose.integration.yml` (compose service: `fake-openai`)
- `celery-worker`
- optional `celery-beat`
- backend test container
## Scenario Matrix
## Test Gateway Notes
Integration runs do not depend on external API keys. Instead, the stack seeds a local OpenAI-compatible test provider. The app still uses the normal LiteLLM client path; only the provider endpoint is local and deterministic.
Model-name conventions:

- `gpt-3.5-turbo` succeeds
- `*-fail-once-*` fails once, then succeeds
- `*-always-fail-*` fails every time
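The three conventions above can be expressed as a small dispatch on the model name. This is a sketch of the behavior contract, not the gateway's actual implementation (which this document does not show); the function name, the `attempt` counter, and the error messages are assumptions.

```python
import fnmatch


def gateway_response(model: str, attempt: int) -> str:
    """Simulate the test gateway's model-name conventions.

    `attempt` is 1-based: a `*-fail-once-*` model fails on the first
    attempt and succeeds on retries, which is what the retry tests rely on.
    """
    if fnmatch.fnmatch(model, "*-always-fail-*"):
        raise RuntimeError("simulated permanent provider error")
    if fnmatch.fnmatch(model, "*-fail-once-*") and attempt == 1:
        raise RuntimeError("simulated transient provider error")
    return "ok"
```

Because failure is keyed purely on the model name and attempt count, every scenario is deterministic and reproducible across CI and local runs.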
## Local Developer Workflow
Use `docker compose logs backend`, `docker compose logs celery-worker`, and the collected artifacts under `.artifacts/docker-integration/` when debugging failures.
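When sifting the collected worker logs, a small scanner can surface failed tasks quickly. This is a hypothetical helper, not part of the suite; the regex assumes Celery's usual `Task name[id] raised ...` failure line, and the `.log` glob under the artifact directory is an assumption about how logs are collected.

```python
import re
from pathlib import Path

# Assumed Celery worker failure line, e.g.:
#   Task app.tasks.execute_run[3f9c1a] raised unexpected: ValueError('boom')
FAILURE_RE = re.compile(r"Task (\S+)\[([0-9a-f-]+)\] raised")


def failed_tasks(log_text: str):
    """Return (task_name, task_id) pairs for failures found in log text."""
    return FAILURE_RE.findall(log_text)


def scan_artifacts(root: str = ".artifacts/docker-integration"):
    """Scan collected *.log files for task failures (paths are assumptions)."""
    hits = []
    for path in Path(root).glob("*.log"):
        hits.extend(failed_tasks(path.read_text()))
    return hits
```

Pointing `scan_artifacts` at the artifact directory after a failed CI run narrows the search to the specific task IDs worth grepping for in the full logs.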
