AI-augmented project management
Decompose an epic into estimated, assigned sub-issues in real time. Three LangGraph workers. Structured LLM output. Zero magic strings.
Stack
What makes it non-trivial
LangGraph Supervisor
Decomposer → Estimator → Assigner pipeline. TypedDict state, conditional edges, AsyncPostgresSaver checkpointing — partial runs survive failures.
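A minimal pure-Python sketch of the three-worker hand-off: a `TypedDict` state threaded through Decomposer → Estimator → Assigner. The worker bodies, field names, and hard-coded values here are placeholders standing in for the real LLM-backed LangGraph nodes and checkpointed edges.

```python
from typing import TypedDict

class RunState(TypedDict, total=False):
    """Shared state threaded through the worker pipeline."""
    epic: str
    sub_issues: list[dict]

def decomposer(state: RunState) -> RunState:
    # Split the epic into sub-issue stubs (an LLM call in the real graph).
    state["sub_issues"] = [{"title": t} for t in ("design schema", "build API")]
    return state

def estimator(state: RunState) -> RunState:
    # Attach a point estimate to each sub-issue (also an LLM call in practice).
    for issue in state["sub_issues"]:
        issue["points"] = 3
    return state

def assigner(state: RunState) -> RunState:
    # Route each estimated sub-issue to a team member.
    for issue in state["sub_issues"]:
        issue["assignee"] = "alice"
    return state

def run_pipeline(epic: str) -> RunState:
    state: RunState = {"epic": epic}
    for worker in (decomposer, estimator, assigner):
        # LangGraph would persist state after each node via AsyncPostgresSaver,
        # which is what lets a partial run resume after a failure.
        state = worker(state)
    return state
```

In the real graph each arrow is a checkpointed edge, so a crash between Estimator and Assigner resumes from the last saved state instead of re-running the whole epic.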
Real-time SSE Streaming
astream_events yields per-worker proposals. The UI updates incrementally — users don't wait for the full pipeline to finish.
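A sketch of the framing side of that stream, assuming each worker emits `(worker, proposal)` tuples (in the real app these come from `astream_events` on the compiled graph). The event name `proposal` is an illustrative choice, not a fixed API.

```python
import json
from typing import Iterator

def sse_event(worker: str, proposal: dict) -> str:
    """Format one per-worker proposal as a Server-Sent Events frame."""
    payload = json.dumps({"worker": worker, "proposal": proposal})
    return f"event: proposal\ndata: {payload}\n\n"

def stream_run(proposals: Iterator[tuple[str, dict]]) -> Iterator[str]:
    # Yield a frame per proposal so the UI can render each worker's
    # output the moment it arrives, not after the pipeline finishes.
    for worker, proposal in proposals:
        yield sse_event(worker, proposal)
```

The blank line terminating each frame is what SSE clients use to delimit events, so the browser's `EventSource` fires once per worker proposal.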
Structured LLM Output
ChatAnthropic.with_structured_output() enforces Pydantic schemas on every agent response. No JSON post-processing, no hallucinated shapes.
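A stdlib stand-in for the guarantee that provides: every response must match the declared schema or the call fails loudly. The real app gets this from `with_structured_output()` with a Pydantic model, which also coerces types; this dataclass sketch only checks the key set.

```python
import json
from dataclasses import dataclass, fields

@dataclass
class SubIssue:
    title: str
    points: int
    assignee: str

def parse_sub_issue(raw: str) -> SubIssue:
    """Parse one LLM response and reject any shape drift."""
    data = json.loads(raw)
    expected = {f.name for f in fields(SubIssue)}
    if set(data) != expected:
        raise ValueError(f"schema mismatch: got {sorted(data)}")
    return SubIssue(**data)
```

Either way, downstream code only ever sees a typed `SubIssue`, never a raw dict it has to defensively probe.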
Self-referential Data Model
Issues have parent_id → issues.id (nullable). Epics are null-parent issues. The decomposition feature is additive — zero schema changes.
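A sketch of why that model is additive: one pass over flat rows groups issues by `parent_id`, and the `None` bucket falls out as the epics. Row shapes here are illustrative.

```python
from collections import defaultdict

def group_by_parent(rows: list[dict]) -> dict:
    """Map parent_id -> child issue ids; the None bucket is the epics."""
    children: dict = defaultdict(list)
    for row in rows:
        children[row["parent_id"]].append(row["id"])
    return dict(children)
```

Decomposition just inserts rows whose `parent_id` points at the epic; no new table, no schema migration.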
CockroachDB + Redis
UUID primary keys avoid distributed write hot-spots. REGIONAL BY ROW-ready for multi-region deployment with no schema migration. Redis handles sessions and caching.
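The hot-spot point in one line of Python: v4 UUIDs are uniformly random, so consecutive inserts scatter across key ranges instead of piling onto the node that owns the tail of a sequential key.

```python
import uuid

def new_issue_id() -> str:
    # uuid4 is random, not monotonic: back-to-back inserts land on
    # different CockroachDB ranges rather than one hot range.
    return str(uuid.uuid4())
```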
Observability First
structlog JSON logs with consistent fields (run_id, worker, latency_ms, tokens). OpenTelemetry spans on every request. Multi-stage non-root Docker.
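A stdlib sketch of the logging contract, assuming the real app binds these fields via structlog processors: every event carries the same machine-parseable keys, so one query filters a whole run across workers.

```python
import json

def log_event(run_id: str, worker: str, latency_ms: float,
              tokens: int, **extra) -> str:
    """Emit one JSON log line with the consistent field set."""
    record = {"run_id": run_id, "worker": worker,
              "latency_ms": latency_ms, "tokens": tokens, **extra}
    return json.dumps(record, sort_keys=True)
```

With `run_id` on every line and an OpenTelemetry span per request, a slow decomposition is traceable from HTTP entry to the individual worker that burned the tokens.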
Explore the Technical Panel
Timed Q&A with LLM grading, model hints, and panel follow-ups.