Why instrument with Brixo?
- Frictionless integration: send raw OTEL spans or point your OpenAI calls to our proxy (see the proxy sketch after this list).
- Readable step timelines: we infer steps automatically; optionally add semantic spans via Traceloop/OpenLLMetry.
- Business‑grade analytics: tasks, steps, outcomes, sentiment, cost, latency, and more. Check out our Demo Agent Analytics Dashboard.
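If you already use the OpenAI Python SDK, the proxy path can be as small as overriding the client's base URL. A minimal sketch, assuming a hypothetical proxy endpoint and an illustrative model name; the real URL and any required headers are documented at /http/openai-proxy:

```python
# Minimal sketch: route existing OpenAI SDK calls through the Brixo proxy.
# The proxy URL below is an assumption -- see /http/openai-proxy for the real endpoint.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://proxy.brixo.example/v1",  # hypothetical proxy endpoint
)

# Requests pass through the proxy unchanged, so Brixo can record them
# without any further code changes.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```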
Choose your path
- Zero‑code (HTTP) · Use the OpenAI Proxy for files/batches/requests. → /http/openai-proxy
- Auto‑instrument (OTEL) · Keep your SDK; emit OpenAI/HTTP/tool spans (sketch after this list). → /ingestion/overview
- Optional semantic steps · Add Traceloop/OpenLLMetry to label workflows/tasks. → /ingestion/frameworks
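For the auto‑instrument path, the standard OpenTelemetry Python setup is enough to start emitting spans. A minimal sketch, assuming a hypothetical ingestion endpoint and API‑key header; the real values are listed in /ingestion/overview:

```python
# Minimal sketch: export OTEL spans via OTLP/HTTP to a Brixo ingestion endpoint.
# The endpoint and header names are assumptions -- see /ingestion/overview.
import os

from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.requests import RequestsInstrumentor
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider(resource=Resource.create({"service.name": "my-agent"}))
provider.add_span_processor(
    BatchSpanProcessor(
        OTLPSpanExporter(
            endpoint="https://ingest.brixo.example/v1/traces",        # hypothetical endpoint
            headers={"x-brixo-api-key": os.environ["BRIXO_API_KEY"]},  # hypothetical header
        )
    )
)
trace.set_tracer_provider(provider)

# Auto-instrument outgoing HTTP calls; library-specific instrumentors
# (e.g. for the OpenAI SDK) follow the same .instrument() pattern.
RequestsInstrumentor().instrument()
```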
You can combine the OTEL and semantic‑steps paths: raw spans for depth, semantic steps for a readable story, as in the sketch below.
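A minimal sketch of the combined setup using the Traceloop SDK (OpenLLMetry), which labels workflows and tasks on top of the raw spans. The app name and endpoint are assumptions; see /ingestion/frameworks for the supported configuration:

```python
# Minimal sketch: semantic workflow/task labels via Traceloop (OpenLLMetry).
# The app name and endpoint are assumptions -- see /ingestion/frameworks.
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import task, workflow

Traceloop.init(
    app_name="support-agent",                     # hypothetical app name
    api_endpoint="https://ingest.brixo.example",  # hypothetical ingestion endpoint
)

@task(name="classify_ticket")
def classify_ticket(text: str) -> str:
    # ... call your model here; the raw OpenAI/HTTP spans still flow as usual
    return "billing"

@workflow(name="handle_ticket")
def handle_ticket(text: str) -> str:
    # The workflow/task decorators add readable step labels around the raw spans.
    return classify_ticket(text)

handle_ticket("I was charged twice this month.")
```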