Goal: In under 10 minutes, send rich agent traces from your Python LangChain/LangGraph app to Brixo using both OpenLLMetry (Traceloop) and OpenInference.

⚡️ TL;DR

1. Set env vars

export OTEL_EXPORTER_OTLP_TRACES_HEADERS="Brixo-Auth=Bearer <BRIXO_API_KEY>"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel.brixo.com:4318"

2. Install required SDKs

pip install opentelemetry-sdk opentelemetry-exporter-otlp traceloop-sdk openinference-instrumentation-langchain

3. Run your agent

python my_agent.py
That’s it — your traces are flowing to Brixo.

🧩 Prerequisites

  • Python 3.9+
  • Brixo account + API key
  • Internet access from your app to Brixo ingest (OTLP/HTTP)
Export environment variables:
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="Brixo-Auth=Bearer <BRIXO_API_KEY>"
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel.brixo.com:4318"
Brixo accepts OTLP over HTTP. Authentication is via the Brixo-Auth header.
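If you prefer to configure these values from application code instead of the shell, a minimal sketch (`brixo_otlp_env` is a hypothetical helper, not part of any SDK; the variable names, header name, and endpoint are the ones from this guide):

```python
import os

# Hypothetical helper: builds the two environment variables this guide uses,
# so they can be set from application code instead of the shell.
def brixo_otlp_env(api_key: str,
                   endpoint: str = "https://otel.brixo.com:4318") -> dict:
    return {
        "OTEL_EXPORTER_OTLP_TRACES_HEADERS": f"Brixo-Auth=Bearer {api_key}",
        "OTEL_EXPORTER_OTLP_ENDPOINT": endpoint,
    }

# Set them before any OpenTelemetry exporter is created, since the OTLP
# exporter reads these variables at construction time.
os.environ.update(brixo_otlp_env("<BRIXO_API_KEY>"))
```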

🧱 Step 1 — Install dependencies

# Core OpenTelemetry
pip install opentelemetry-sdk opentelemetry-exporter-otlp

# OpenLLMetry (Traceloop)
pip install traceloop-sdk

# OpenInference instrumentation for LangChain
pip install openinference-instrumentation-langchain

🧠 Step 2 — Set up instrumentation

Add the following instrumentation code before your agent is initialized or invoked.
# your_agent.py

# --- Add these imports at the top of your file ---
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.langchain import LangChainInstrumentor
from traceloop.sdk import Traceloop
from langchain_core.messages import HumanMessage

# --- Initialize the exporter and instrumentation ---
exporter = OTLPSpanExporter()

# Important: Keep this order so spans are properly captured and merged.
LangChainInstrumentor().instrument()
Traceloop.init(app_name="brixo-demo", exporter=exporter)

# --- Your existing agent code below ---
agent = ...  # however your agent is defined or imported

# Wrap your agent call in a top-level span
tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("Agent"):
    messages = [HumanMessage(content="Add 3 and 4.")]
    response = agent.invoke({"messages": messages})
Tip: Make sure to initialize instrumentation before invoking your agent.
That’s it — your LangChain/LangGraph app is now sending rich, merged agent traces to Brixo.

Troubleshooting

No traces:
  • Verify BRIXO_API_KEY (the Brixo-Auth header value) and the OTLP endpoint
  • Ensure instrumentation is initialized before your agent is constructed or invoked
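When spans still don't arrive, raising the log level for the `opentelemetry` loggers can help: the Python OTLP exporter reports failed exports (for example, an auth rejection from a bad API key) through Python's standard logging, which is quiet by default.

```python
import logging

# Surface export errors that the OTLP exporter otherwise logs quietly.
logging.basicConfig(level=logging.INFO)
logging.getLogger("opentelemetry").setLevel(logging.DEBUG)
```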

FAQ

Why both OpenLLMetry and OpenInference?

They capture different layers of context — OpenLLMetry focuses on LLM/agent orchestration, OpenInference on tool and framework spans. Brixo merges and normalizes both for full insight.

Do I need a Collector?

No. Brixo supports direct OTLP/HTTP export.

Which frameworks are supported?

Anything OpenLLMetry or OpenInference supports: OpenAI, Anthropic, LangChain, LlamaIndex, CrewAI, AutoGen, Bedrock, etc.