Traceloop and OpenLLMetry
There are two ways to bring Traceloop data into PostHog for analysis:
- Direct OpenTelemetry (recommended) – Instrument your app with OpenLLMetry and send OTel spans directly to PostHog.
- Traceloop's managed platform – Use Traceloop's cloud product and enable their PostHog integration.
We also offer a dashboard template to help you quickly get insights into your LLM product.
Option 1: Direct OpenTelemetry (recommended)
PostHog natively supports OTel spans from OpenLLMetry, Traceloop's open-source instrumentation library. OpenLLMetry provides auto-instrumentation for 20+ LLM libraries including OpenAI, Anthropic, LangChain, Cohere, Bedrock, and VertexAI.
When you send OpenLLMetry spans to PostHog, they are automatically:
- Classified as $ai_generation or $ai_embedding events based on the request type
- Normalized so input/output messages, token usage, and tools map to PostHog's standard LLM analytics properties
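To illustrate the normalization, a single OpenLLMetry chat-completion span might surface in PostHog roughly like the event sketched below. The property names follow PostHog's LLM analytics conventions, but treat the exact set and values as assumptions and inspect a real captured event in your project:

```python
# Illustrative only: an approximation of a normalized $ai_generation event.
# The values are made up; the property names are assumptions based on
# PostHog's LLM analytics conventions.
normalized_event = {
    "event": "$ai_generation",
    "properties": {
        "$ai_provider": "openai",        # inferred from the instrumented library
        "$ai_model": "gpt-4o-mini",      # model reported on the OTel span
        "$ai_input_tokens": 42,          # prompt token usage
        "$ai_output_tokens": 128,        # completion token usage
    },
}

# Embedding requests would instead be classified as $ai_embedding.
print(normalized_event["event"])
```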
To get started, configure OpenLLMetry to export traces to PostHog's OTel endpoint. See the OpenLLMetry documentation for setup instructions specific to your language and framework.
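A minimal sketch of that configuration in Python is shown below. It assumes the traceloop-sdk package and passes PostHog's host and project API key to Traceloop.init; the exact OTLP endpoint path and auth header format are assumptions here, so confirm them against PostHog's LLM analytics documentation for your region:

```python
import os

# Hypothetical values -- substitute your own PostHog host and project API key.
POSTHOG_HOST = os.environ.get("POSTHOG_HOST", "https://us.posthog.com")
POSTHOG_API_KEY = os.environ.get("POSTHOG_API_KEY", "phc_your_project_api_key")

# OpenLLMetry exports standard OTLP traces, so pointing it at PostHog is a
# matter of setting the endpoint and an auth header. The endpoint path and
# header format below are assumptions -- check PostHog's docs for your region.
otlp_endpoint = POSTHOG_HOST
otlp_headers = {"Authorization": f"Bearer {POSTHOG_API_KEY}"}

try:
    from traceloop.sdk import Traceloop  # pip install traceloop-sdk

    # Traceloop.init configures the OTLP exporter for all auto-instrumented
    # LLM libraries (OpenAI, Anthropic, LangChain, and others).
    Traceloop.init(
        app_name="my-llm-app",  # hypothetical app name
        api_endpoint=otlp_endpoint,
        headers=otlp_headers,
    )
except ImportError:
    # traceloop-sdk not installed; nothing to initialize in this sketch.
    pass
```

After initialization, any calls made through an instrumented LLM library are traced automatically; no per-request changes are needed.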
Option 2: Via Traceloop's managed platform
- Sign up for Traceloop and add it to your app.
- Go to the integrations page in your Traceloop dashboard and click on the PostHog card.

- Enter your PostHog host and project token (you can find these in your PostHog project settings).
- Select the environment you want to connect to PostHog and click Enable.
Traceloop will now send events to PostHog under the name traceloop span.
Using the Traceloop dashboard template
Once you've installed the integration, our Traceloop dashboard template helps you quickly set up relevant insights. You can see an example dashboard here.
To create your own dashboard from a template:
- Go to the dashboards tab in PostHog.
- Click the New dashboard button in the top right.
- Select LLM metrics – Traceloop from the list of templates.