Integrating with Helicone

You can integrate with Helicone and bring data into PostHog for analysis. Additionally, we offer a dashboard template to help you quickly get insights into your LLM product.

How to install the integration

  1. Sign up for Helicone and add it to your app.
  2. Similar to how you set the Helicone-Auth header when configuring your LLM client, add two new headers, Helicone-Posthog-Key and Helicone-Posthog-Host, containing your PostHog project API key and host (you can find both in your PostHog project settings):
from openai import OpenAI

HELICONE_API_KEY = "your-helicone-api-key"  # Replace with your Helicone API key

client = OpenAI(
    api_key="your-api-key-here",  # Replace with your OpenAI API key
    base_url="https://oai.hconeai.com/v1",  # Route requests through Helicone's proxy
    default_headers={
        "Helicone-Auth": f"Bearer {HELICONE_API_KEY}",
        "Helicone-Posthog-Key": "<ph_project_api_key>",
        "Helicone-Posthog-Host": "https://us.i.posthog.com",
    }
)

Helicone events will now be exported into PostHog as soon as they're available.
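For example, once the client above is configured, any request made through it is proxied by Helicone and the resulting event is forwarded to PostHog. A minimal sketch (the model name and prompt are placeholders):

# Assumes the `client` configured above with the Helicone-Posthog headers
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Write a haiku about analytics."}],
)
print(response.choices[0].message.content)
# Helicone logs this request and exports the event to your PostHog project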

Using the Helicone dashboard template

Once you've installed the integration, dashboard templates help you quickly set up relevant insights. You can see an example Helicone dashboard here.

To create your own dashboard from a template:

  1. Go to the dashboards tab in PostHog.
  2. Click the New dashboard button in the top right.
  3. Select LLM metrics – Helicone from the list of templates.
