> LLM analytics is currently in beta. To access it, enable the feature preview in your PostHog account.
## 1. Install the PostHog SDK

**Required.** Setting up analytics starts with installing the PostHog SDK for your language. LLM analytics works best with our Python and Node SDKs.

```bash
pip install posthog
```

## 2. Install LangChain and OpenAI SDKs

**Required.** Install the LangChain and OpenAI Python SDKs:

```bash
pip install langchain openai langchain-openai
```

> **Proxy note:** These SDKs do not proxy your calls; they only fire off an async call to PostHog in the background to send the data.
You can also use LLM analytics with other SDKs or our API, but you will need to capture the data manually via the capture method. See the schema in the manual capture section for more details.
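For illustration, a minimal manual capture might look like the sketch below. Treat it as an assumption-laden example: the `$ai_generation` event name and the example values are not from this section, and the property keys simply mirror those listed in step 4. The manual capture section has the authoritative schema.

```python
from posthog import Posthog

posthog = Posthog("<ph_project_api_key>", host="https://us.i.posthog.com")

# Hypothetical manual capture of a single LLM generation.
# Event name and property values are assumptions; check the
# manual capture docs for the exact schema.
posthog.capture(
    distinct_id="user_123",
    event="$ai_generation",
    properties={
        "$ai_model": "gpt-4o-mini",
        "$ai_input": [{"role": "user", "content": "Tell me a joke"}],
        "$ai_input_tokens": 12,
        "$ai_output_tokens": 42,
        "$ai_latency": 1.3,  # seconds
    },
)
```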
## 3. Initialize PostHog and LangChain

**Required.** In the spot where you make your OpenAI calls, import PostHog, LangChain, and our LangChain `CallbackHandler`. Initialize PostHog with your project API key and host from your project settings, and pass it to the `CallbackHandler`.

Optionally, you can provide a user distinct ID, trace ID, PostHog properties, groups, and privacy mode.

```python
from posthog.ai.langchain import CallbackHandler
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from posthog import Posthog

posthog = Posthog(
    "<ph_project_api_key>",
    host="https://us.i.posthog.com"
)

callback_handler = CallbackHandler(
    client=posthog,  # optional; if not provided, a default client is used
    distinct_id="user_123",  # optional
    trace_id="trace_456",  # optional
    properties={"conversation_id": "abc123"},  # optional
    groups={"company": "company_id_in_your_db"},  # optional
    privacy_mode=False  # optional
)
```

> **Note:** If you want to capture LLM events anonymously, don't pass a distinct ID to the `CallbackHandler`. See our docs on anonymous vs identified events to learn more.
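For example, a minimal anonymous setup (a sketch reusing the `posthog` client from above) simply omits `distinct_id`:

```python
# No distinct_id passed, so LLM events are captured anonymously
anonymous_handler = CallbackHandler(client=posthog)
```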
## 4. Call LangChain

**Required.** When you invoke your chain, pass the `callback_handler` in the `config` as part of your `callbacks`:

```python
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}")
])
model = ChatOpenAI(openai_api_key="your_openai_api_key")
chain = prompt | model

# Execute the chain with the callback handler
response = chain.invoke(
    {"input": "Tell me a joke about programming"},
    config={"callbacks": [callback_handler]}
)
print(response.content)
```

This automatically captures many properties into PostHog, including `$ai_input`, `$ai_input_tokens`, `$ai_latency`, `$ai_model`, `$ai_model_parameters`, `$ai_output_choices`, and `$ai_output_tokens`. It also automatically creates a trace hierarchy based on how LangChain components are nested.
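As a sketch building on the `trace_id` parameter from step 3, you can keep several related invocations together by giving the handler a shared trace ID. The assumption here (worth verifying against the tracing docs) is that invocations sharing a `trace_id` appear under one trace; the UUID helper is just an illustrative choice:

```python
import uuid

# One trace per conversation: reuse the same trace_id across turns
conversation_trace_id = str(uuid.uuid4())
conversation_handler = CallbackHandler(
    client=posthog,
    distinct_id="user_123",
    trace_id=conversation_trace_id,
)

for question in ["What is PostHog?", "How do I install the SDK?"]:
    chain.invoke({"input": question}, config={"callbacks": [conversation_handler]})
```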