Google Pub/Sub


🚧 Note: New installs of this destination are currently disabled as we develop our new batch export and webhook functionalities. Support their development and follow along on the linked issues.

This destination sends events from PostHog to a Google Cloud Pub/Sub topic when they are ingested. It's used by teams such as Vendasta.


Using this requires either PostHog Cloud with the data pipeline add-on, or a self-hosted PostHog instance running version 1.30.0 or later.

Self-hosting and not running 1.30.0? Find out how to update your self-hosted PostHog deployment.

You'll also need a Google Cloud Pub/Sub account to connect to.


  1. In PostHog, click the "Data pipeline" tab in the left sidebar.
  2. Search for 'Pub/Sub', select the destination, and press Install.
  3. Upload your Google Cloud key .json file. (How to get the file.)
  4. Enter your Topic ID.
  5. Watch events publish to the topic.
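One way to confirm events are flowing is to attach a pull subscription to the topic and read a few messages back. A minimal sketch using the gcloud CLI — the topic name `posthog-events` and subscription name are placeholders; substitute your own Topic ID:

```shell
# Attach a pull subscription to the topic PostHog publishes to
gcloud pubsub subscriptions create posthog-events-check \
    --topic=posthog-events

# Pull a few messages and acknowledge them
gcloud pubsub subscriptions pull posthog-events-check \
    --limit=5 --auto-ack
```

If events are arriving, the pull command prints their payloads; delete the subscription afterwards if you only needed it for the check.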

Finding your Google Cloud key .json file

You'll need this file to configure the Pub/Sub destination for PostHog. Google's Pub/Sub client libraries documentation explains where to find it.
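If you don't have a key file yet, one approach is to create a dedicated service account with Pub/Sub access and download a key for it. A sketch using the gcloud CLI — the account name `posthog-pubsub` and the project ID are placeholder assumptions, not names PostHog requires:

```shell
PROJECT_ID=my-project            # placeholder: your Google Cloud project ID

# Create a service account for PostHog to publish with
gcloud iam service-accounts create posthog-pubsub \
    --project="$PROJECT_ID"

# Grant it Pub/Sub access; editor (rather than publisher) lets the
# destination create the topic if it does not already exist
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:posthog-pubsub@${PROJECT_ID}.iam.gserviceaccount.com" \
    --role="roles/pubsub.editor"

# Download the key .json file to upload in PostHog
gcloud iam service-accounts keys create key.json \
    --iam-account="posthog-pubsub@${PROJECT_ID}.iam.gserviceaccount.com"
```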


JSON file with your Google Cloud key
Type: attachment
Required: True

Topic ID
Type: string
Required: True
A topic will be created if it does not exist.

Events to ignore
Type: string
Required: False
Comma-separated list of events to ignore.

Maximum upload size in bytes
Type: string
Required: False
Default: 1 MB. Upload queued events once their combined size reaches this many bytes. The value must be between 1 MB and 10 MB.

Export events at least every X seconds
Type: string
Required: False
Default: 30 seconds. If there are queued events and this many seconds have passed since the last upload, upload them. The value must be between 1 and 600 seconds.
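The last three options interact: ignored events are dropped before buffering, and the buffer is flushed either when it reaches the size limit or when the export interval elapses with events waiting. A rough Python sketch of that behaviour — the `Buffer` class and `should_flush` method are hypothetical names for illustration, not the destination's actual code:

```python
import time

MAX_UPLOAD_BYTES = 1_000_000      # "Maximum upload size in bytes" (default 1 MB)
EXPORT_INTERVAL_SECONDS = 30      # "Export events at least every X seconds"
EVENTS_TO_IGNORE = {"$feature_flag_called", "$pageleave"}  # parsed from the comma-separated setting

class Buffer:
    """Hypothetical sketch of the destination's event buffer."""

    def __init__(self):
        self.events = []
        self.size_bytes = 0
        self.last_upload = time.monotonic()

    def add(self, event_name: str, payload: bytes) -> None:
        # Ignored events are dropped before they ever reach the buffer.
        if event_name in EVENTS_TO_IGNORE:
            return
        self.events.append(payload)
        self.size_bytes += len(payload)

    def should_flush(self, now: float) -> bool:
        # Flush when the buffered payloads hit the size limit...
        if self.size_bytes >= MAX_UPLOAD_BYTES:
            return True
        # ...or when the export interval has elapsed with events waiting.
        return bool(self.events) and now - self.last_upload >= EXPORT_INTERVAL_SECONDS

buf = Buffer()
buf.add("$pageleave", b"ignored")          # filtered out by the ignore list
buf.add("$pageview", b"x" * 600_000)
buf.add("purchase", b"y" * 500_000)        # pushes the buffer past 1 MB
print(buf.should_flush(time.monotonic()))  # size threshold reached
```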


Is the source code for this destination available?

PostHog is open-source and so are all destinations on the platform. The source code is available on GitHub.

Who created this destination?

We'd like to thank PostHog community member Jesse Redl from Vendasta for creating this. Thanks, Jesse!

Who maintains this?

This is maintained by PostHog. If you have issues with it not functioning as intended, please let us know!

What if I have feedback on this destination?

We love feature requests and feedback. Please tell us what you think.

What if my question isn't answered above?

We love answering questions. Ask us anything via our community forum.

