Send data out of PostHog

PostHog provides multiple methods to export and stream your data to external systems, enabling you to activate your product data across your entire tech stack.

1. Batch exports

Schedule regular exports of your data to data warehouses and cloud storage.

  • BigQuery: Export to Google BigQuery for advanced analytics
  • Snowflake: Send data to Snowflake data warehouse
  • Amazon S3: Store data in S3 buckets for flexible processing
  • PostgreSQL: Export to PostgreSQL databases
  • ClickHouse: High-performance analytics database exports

Batch export features:

  • Scheduled exports: Hourly, daily, or custom schedules
  • Incremental updates: Only export new or changed data
  • Historical backfills: Export historical data ranges
  • Custom schemas: Define the structure of exported data
  • Compression: Reduce storage costs with compressed exports
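As a concrete illustration, the sketch below creates an hourly S3 batch export through the project API. This is a minimal sketch, not a definitive implementation: the endpoint path, destination config keys, project ID, and API key are illustrative assumptions, so check the batch exports API reference for the exact schema and authenticate with a personal API key that has batch export access.

```python
import requests

POSTHOG_HOST = "https://us.posthog.com"   # or your EU / self-hosted instance
PROJECT_ID = "12345"                      # placeholder project ID
API_KEY = "phx_personal_api_key"          # placeholder personal API key

# Illustrative payload: field names follow the general shape of the batch
# exports API, but verify them against the current API reference.
payload = {
    "name": "Hourly events to S3",
    "interval": "hour",                   # e.g. "hour" or "day"
    "destination": {
        "type": "S3",
        "config": {
            "bucket_name": "my-posthog-exports",
            "region": "us-east-1",
            "prefix": "events/",
            "aws_access_key_id": "AKIA...",        # placeholder credentials
            "aws_secret_access_key": "...",
        },
    },
}

response = requests.post(
    f"{POSTHOG_HOST}/api/projects/{PROJECT_ID}/batch_exports/",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```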

Configure batch exports →

2. API

Query events, persons, and insights using HogQL, build custom integrations, and embed analytics in your product. A query sketch follows the list below.

  • HogQL queries: Full SQL access to your data with our ClickHouse wrapper
  • Insight data: Fetch results from saved insights and dashboards
  • Person profiles: Query and update user properties and cohorts
  • Export automation: Schedule exports and integrate with your data stack
  • Aggregations: Get pre-computed metrics without raw event access
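For example, a HogQL query can be run against the project query endpoint. This is a minimal sketch under stated assumptions: the endpoint path, response shape, project ID, and API key are placeholders to verify against the API documentation.

```python
import requests

POSTHOG_HOST = "https://us.posthog.com"
PROJECT_ID = "12345"                # placeholder project ID
API_KEY = "phx_personal_api_key"    # placeholder personal API key with query access

# HogQL query: daily pageview counts for the last 7 days.
query = {
    "kind": "HogQLQuery",
    "query": """
        SELECT toDate(timestamp) AS day, count() AS pageviews
        FROM events
        WHERE event = '$pageview' AND timestamp > now() - INTERVAL 7 DAY
        GROUP BY day
        ORDER BY day
    """,
}

response = requests.post(
    f"{POSTHOG_HOST}/api/projects/{PROJECT_ID}/query/",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"query": query},
    timeout=30,
)
response.raise_for_status()
for row in response.json().get("results", []):
    print(row)
```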

3. Real-time event streaming (event pipelines)

Stream events to external systems in real time, as they happen in PostHog.

  • Webhooks: Send events to any HTTP endpoint in real time
  • Slack: Send events to Slack channels
  • SaaS tools: Forward events to tools like Braze, Customer.io, and more
  • Kafka: Stream events to Apache Kafka topics for high-throughput processing
  • Amazon Kinesis: Direct integration with AWS Kinesis streams
  • Google Pub/Sub: Stream to Google Cloud Pub/Sub topics
  • Custom destinations: Build your own destination using our plugin framework

Real-time streaming is ideal for:

  • Triggering immediate actions based on user behavior
  • Keeping external systems synchronized with PostHog
  • Building real-time dashboards and monitoring
  • Feeding machine learning models with fresh data
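To make the first use case concrete, the sketch below is a minimal HTTP endpoint that receives streamed events from a webhook destination and triggers an action on a specific behavior. It assumes Flask is installed; the payload field names follow PostHog's standard event shape (event, distinct_id, properties) but the exact payload depends on how you configure the destination, and notify_customer_success is a hypothetical helper.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/posthog-events", methods=["POST"])
def receive_event():
    # Payload shape depends on your destination configuration; the fields below
    # follow PostHog's standard event structure but should be verified against
    # what your destination actually sends.
    payload = request.get_json(force=True)
    event = payload.get("event")
    distinct_id = payload.get("distinct_id")
    properties = payload.get("properties", {})

    # Trigger an immediate action on a specific user behavior.
    if event == "subscription_cancelled":
        notify_customer_success(distinct_id, properties)  # hypothetical helper

    return "", 204

def notify_customer_success(distinct_id, properties):
    # Placeholder: push to your CRM, ticketing system, or alerting tool here.
    print(f"Follow up with {distinct_id}: {properties.get('plan')}")

if __name__ == "__main__":
    app.run(port=8000)
```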

Explore event pipeline destinations →

4. Webhooks

Configure webhooks to notify external systems when specific events or conditions occur.

  • Event webhooks: Trigger on specific user actions or system events
  • Threshold alerts: Send notifications when metrics exceed limits
  • Scheduled webhooks: Regular updates on key metrics
  • Custom payloads: Format data to match your destination's requirements (see the Slack sketch after the lists below)

Common webhook use cases:

  • Sync user properties to CRM systems like Salesforce or HubSpot
  • Trigger marketing automation workflows in tools like Braze or Customer.io
  • Update customer success platforms like Vitally or Gainsight
  • Send alerts to Slack, PagerDuty, or monitoring tools
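As an example of formatting a custom payload for a downstream tool, the sketch below reshapes a received PostHog event into a Slack message and posts it to an incoming webhook URL. The webhook URL is a placeholder, and the incoming event fields are assumed to follow PostHog's standard event shape.

```python
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder

def forward_to_slack(posthog_event: dict) -> None:
    """Reshape a PostHog event into a Slack incoming-webhook payload."""
    # Field names follow PostHog's standard event shape; adjust to match the
    # payload your webhook is configured to send.
    event = posthog_event.get("event", "unknown event")
    distinct_id = posthog_event.get("distinct_id", "unknown user")
    url = posthog_event.get("properties", {}).get("$current_url", "")

    message = {"text": f":bell: {distinct_id} triggered `{event}` {url}".strip()}
    resp = requests.post(SLACK_WEBHOOK_URL, json=message, timeout=10)
    resp.raise_for_status()
```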

Learn about webhooks →

Export formats and protocols

Data formats

  • JSON: Standard format for webhooks and APIs
  • CSV: For spreadsheet and traditional database imports
  • Parquet: Columnar format for efficient warehouse storage (a reading sketch follows this list)
  • Avro: Schema-based format for streaming platforms
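For instance, a Parquet file written by a batch export can be read back for downstream processing. This is a sketch under assumptions: the bucket, key, and column names are illustrative, and the actual columns depend on the export's configured schema.

```python
import pandas as pd

# Path to a file written by an S3 batch export; bucket, prefix, and file name
# are illustrative. Reading s3:// paths requires the s3fs package.
path = "s3://my-posthog-exports/events/2024-01-01.parquet"

df = pd.read_parquet(path)

# Column names depend on the export's schema; 'event', 'distinct_id', and
# 'timestamp' follow PostHog's standard event fields.
daily_counts = (
    df.assign(day=pd.to_datetime(df["timestamp"]).dt.date)
      .groupby(["day", "event"])
      .size()
      .rename("count")
      .reset_index()
)
print(daily_counts.head())
```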

Delivery methods

  • HTTP/HTTPS: RESTful APIs and webhooks
  • SFTP: Secure file transfer for batch exports
  • Cloud storage: Direct writes to S3, GCS, Azure Blob
  • Streaming: Kafka, Kinesis, Pub/Sub protocols

Common export scenarios

Marketing activation

  • Sync product usage data to marketing automation tools
  • Create behavioral segments for targeted campaigns
  • Trigger personalized emails based on user actions

Sales enablement

  • Update CRM with product usage insights
  • Alert sales teams about high-value user activities
  • Score leads based on product engagement

Data science & analytics

  • Feed ML models with behavioral data
  • Export to data lakes for advanced analysis
  • Combine with other data sources in warehouses

Operations & monitoring

  • Send alerts for critical user behaviors
  • Monitor system health and usage patterns
  • Automate support ticket creation

Best practices

  • Start small: Test exports with a subset of data before full deployment
  • Monitor exports: Set up alerts for failed or delayed exports
  • Optimize frequency: Balance data freshness with system load
  • Handle failures: Implement retry logic and error notifications (a backoff sketch follows this list)
  • Secure your data: Use encryption and proper authentication
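For the "handle failures" point, a minimal retry helper with exponential backoff might look like the sketch below. The URL and payload are placeholders, and production pipelines usually also need dead-letter handling and alerting on exhausted retries.

```python
import time
import requests

def deliver_with_retry(url: str, payload: dict, max_attempts: int = 5) -> requests.Response:
    """POST a payload, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            resp = requests.post(url, json=payload, timeout=10)
            # Retry on server errors and rate limiting; fail fast on other 4xx.
            if resp.status_code < 500 and resp.status_code != 429:
                resp.raise_for_status()
                return resp
        except requests.RequestException:
            pass  # network error: fall through to back off and retry
        if attempt == max_attempts:
            raise RuntimeError(f"Delivery to {url} failed after {max_attempts} attempts")
        time.sleep(2 ** attempt)  # 2s, 4s, 8s, ... between attempts
    raise RuntimeError("unreachable")  # loop always returns or raises
```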
