Linking Azure as a source


The data warehouse can link to data in your Azure storage accounts. To get started, you'll need to:

  1. Create an Azure storage account
  2. Create a blob container
  3. Upload data and link to PostHog

Step 1: Create an Azure storage account

First, log in to Azure and go to Storage Accounts, then create a storage account by following this Azure guide.

Step 2: Create a blob container

Once the storage account has been created, follow this guide to create a blob container.


Step 3: Upload data and link to PostHog

Upload your data to the newly created container. Parquet files are the recommended format, but the connector also works with JSON and CSV files.
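If you'd rather upload programmatically than through the Azure portal, a minimal sketch using the `azure-storage-blob` SDK (`pip install azure-storage-blob`) looks like the following. The account, container, and file names here are hypothetical placeholders, not values from this guide:

```python
def blob_url(account: str, container: str, name: str) -> str:
    """Build the URL Azure assigns to a blob in a storage account."""
    return f"https://{account}.blob.core.windows.net/{container}/{name}"


def upload_parquet(account: str, access_key: str, container: str, path: str) -> str:
    """Upload a local file to the container and return its blob URL."""
    # Third-party SDK import kept local so the helpers above work without it.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient(
        account_url=f"https://{account}.blob.core.windows.net",
        credential=access_key,
    )
    name = path.rsplit("/", 1)[-1]
    with open(path, "rb") as f:
        service.get_blob_client(container=container, blob=name).upload_blob(
            f, overwrite=True
        )
    return blob_url(account, container, name)


# Example call (requires real credentials from your storage account):
# upload_parquet("mystorageaccount", "<access-key>", "posthog-data", "events.parquet")
```

The returned URL is the same value you would otherwise copy from the storage browser in the next step.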

Find the newly created file via the storage browser menu item. Once found, open the details and copy the URL property. We need it to link the file in PostHog.
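The copied URL encodes the storage account name, container, and blob name that PostHog asks for in the next steps. A small sketch (with a hypothetical example URL) pulling those pieces apart with the standard library:

```python
from urllib.parse import urlparse

# Hypothetical URL copied from the Azure storage browser.
url = "https://mystorageaccount.blob.core.windows.net/posthog-data/events.parquet"

parsed = urlparse(url)
account = parsed.netloc.split(".")[0]                      # storage account name
container, blob_name = parsed.path.lstrip("/").split("/", 1)

print(account, container, blob_name)
```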


  1. Go to the Data pipeline page and select the Sources tab in PostHog
  2. Click New source and select Azure from the self-managed section
  3. Enter a name for your dataset and paste the copied URL into the "Files URL pattern" box
  4. Select the correct format for your data
  5. Enter the storage account name (this is the name of the storage account you created in step 1)
  6. Find and paste your storage account key (you can use this Azure doc to view your access keys)


That's it! You should be able to query the data from the PostHog SQL editor.
