Syncing Data Between a Platform and Your Data Source

If you want Loops to analyze customer data that resides within an analytics platform, you can integrate that data into Loops without any coding or R&D resources.

However, you’ll first need to sync that data to an external data source, which Loops will then connect to. This article provides some information about the process, and we’ll be happy to guide you through it – just talk to your integration specialist. (We can also help you set up a database or data warehouse if you don’t already have one.)

Once the data has been synced to a database or data warehouse, the final step is to connect Loops to your data source.

Which platforms are supported?

Loops currently supports the following analytics platforms:

  • Amplitude
  • Braze
  • Firebase
  • Google Analytics
  • Heap
  • Mixpanel
  • Pendo
  • PostHog
  • Segment
  • Splunk

Note: Support depends on your platform access and tier.

If your organization uses a platform that’s not on this list, please schedule a call or talk to your Loops integration specialist.

How to sync data from your platform

The information below will help you sync data from your analytics platform to a database or data warehouse so that Loops can access it.

If you have any questions or need assistance, be sure to get in touch with us.

Amplitude

Amplitude currently offers three solutions for exporting raw events from the platform:

  1. Export API – An API endpoint to retrieve all event data from a particular date range.
  2. Amplitude Query – Query raw data via your Amplitude-managed Snowflake database.
  3. Amplitude ETL – A managed pipeline to extract data from Amplitude and load it into storage buckets.

Here are some important details about each of these methods:


| | Export API | Amplitude Query | Amplitude ETL |
| Management | Non-managed – you need to build your own pipeline.* | Managed – Loops will query the data warehouse that’s behind Amplitude. | Partially managed – you must load data from a bucket to your data warehouse.* |
| Data ownership | You own your data. | You have access to a managed data warehouse. | You own your data. |
| Pricing | Included in all Amplitude plans** | Not included; incurs additional costs. | Included in all paid plans** |
| Pull frequency (from Amplitude to data warehouse) | Under your control | Hourly or daily | Hourly or daily |

* Loops can manage this if you use Google Cloud Platform and BigQuery or AWS Athena.

** Data warehouse solutions incur additional costs.

Since the ability to query large volumes of data from Amplitude is limited, we recommend that you first sync your data to a dedicated data warehouse. This is generally a simple process since Loops has a structured integration with Amplitude. Loops can support the ETL process of moving the data to your warehouse – just reach out for more details.
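
If you do build your own pipeline around the Export API, the request itself is a single authenticated GET that returns a zip archive of gzipped JSON files. The Python sketch below shows the general shape of such a pull; the credentials and date range are placeholders, and EU-resident Amplitude projects use a different host, so treat this as a starting point rather than a drop-in implementation.

```python
import io
import zipfile

import requests

# Placeholders – use your Amplitude project's API key and secret key.
API_KEY = "YOUR_AMPLITUDE_API_KEY"
SECRET_KEY = "YOUR_AMPLITUDE_SECRET_KEY"

# The Export API takes a UTC date range formatted as YYYYMMDDTHH.
params = {"start": "20240101T00", "end": "20240101T23"}

response = requests.get(
    "https://amplitude.com/api/2/export",  # EU residency: analytics.eu.amplitude.com
    params=params,
    auth=(API_KEY, SECRET_KEY),
    timeout=300,
)
response.raise_for_status()

# The response is a zip archive of gzipped, newline-delimited JSON event files.
# Extract them locally; loading them into your warehouse is the next step of the pipeline.
with zipfile.ZipFile(io.BytesIO(response.content)) as archive:
    for name in archive.namelist():
        archive.extract(name, path="amplitude_export")
        print(f"extracted {name}")
```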

Braze

Retrieving events from Braze requires you to have the Braze Currents tool. Contact Braze support to find out if it’s included in your current plan.

Once you have access to Currents, three options are available for exporting Braze data to a data warehouse:

  • Exporting to Snowflake
  • Exporting to BigQuery via Google Cloud Platform
  • Exporting to Amazon Athena via AWS S3

Send your product’s user ID to Braze to ensure you can accurately join Braze data with your product usage data.
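
In practice, the simplest way to do this is to identify users in Braze with the same ID your product uses, for example by passing it as the external_id when you create or update users through Braze’s /users/track endpoint. The sketch below is only an illustration: the REST endpoint varies by Braze instance, and the attribute fields are placeholders.

```python
import requests

# The REST endpoint differs per Braze instance (e.g. rest.iad-01.braze.com) – check your dashboard.
BRAZE_REST_ENDPOINT = "https://rest.iad-01.braze.com"
BRAZE_API_KEY = "YOUR_BRAZE_REST_API_KEY"


def sync_user_to_braze(product_user_id: str, email: str) -> None:
    """Register a user in Braze under the same ID your product uses, so that
    events exported through Currents can be joined to product usage data."""
    payload = {
        "attributes": [
            {
                # Using your product's user ID as the Braze external_id means the
                # user identifier in Currents exports matches your product data.
                "external_id": product_user_id,
                "email": email,
            }
        ]
    }
    response = requests.post(
        f"{BRAZE_REST_ENDPOINT}/users/track",
        headers={"Authorization": f"Bearer {BRAZE_API_KEY}"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()


sync_user_to_braze(product_user_id="user_12345", email="jane@example.com")
```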

Firebase

Firebase allows you to transfer analytics data to BigQuery with the click of a button. This functionality is available for both free and paid plans.

When you link your Firebase project to BigQuery, Firebase exports a copy of your existing data to BigQuery and also sets up a daily sync of data from your project.

Loops can set up a BigQuery account for you and facilitate the above process. Simply follow these steps:

  1. Set up a Google Cloud account (see instructions). This process only takes a couple of minutes. Be sure to include billing details in your account.
  2. Provide integrations@getloops.ai with access to your Firebase account. If you prefer to transfer the data yourself, follow these export instructions.
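
Once the link is active, you can confirm that data is flowing by looking at the exported dataset. As a rough sketch (Firebase typically names the dataset analytics_<property_id> and writes one events_YYYYMMDD table per day; the project and dataset IDs below are placeholders):

```python
from google.cloud import bigquery

# Assumes the google-cloud-bigquery package and Application Default Credentials.
# Replace the project and dataset IDs with your own.
client = bigquery.Client(project="your-gcp-project")

dataset_id = "your-gcp-project.analytics_123456789"
for table in client.list_tables(dataset_id):
    # Expect daily tables such as events_20240101 (and events_intraday_* for today).
    print(table.table_id)
```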

Google Analytics

Google Analytics (GA) allows you to transfer analytics data to BigQuery with the click of a button. This functionality is available for both free and paid plans.

When you link your GA property to BigQuery, GA exports a copy of your existing data to BigQuery and also sets up a daily sync of data from your project.

Loops can set up a BigQuery account for you and facilitate the above process. Simply follow these steps:

  1. Set up a Google Cloud account (see instructions). This process only takes a couple of minutes. Be sure to include billing details in your account.
  2. Provide integrations@getloops.ai with access to your Google Analytics account. If you prefer to transfer the data yourself, follow these export instructions.
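
The GA4 export uses the same events_YYYYMMDD table layout described for Firebase above, so a quick sanity check is to count exported events per day. Again, the project and dataset names below are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client(project="your-gcp-project")

# Count exported events per day across the daily tables written by the GA4 export.
query = """
    SELECT event_date, COUNT(*) AS events
    FROM `your-gcp-project.analytics_123456789.events_*`
    GROUP BY event_date
    ORDER BY event_date
"""
for row in client.query(query).result():
    print(row.event_date, row.events)
```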

Heap

Heap offers Heap Connect, a managed ETL process that transfers data to your data warehouse in just a few clicks. This is a paid add-on, but it’s free for users on a Premier plan.

Loops can set up the integration for you. Talk to your integration specialist for more details.

Mixpanel

Mixpanel currently offers three solutions for exporting raw events from the platform:

  • Raw Export API – An API endpoint to retrieve all event data from a particular date range.
  • Raw export pipelines – A scheduled export of all events directly to destination buckets (AWS S3 or Google Cloud Storage).
  • Export directly to data warehouse – A managed process by Mixpanel that allows you to export and load raw event data into BigQuery, Athena or Snowflake.

| | Raw Export API | Raw Export Pipeline | Export directly to Data Warehouse |
| Management | Non-managed – you need to build a pipeline yourself. | Partially managed – you must load data from a bucket to your data warehouse.* | Fully managed |
| Data ownership | You own your data. | You own your data. | You have view access to a managed data warehouse. |
| Pricing | Free, and Loops can manage this process for you** | Available in Growth and Enterprise plans; requires Data Pipelines add-on (free 30-day trial available)** | Available in Growth and Enterprise plans; requires Data Pipelines add-on (free 30-day trial available)** |
| Pull frequency (from Mixpanel to data warehouse) | Under your control; some limitations apply | Hourly or daily | Hourly or daily |

* Loops can manage this if you use Google Cloud Platform and BigQuery.

** Data warehouse solutions incur additional costs.

Since the ability to query large volumes of data from Mixpanel is limited, we recommend that you first sync your data to a dedicated data warehouse. This is generally a simple process since Loops has a structured integration with Mixpanel. Loops can support the ETL process of moving the data to your warehouse – just reach out for more detail.
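
If you go the Raw Export API route, the call itself is a single GET that streams newline-delimited JSON, one event per line. A minimal Python sketch (the API secret and date range are placeholders; EU-resident projects use data-eu.mixpanel.com instead of data.mixpanel.com):

```python
import json

import requests

# Placeholder – use your Mixpanel project's API secret (a service account with a
# project_id parameter is an alternative authentication method).
API_SECRET = "YOUR_MIXPANEL_API_SECRET"

response = requests.get(
    "https://data.mixpanel.com/api/2.0/export",  # EU residency: data-eu.mixpanel.com
    params={"from_date": "2024-01-01", "to_date": "2024-01-01"},
    auth=(API_SECRET, ""),
    stream=True,
    timeout=300,
)
response.raise_for_status()

# Each line is one raw event; from here you would batch the events into files
# and load them into your data warehouse.
for line in response.iter_lines():
    if line:
        event = json.loads(line)
        print(event["event"], event["properties"].get("distinct_id"))
```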

Pendo

Pendo supports integration with all popular data warehouses, such as BigQuery, Redshift and Snowflake. Integration is available as a paid add-on for all Pendo subscriptions.

Loops can set up the integration for you; talk to your integration specialist for details.

PostHog

PostHog integrates with all popular data warehouses, such as BigQuery, Redshift and Snowflake.

For any questions or additional support, talk to your Loops integration specialist.

Segment

There are currently two options for transferring data from Segment to a data warehouse:

  1. Sync your Segment profile to your data warehouse; Segment supports all popular warehouses (see details). If you don’t have a data warehouse yet, we’ll be happy to set one up for you – just let us know. (An example query over the synced tables appears after this list.)
  2. Transfer the data to Loops’ database, which is built on BigQuery (see details about this process).
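
Once the warehouse sync from option 1 is running, Segment typically writes one schema per source containing standard tables such as tracks, identifies and pages, plus one table per custom event. The example below assumes BigQuery as the destination and a source schema named prod_app – both are placeholders, and the exact table layout may differ per warehouse:

```python
from google.cloud import bigquery

# Example only: assumes a BigQuery warehouse destination and a Segment source
# schema named "prod_app"; Segment's `tracks` table holds one row per track() call.
client = bigquery.Client(project="your-gcp-project")

query = """
    SELECT event, COUNT(*) AS calls
    FROM `your-gcp-project.prod_app.tracks`
    GROUP BY event
    ORDER BY calls DESC
    LIMIT 20
"""
for row in client.query(query).result():
    print(row.event, row.calls)
```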

Splunk

Although Splunk offers certain storage capabilities and data-retention policies, we recommend transferring data to a dedicated data warehouse for Loops analysis.

Please contact your integration specialist for more information.

Still need help? Contact us.