Prior to starting a historical data migration, ensure you do the following:
- Create a project on our US or EU Cloud.
- Sign up to a paid product analytics plan on the billing page (historic imports are free but this unlocks the necessary features).
- Set the `historical_migration` option to `true` when capturing events in the migration.
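For illustration, here is a minimal sketch of what a historical-import request body can look like when sent to the `/batch` capture endpoint, with the `historical_migration` flag at the top level of the payload. All keys, IDs, and values below are placeholders; see the migration docs for the equivalent client library options.

```python
import json
from datetime import datetime, timezone

def build_batch_payload(api_key, events):
    """Assemble a /batch request body flagged as a historical migration."""
    return {
        "api_key": api_key,
        "historical_migration": True,  # marks these events as a historical import
        "batch": events,
    }

# A single placeholder event with an original (past) timestamp.
events = [{
    "event": "signed_up",
    "distinct_id": "user-123",
    "timestamp": datetime(2023, 6, 18, 13, 0, tzinfo=timezone.utc).isoformat(),
    "properties": {"plan": "free"},
}]

payload = build_batch_payload("phc_placeholder_key", events)
body = json.dumps(payload)  # POST this body to https://us.posthog.com/batch
```

Note that each event keeps its original `timestamp`, which is what makes the import "historical" rather than live traffic.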
This guide is relevant to users who want to migrate either:
- From a self-hosted PostHog instance to PostHog Cloud.
- Between PostHog Cloud regions (e.g. US -> EU Cloud).
Requirements
- An existing project, either on PostHog Cloud or on a self-hosted instance running at least version `1.30.0`. For upgrade instructions, take a look at this guide.
- A new PostHog Cloud project hosted in the region of your choice.
Approach
This migration has 3 steps:
1. Migrate your metadata (projects, dashboards, insights, actions, cohorts, feature flags, experiments, annotations).
2. Migrate your events. This also creates the necessary person, person distinct ID, and related records.
3. Switch tracking in your product so that new events go to the new project, setting up replication from the old project if needed.
Migrate your metadata
To migrate metadata like projects, dashboards, insights, actions, feature flags, and more, use the PostHog migrate metadata script. This requires:
- TypeScript and `ts-node`. You can install both by running `npm install -g typescript ts-node` in your terminal.
- A personal API key for your old instance with read access to the project.
- A personal API key for your new Cloud instance with write access to the project, which you can get from your project settings.
Note: This process has the following caveats:
- Every object's "created by" information will appear as if it was created by the user who created the personal API key.
- Every object's "created at" information will appear as if it was created at the time you ran this script.
- Clone the repo and `cd` into it:

  ```bash
  git clone https://github.com/PostHog/posthog-migrate-meta
  cd posthog-migrate-meta
  ```

- Install the dependencies by running `yarn`.
- Run the script:

  ```bash
  ts-node index.ts --source [posthog instance you want to migrate from] --sourcekey [personal api key for that instance] --destination [posthog instance you want to migrate to] --destinationkey [personal api key for destination instance]
  ```
For more information on the options see the migrate metadata repo's readme.
Migrate your events to Cloud
Before you start, disable any transformation and filtering apps in the destination Cloud project (e.g. GeoIP). Keeping these enabled may change the events you are migrating.
For more details about historical migrations, see our migration docs.
Migrating events from self-hosted to Cloud
To migrate your events, you can read data directly from your ClickHouse cluster and ingest the data with the Python library using our self-hosted migration tool.
First, clone the repo and install the requirements.
```bash
git clone https://github.com/PostHog/posthog-migration-tools
cd posthog-migration-tools
pip3 install -r requirements.txt
```
Next, run the migration script with your ClickHouse details, PostHog details, start date, end date, and fetch limit.
```bash
python3 ./migrate.py \
    --clickhouse-url https://some.clickhouse.cluster:8443 \
    --clickhouse-user default \
    --clickhouse-password some-password \
    --clickhouse-database posthog \
    --team-id 1234 \
    --posthog-url https://us.posthog.com \
    --posthog-api-token "abx123" \
    --start-date 2023-06-18T13:00:00Z \
    --end-date 2023-06-18T13:10:00 \
    --fetch-limit 10000
```
This script prints a "cursor" in case the migration fails. You can resume from where it got to by adding the `--cursor` argument to the command above:
```bash
python3 ./migrate.py \
    --clickhouse-url https://some.clickhouse.cluster:8443 \
    --clickhouse-user default \
    --clickhouse-password some-password \
    --clickhouse-database posthog \
    --team-id 1234 \
    --posthog-url https://us.posthog.com \
    --posthog-api-token "abx123" \
    --start-date 2023-06-18T13:00:00Z \
    --end-date 2023-06-18T13:10:00 \
    --fetch-limit 10000 \
    --cursor the-cursor-value-from-the-output
```
Notes:
- This script adds a `$lib` property of `posthog-python`, overriding any `$lib` property already set.
- If the script fails for some reason, just run it again with the latest cursor. Some transient issues are solved by re-running the script.
Migrating events between Cloud instances (e.g. US -> EU Cloud)
You must raise a support ticket in-app with the Data pipelines topic for the PostHog team to do this migration for you. This option is only available to customers on the team or enterprise plan as it requires significant engineering time.
Switching tracking in your product
Now that we've migrated our events, the next step is to switch over tracking within your product to direct any new events to your new PostHog Cloud instance.
Note: To make sure your person properties get the latest values, don't switch over tracking until historical events have been migrated.
Re-enable any apps that you disabled earlier (e.g. GeoIP).
Begin swapping out your project API key and instance address in the product or site. Once done, events using the new API key and host will go to your new Cloud instance.
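One low-risk way to do the swap is to read the project API key and instance address from configuration rather than hard-coding them, so cutover becomes a config change plus a deploy. A minimal sketch, assuming environment variables named `POSTHOG_API_KEY` and `POSTHOG_HOST` (both names are illustrative):

```python
import os

def posthog_config():
    """Resolve PostHog settings from the environment, failing loudly if the
    key is missing rather than silently sending events nowhere."""
    api_key = os.environ.get("POSTHOG_API_KEY")
    if not api_key:
        raise RuntimeError("POSTHOG_API_KEY is not set")
    # Default to the new Cloud region; override per environment during cutover.
    host = os.environ.get("POSTHOG_HOST", "https://eu.posthog.com")
    return {"api_key": api_key, "host": host}
```

With this in place, switching tracking is a matter of updating the two environment variables to the new project's key and region host.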
Migrating your custom transformations or destinations
For custom transformations:
1. Check if we already have a transformation that does what you need (fastest option). You can see the list of transformations here.
2. Move this logic into your app before you send the event (also fast).
3. If you can make your app generalizable enough that others can benefit, submit your app to the store. To do this, see the build docs.
For custom destinations:
1. Check to see if we already have a destination or batch export that does what you need (fastest option). You can see the list of destinations and batch exports here.
2. Convert your app to work as a webhook (also fast). These are currently in beta. See the details here.
3. If you can make your app generalizable enough that others can benefit, submit your app to the store. To do this, see the build docs.
If the options above don't work and you were previously paying a substantial amount self-hosting, then email us at sales@posthog.com with a link to the public GitHub repo and we can see if it's appropriate as a private cloud app.