I want to take a pg_dump of my data from my Hasura Postgres database to my local machine. What is the recommended way to do this?
Option 1:
Exec a pg_dump command on your Postgres container:
pg_dump -U username -d dbname > data.sql
In practice, you probably only want the data and not the Hasura metadata or schema_migrations information. This is a better 'data export' type command:
pg_dump -U username -d dbname --data-only --schema public > data.sql
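If the database port is exposed and reachable from your machine, you can also run pg_dump locally using a connection string instead of exec-ing into the container. This is only a sketch; the host, credentials, and database name below are placeholders for your own:
# Assumes the Postgres instance accepts connections from your machine
pg_dump "postgres://username:password@your-db-host:5432/dbname" --data-only --schema public > data.sql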
Option 2:
If Postgres is running as a Docker container:
# Exec into the postgres container
$ docker exec -ti <postgres-id> /bin/bash
# Run pg_dump
root@postgres-3391217220-t7bbc:/$ pg_dump -U postgres -d postgres --data-only --schema public > db.sql
^D
# Back on your local machine, copy the dump
$ docker cp <postgres-id>:/db.sql db.sql
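If you prefer a one-liner, you can skip the interactive shell and the docker cp step by letting the host shell capture pg_dump's output directly (same container and credentials as above):
# Run pg_dump inside the container and redirect its output to a local file
$ docker exec <postgres-id> pg_dump -U postgres -d postgres --data-only --schema public > db.sql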
Hasura has a pg_dump API, enabled by default. Here are the docs: https://docs.hasura.io/1.0/graphql/manual/api-reference/pgdump.html
What the docs currently lack is an example. Here's one I use to dump the Hasura data, but not the schema (which comes from migrations), in order to import it into my locally hosted dev Hasura (insert your own hasura-admin-secret, assuming you use one, or replace with other headers as needed):
HASURA_SECRET=som3_Secr3t_Her3
curl -d '{"-Fc", "opts": ["--data-only"]}' -H "x-hasura-admin-secret: $HASURA_SECRET" https://your-host-url.com/v1alpha1/pg_dump > dumpfile
To export data from a Postgres/Hasura instance, we can use Postman or curl to make an HTTP POST request against the Hasura endpoint /v1alpha1/pg_dump, like so:
http://<your-hasura-host>/v1alpha1/pg_dump
An example curl command:
curl -d '{"opts": ["-a", "-O", "-x", "--inserts", "--exclude-schema=hdb_catalog"], "clean_output": true, "source": "default"}' -H "x-hasura-admin-secret:your-admin-secret-here" http://your-hasura-host/v1alpha1/pg_dump
The above adds a header to specify your Hasura admin secret:
x-hasura-admin-secret:<your admin secret key>
And the -d adds a JSON body with options ("opts") for pg_dump & Hasura's endpoint handler:
{
"opts": ["-a", "-O", "-x", "--inserts", "--exclude-schema=hdb_catalog"],
"clean_output": true,
"source": "default"
}
The pg_dump options ("opts") above are:
-a : data-only, no schemas
-O : no ownership statements
-x : no access privilege statements
--inserts : dump data as SQL INSERT statements rather than COPY
--exclude-schema=hdb_catalog : no Hasura migration/metadata related records, just our data
The clean_output key/value removes a bunch of SET statements & comments from the dump output, such as:
--
-- PostgreSQL database dump
--
-- Dumped from database version 12.4 (Debian 12.4-1.pgdg100+1)
-- Dumped by pg_dump version 13.4 (Debian 13.4-4.pgdg100+1)
SET statement_timeout = 0;
SET lock_timeout = 0;
SET idle_in_transaction_session_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SELECT pg_catalog.set_config('search_path', '', false);
SET check_function_bodies = false;
SET xmloption = content;
SET client_min_messages = warning;
SET row_security = off;
source specifies the name of the database to dump. Usually this would be default unless you changed it when setting up Hasura.
The output you get should be a single SET statement followed by a bunch of INSERT INTO statements.
This can be copy/pasted directly into Hasura Console's Data > SQL tab and run to import the dumped data.
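If the dump is large, you can also apply it with psql instead of pasting it into the Console. A rough sketch, assuming you save the API response to a file named hasura_data.sql (a made-up name) and your local dev Postgres is reachable on localhost as user postgres, database postgres:
# Save the API response to a file, then load it into the target database
curl -d '{"opts": ["-a", "-O", "-x", "--inserts", "--exclude-schema=hdb_catalog"], "clean_output": true, "source": "default"}' -H "x-hasura-admin-secret:your-admin-secret-here" http://your-hasura-host/v1alpha1/pg_dump > hasura_data.sql
psql -h localhost -U postgres -d postgres -f hasura_data.sql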
I needed the pg_dump as seed data to use when applying a Hasura migration from an old instance to a new Hasura instance. But (in my case) the initial migration output by the Hasura CLI on the old instance included a bunch of hdb_catalog create schema statements, which would cause the migration to fail when applied to a new Hasura instance. (The hdb_catalog schema/tables already exist on a new/fresh Hasura install.)
To correct this, I needed to reset migrations on the old instance and redo the migration commands (as shown in the aforementioned link). This new migration output didn't include the hdb_catalog schema and could be applied cleanly to the new Hasura instance.
I could then just run the pg_dump output in Hasura Console SQL tab/window on the new instance to repopulate my tables.
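If you'd rather not paste the SQL by hand, the dump can also be wired in as a Hasura CLI seed. This is only a sketch: the seeds directory layout and flags vary by CLI and config version (check hasura seed --help), and the project name, file name, and database name below are assumptions:
# Copy the dump into the project's seeds directory, then apply it
cp hasura_data.sql my-hasura-project/seeds/default/1660000000000_initial_data.sql
cd my-hasura-project
hasura seed apply --database-name default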