Dump database with pg_dump, ignoring tables that we don't have access to

I have a script where pg_dump is failing with a message like this:

pg_dump -h db1 --format plain --encoding UTF8 --schema=public --schema-only --no-owner me
pg_dump: [archiver (db)] query failed: ERROR:  permission denied for relation notmytable
pg_dump: [archiver (db)] query was: LOCK TABLE public.notmytable IN ACCESS SHARE MODE

This is causing the whole dump to abort.

Is there a way to either:

  • Ignore tables that aren't owned by our user?
  • Ignore errors?

I really don't want these tables in the dump, so even if we could get access to them, that wouldn't exactly solve the problem.

(Postgres 9.6.3)

1 Answer

There doesn't appear to be a built-in way to do this, but we can work around it with the --exclude-table flag: query pg_catalog.pg_tables for tables not owned by our user, and build the exclusion flags from the result:

export EXCLUDETABLE=$(psql -t -h $HOST -d $DBNAME -c "select '--exclude-table=' || string_agg(tablename,' --exclude-table=') FROM pg_catalog.pg_tables WHERE tableowner NOT LIKE 'myuser';" )

This sets EXCLUDETABLE to look like --exclude-table=foo --exclude-table=blah
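If any of the affected tables have mixed-case or otherwise unusual names, a bare tablename may not match pg_dump's pattern matching. A minimal variant of the same query, assuming the same 'myuser' owning role, quotes each name with quote_ident and uses coalesce so the variable is empty rather than NULL when nothing needs excluding:

export EXCLUDETABLE=$(psql -t -h $HOST -d $DBNAME -c "SELECT coalesce(string_agg('--exclude-table=' || quote_ident(tablename), ' '), '') FROM pg_catalog.pg_tables WHERE schemaname = 'public' AND tableowner <> 'myuser';")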

Now we pass that to pg_dump:

echo Excluding these tables from dump: $EXCLUDETABLE
pg_dump -h $HOST --format plain --encoding UTF8 --schema=public --schema-only --no-owner $EXCLUDETABLE $DBNAME > public-schema.sql 
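Note that $EXCLUDETABLE is deliberately left unquoted so the shell splits it into separate --exclude-table arguments. Pulling the two steps together, a minimal end-to-end sketch (assuming HOST and DBNAME are already set and 'myuser' is the owning role) might look like this:

#!/bin/bash
set -euo pipefail

# Build the exclusion flags from tables in public not owned by our user (assumed role: myuser)
EXCLUDETABLE=$(psql -t -h "$HOST" -d "$DBNAME" -c \
  "SELECT coalesce(string_agg('--exclude-table=' || tablename, ' '), '')
   FROM pg_catalog.pg_tables
   WHERE schemaname = 'public' AND tableowner <> 'myuser';")

echo "Excluding these tables from dump: $EXCLUDETABLE"

# $EXCLUDETABLE is intentionally unquoted so each --exclude-table=... becomes its own argument
pg_dump -h "$HOST" --format plain --encoding UTF8 --schema=public --schema-only --no-owner \
  $EXCLUDETABLE "$DBNAME" > public-schema.sql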