
Dump postgres data with indexes

I've got a Postgres 9.0 database that I frequently take data dumps of.

This database has a lot of indexes, and every time I restore a dump, Postgres starts a background vacuum task (is that right?). That task consumes a lot of processing time and memory to recreate the indexes of the restored dump.

My question is:

  1. Is there a way to dump the database data together with its indexes?
  2. If there is, would it be worth the effort? (I mean: would restoring the data with its indexes perform better than letting the vacuum task run?)
  3. Oracle has the "data pump" commands, a faster way to run imp and exp. Does Postgres have something similar?

Thanks in advance, Andre

AndreDurao asked Sep 30 '11

2 Answers

If you run pg_dump twice, once with --schema-only and once with --data-only, you can cut the schema-only output into two parts: the first with the bare table definitions, and the final part with the constraints and indexes. Something similar can probably be done with pg_restore.
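The two-pass idea can be sketched like this. The miniature schema.sql below stands in for real `pg_dump --schema-only mydb > schema.sql` output ("mydb" is a placeholder), and the single-line grep split is a simplification, since real dumps contain multi-line statements such as `ALTER TABLE ... ADD CONSTRAINT` that need to be moved by hand or with a smarter script:

```shell
# Miniature stand-in for pg_dump --schema-only output:
cat > schema.sql <<'EOF'
CREATE TABLE orders (id integer NOT NULL, total numeric);
CREATE INDEX orders_total_idx ON orders (total);
CREATE UNIQUE INDEX orders_id_uidx ON orders (id);
EOF

# Split: everything except index definitions goes first, indexes last.
grep -vE '^CREATE (UNIQUE )?INDEX' schema.sql > pre-data.sql
grep -E  '^CREATE (UNIQUE )?INDEX' schema.sql > post-data.sql

cat pre-data.sql    # bare table definitions
cat post-data.sql   # the CREATE INDEX statements
```

The restore order would then be `psql -f pre-data.sql`, then the --data-only dump, then `psql -f post-data.sql`.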

wildplasser answered Nov 13 '22


Best practice is probably to:

  • restore the schema without indexes
  • and possibly without constraints,
  • load the data,
  • then create the constraints,
  • and create the indexes.

If an index exists, a bulk load makes PostgreSQL write to both the table and the index, and a bulk load will make your table statistics useless. But if you load the data first and then create the index, the stats are automatically up to date.
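The steps above map cleanly onto pg_restore's section options in PostgreSQL 9.2 and later; on the asker's 9.0 the same effect needs an edited table of contents. A sketch, where "mydb" and "db.dump" are placeholder names:

```shell
# Take a custom-format dump, which allows selective restore:
pg_dump -Fc mydb > db.dump

# pg_restore 9.2+ can replay it in exactly the order described above:
pg_restore --section=pre-data  -d mydb db.dump   # bare table definitions
pg_restore --section=data      -d mydb db.dump   # rows, no index maintenance
pg_restore --section=post-data -d mydb db.dump   # constraints and indexes

# On 9.0/9.1: dump the table of contents with `pg_restore -l db.dump > toc.list`,
# comment out the INDEX and CONSTRAINT entries, restore with
# `pg_restore -L toc.list -d mydb db.dump`, then replay the skipped entries.
```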

We store scripts that create indexes and scripts that create tables in different files under version control. This is why.

In your case, changing autovacuum settings might help you. You might also consider disabling autovacuum for some tables or for all tables, but that might be a little extreme.
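Per-table autovacuum settings are storage parameters (available since 8.4, so usable on 9.0 as well); the table name here is a placeholder:

```sql
-- Disable autovacuum for one large table before the bulk load:
ALTER TABLE big_table SET (autovacuum_enabled = false);

-- After the load, re-enable it and refresh planner statistics manually:
ALTER TABLE big_table RESET (autovacuum_enabled);
ANALYZE big_table;
```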

Mike Sherrill 'Cat Recall' answered Nov 13 '22