 

pg_dump to dump table schema with constraints inline

My pg_dump output puts the constraints in ALTER TABLE statements at the end of the dump file. I want those to be inside the CREATE TABLE (...) statement.

For example, I have created a table with a constraint as below:

CREATE TABLE "test_constraints" (
    "id" serial NOT NULL,
    "user_id" bigint NOT NULL,
    PRIMARY KEY ("id"),
    CONSTRAINT "user_id" UNIQUE ("user_id")
);

And I take the schema dump using the following command:

pg_dump.exe -U postgres -t "test_constraints" -f "D:\dump.sql" "postgres"

Is it possible to have the dumped table schema look the way I created it (or close to it)? I mean the constraints need to be inside the CREATE TABLE (...) statement.

Sabuj Hassan asked Dec 05 '13

People also ask

What does the pg_dump command do?

pg_dump can be used to back up an entire database, then pg_restore can be used to examine the archive and/or select which parts of the database are to be restored. The most flexible output file formats are the "custom" format (-Fc) and the "directory" format (-Fd).

What is pg_dump and Pg_restore?

pg_restore is a utility for restoring a PostgreSQL database from an archive created by pg_dump in one of the non-plain-text formats. It will issue the commands necessary to reconstruct the database to the state it was in at the time it was saved.
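As a sketch of how the two fit together (the database name mydb and the file names here are placeholders, not from the question):

```shell
# Dump a single database in the "custom" format (-Fc), which pg_restore understands
pg_dump -U postgres -Fc -f mydb.dump mydb

# Inspect the archive's table of contents without restoring anything
pg_restore -l mydb.dump

# Restore only the schema (-s) into another database, selectively
pg_restore -U postgres -d mydb_copy -s mydb.dump
```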

Does pg_dump lock table?

pg_dump doesn't lock the entire database; it does, however, take an explicit lock on all the tables it is going to dump.
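You can observe those table locks by querying pg_locks from a second session while a dump is running (a hedged sketch, using the test_constraints table from the question):

```sql
-- Run this in another session while pg_dump is working;
-- pg_dump takes an ACCESS SHARE lock on each table it dumps.
SELECT l.mode, l.granted, c.relname
FROM pg_locks l
JOIN pg_class c ON c.oid = l.relation
WHERE c.relname = 'test_constraints';
```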

What is the difference between pg_dump and Pg_dumpall?

One caveat: pg_dump does not dump roles or other cluster-wide objects such as tablespaces; it dumps only a single database. To back up your entire PostgreSQL cluster, pg_dumpall is the better choice. pg_dumpall can handle the entire cluster, backing up information on roles, tablespaces, users, permissions, etc.
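For example (the output file names and the superuser name are assumptions, not from the question):

```shell
# Back up every database in the cluster, plus roles and tablespaces
pg_dumpall -U postgres -f cluster.sql

# Dump only the globals (roles, tablespaces), useful alongside per-database pg_dump backups
pg_dumpall -U postgres --globals-only -f globals.sql
```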


1 Answer

No, it is not possible. Things are done this way for a reason, actually.

The problem is that a dump is usually intended to be bulk-loaded, so you want to create indexes after loading the data. This means, effectively, you want to do it in three steps:

  1. Define the table
  2. Load the data
  3. Create the indexes

Now the problem is that certain constraints are internally enforced through indexes, so those indexes need to be created in step 3. That is, you want to create primary keys and unique constraints after the bulk load, for performance reasons.
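So for the test_constraints table from the question, a plain-format dump is laid out roughly like this (trimmed to the relevant statements; the exact output and constraint names vary by PostgreSQL version):

```sql
-- Step 1: define the table (no constraints yet)
CREATE TABLE test_constraints (
    id integer NOT NULL,
    user_id bigint NOT NULL
);

-- Step 2: bulk-load the data
COPY test_constraints (id, user_id) FROM stdin;
\.

-- Step 3: add the index-backed constraints after the load
ALTER TABLE ONLY test_constraints
    ADD CONSTRAINT test_constraints_pkey PRIMARY KEY (id);
ALTER TABLE ONLY test_constraints
    ADD CONSTRAINT user_id UNIQUE (user_id);
```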

Chris Travers answered Oct 13 '22