I am having a data insertion problem with tables linked by foreign keys. I have read in a few places that there is a WITH command that helps in these situations, but I don't quite understand how it is used.
I would like to fill four related tables that together make up one record, with all of the data inserted at once, in a single query, and with the generated keys associated in the last table to make future queries easier. Here is the code for creating the tables:
CREATE TABLE participante
(
id serial NOT NULL,
nome character varying(56) NOT NULL,
CONSTRAINT participante_pkey PRIMARY KEY (id)
);
CREATE TABLE venda
(
id serial NOT NULL,
inicio date NOT NULL,
CONSTRAINT venda_pkey PRIMARY KEY (id)
);
CREATE TABLE item
(
id serial NOT NULL,
nome character varying(256) NOT NULL,
CONSTRAINT item_pkey PRIMARY KEY (id)
);
CREATE TABLE lances_vendas
(
id serial NOT NULL,
venda_id integer NOT NULL,
item_id integer NOT NULL,
participante_id integer NOT NULL,
valor numeric NOT NULL,
CONSTRAINT lances_vendas_pkey PRIMARY KEY (id),
CONSTRAINT lances_vendas_venda_id_fkey FOREIGN KEY (venda_id)
REFERENCES venda (id),
CONSTRAINT lances_vendas_item_id_fkey FOREIGN KEY (item_id)
REFERENCES item (id),
CONSTRAINT lances_vendas_participante_id_fkey FOREIGN KEY (participante_id)
REFERENCES participante (id)
);
The idea is to write WITH clauses (common table expressions) that contain INSERT ... RETURNING to return the generated keys. These "views for a single query" can then be used to insert those keys into the referencing table:
WITH par_key AS
(INSERT INTO participante (nome) VALUES ('Laurenz') RETURNING id),
ven_key AS
(INSERT INTO venda (inicio) VALUES (current_date) RETURNING id),
item_key AS
(INSERT INTO item (nome) VALUES ('thing') RETURNING id)
INSERT INTO lances_vendas (venda_id, item_id, participante_id, valor)
SELECT ven_key.id, item_key.id, par_key.id, numeric '3.1415'
FROM par_key, ven_key, item_key;
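As a side note, since the stated goal is to make future queries easier, the assembled record can later be read back with plain joins. This is just an illustrative query against the schema above; the aliases are made up:

-- Illustrative follow-up query (not from the original post):
-- read each bid back together with its sale, item and participant.
SELECT lv.id, v.inicio, i.nome AS item, p.nome AS participante, lv.valor
FROM lances_vendas lv
JOIN venda v ON v.id = lv.venda_id
JOIN item i ON i.id = lv.item_id
JOIN participante p ON p.id = lv.participante_id;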
I know that you requested a single query, but you may still want to consider using a transaction:
BEGIN;
INSERT INTO participante (nome) VALUES ('Laurenz');
INSERT INTO venda (inicio) VALUES (current_date);
INSERT INTO item (nome) VALUES ('thing');
INSERT INTO lances_vendas (venda_id, item_id, participante_id, valor)
VALUES (currval('venda_id_seq'), currval('item_id_seq'), currval('participante_id_seq'), 3.1415);
COMMIT;
Note that currval('X') is session-local: it returns the value most recently generated for that sequence in your own session, so concurrent inserts into participante, venda and item by other sessions cannot disturb it. The transaction guarantees that either all four rows are inserted or none of them are.
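If you prefer not to hard-code the sequence names, pg_get_serial_sequence() can derive them from the table and column names. A sketch of the same final INSERT using it, with the same placeholder values as above:

-- Same INSERT as above, with sequence names looked up instead of hard-coded.
INSERT INTO lances_vendas (venda_id, item_id, participante_id, valor)
VALUES (currval(pg_get_serial_sequence('venda', 'id')),
        currval(pg_get_serial_sequence('item', 'id')),
        currval(pg_get_serial_sequence('participante', 'id')),
        3.1415);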