
Committing transactions while executing a PostgreSQL function

I have a PostgreSQL function that has to INSERT about 1.5 million rows into a table. I want to see the table being populated as each record is inserted. Currently, when I try with about 1,000 records, the table is only populated after the entire function has finished executing. If I stop the function halfway through, no data is persisted at all. How can I make the records commit even if I stop after a certain number of rows have been inserted?

Asked Mar 12 '14 by Yousuf Sultan


1 Answer

This can be done using dblink. The example below commits a single insert; you will need to add your loop logic and commit on every iteration (or every batch). See http://www.postgresql.org/docs/9.3/static/contrib-dblink-connect.html for details on dblink_connect.

CREATE OR REPLACE FUNCTION log_the_dancing(ip_dance_entry text)
  RETURNS INT AS
$BODY$
DECLARE
BEGIN
    -- Open a second connection back into the same server; statements sent
    -- over it commit independently of the transaction calling this function.
    PERFORM dblink_connect('dblink_trans','dbname=sandbox port=5433 user=postgres');
    PERFORM dblink('dblink_trans','INSERT INTO dance_log(dance_entry) SELECT ' || '''' || ip_dance_entry || '''');
    PERFORM dblink('dblink_trans','COMMIT;');
    PERFORM dblink_disconnect('dblink_trans');
    RETURN 0;
END;
$BODY$
  LANGUAGE plpgsql VOLATILE
  COST 100;

ALTER FUNCTION log_the_dancing(ip_dance_entry text)
  OWNER TO postgres;

BEGIN TRANSACTION;
  select log_the_dancing('The Flamingo');
  select log_the_dancing('Break Dance');
  select log_the_dancing('Cha Cha');
ROLLBACK TRANSACTION;

-- Show records committed even though we rolled back the outer transaction
select * from dance_log;
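If you want the per-batch commits described above, a minimal sketch of the looping version could look like the following. It uses dblink_exec (shipped with the dblink extension, intended for statements that return no rows); the table target_table, its val column, the connection string, and the batch size of 1000 are all assumptions for illustration, so adapt them to your schema.

CREATE OR REPLACE FUNCTION bulk_insert_in_batches()
  RETURNS void AS
$BODY$
DECLARE
    i integer := 0;
BEGIN
    -- Separate connection back into the same database; work sent over it
    -- commits independently of the transaction running this function.
    PERFORM dblink_connect('dblink_trans', 'dbname=sandbox port=5433 user=postgres');

    WHILE i < 1500000 LOOP
        -- Start an explicit transaction on the remote connection for this batch.
        PERFORM dblink_exec('dblink_trans', 'BEGIN;');

        FOR j IN 1..1000 LOOP
            EXIT WHEN i >= 1500000;
            -- Hypothetical insert; replace with your real INSERT statement.
            PERFORM dblink_exec('dblink_trans',
                'INSERT INTO target_table(val) VALUES (' || i || ')');
            i := i + 1;
        END LOOP;

        -- Commit the batch; these rows remain even if the outer call is cancelled later.
        PERFORM dblink_exec('dblink_trans', 'COMMIT;');
    END LOOP;

    PERFORM dblink_disconnect('dblink_trans');
END;
$BODY$
LANGUAGE plpgsql VOLATILE;

Calling select bulk_insert_in_batches(); should then leave every completed batch visible in target_table even if you cancel the call partway through, because the remote connection has already committed them.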
Answered Oct 14 '22 by Kuberchaun