Searching through Stack Overflow, I found a large number of answers condemning the use of cursors in database programming, but I don't really understand what the alternative is.
I'm creating a program which reads a large number of rows (hundreds of thousands) from the database and keeps them in memory, for performance reasons. I can't really run a SELECT * FROM table and process all the results at once, can I?
The best approach I have found is to use a cursor and retrieve the rows in increments, for example 10 at a time.
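For reference, the incremental approach I mean looks roughly like this (table name `my_table` is just a placeholder):

```sql
BEGIN;  -- cursors live inside a transaction
DECLARE my_cur CURSOR FOR SELECT * FROM my_table;
FETCH 10 FROM my_cur;  -- first batch of 10 rows
FETCH 10 FROM my_cur;  -- next batch, resuming where the cursor left off
CLOSE my_cur;
COMMIT;
```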
Could someone enlighten me? I use PostgreSQL 9 on Linux.
Thanks
A cursor is the best option when you have to deal with large amounts of data. You could also use the LIMIT .. OFFSET .. method, but that gets slower and slower as the offset grows, because the server has to produce and discard all the rows before the offset on every query. PostgreSQL has no problems with cursors; use them when you handle large amounts of data.
SQL Server has (or had) problems with cursors, and MySQL can't use cursors outside stored routines; that might be why some DBAs dislike them.
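To illustrate the difference, here is a sketch of both approaches against the `products` table used below (column names as in the loop example; the exact table is an assumption):

```sql
-- LIMIT .. OFFSET: each later page re-reads and discards all earlier rows,
-- so page N costs O(N) on the server side.
SELECT product_id, name FROM products ORDER BY product_id LIMIT 10 OFFSET 100000;

-- Cursor: the server remembers its position, so each FETCH just continues
-- from where the previous one stopped.
BEGIN;
DECLARE prod_cur CURSOR FOR SELECT product_id, name FROM products ORDER BY product_id;
FETCH 10 FROM prod_cur;  -- rows 1-10
FETCH 10 FROM prod_cur;  -- rows 11-20, no rescan
CLOSE prod_cur;
COMMIT;
```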
You can simply use a FOR loop with a record variable:
do
$$
declare
    r record;
begin
    for r in select product_id, name from products loop
        raise notice '% %', r.product_id, r.name;
    end loop;
end
$$;