
What is the alternative to cursors to retrieve large amounts of data from a database?

Searching through Stack Overflow, I found a large number of answers condemning the use of cursors in database programming. However, I don't really understand what the alternative is.

I'm creating a program that reads a large number of rows (hundreds of thousands) from the database and keeps them in memory for performance reasons. I can't really run a SELECT * FROM table and process all the results at once, can I?

The best way I have found is to use cursors and retrieve the rows in increments, for example 10 at a time.

Could someone enlighten me? I use PostgreSQL 9 on Linux.

Thanks

Asked by nib0

2 Answers

A cursor is the best option when you have to deal with large amounts of data. You could also use the LIMIT .. OFFSET approach, but that gets slower and slower as the offset grows, because each query has to scan and discard all the rows before the offset. PostgreSQL has no problems with cursors; use them when you handle large amounts of data.
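For illustration, a minimal sketch of batched fetching with an explicit cursor (the products table and the batch size are hypothetical):

BEGIN;  -- cursors live inside a transaction
DECLARE product_cur CURSOR FOR
    SELECT product_id, name FROM products;
FETCH 1000 FROM product_cur;  -- first batch of 1000 rows
FETCH 1000 FROM product_cur;  -- next batch; repeat until no rows come back
CLOSE product_cur;
COMMIT;

Each FETCH returns the next slice of the result set, so the client never has to hold the whole table in memory, and the server does not have to re-scan skipped rows the way an OFFSET query would.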

SQL Server has (or had) performance problems with cursors, and MySQL only supports cursors inside stored programs; that might be why some DBAs dislike cursors.

Answered by Frank Heikens


You can simply use a FOR loop with a record variable:

do
$$
declare
    r record;
begin
    -- loop over the query results one row at a time
    for r in select product_id, name from products loop
        raise notice '% %', r.product_id, r.name;
    end loop;
end
$$;
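PL/pgSQL runs this kind of FOR loop through an internal cursor, so the rows are fetched as the loop progresses rather than the whole result set being materialized in memory first.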
Answered by Michael Buen