
Navigating cursor rows in SQLite (Can we rewind/reset the cursor i.e. go back to first row for example?)

I am trying to understand how the following built-in functions work when sequentially processing cursor rows. The descriptions come from the Python 3.1 manual (using sqlite3).

Cursor.fetchone()

Fetches the next row of a query result set, returning a single sequence.

Cursor.fetchmany()

Fetches the next set of rows of a query result, returning a list.

Cursor.fetchall()

Fetches all (remaining) rows of a query result, returning a list.

So if I have a loop in which I am processing one row at a time using cursor.fetchone(), and some later code requires that I return to the first row, or fetch all rows using fetchall(), how do I do it?

The concept is a bit strange to me, especially coming from a FoxPro background, where a record pointer can be moved to the first or last row of a cursor (go top/bottom), or to the nth row (go n).

Any help would be appreciated.

Alan

Alan Harris-Reid, asked May 09 '10


1 Answer

The SQLite interface in Python 3.1 is based on PEP 249, which only specifies that cursors have to support sequential access to the records of a query result. There's no way to go back. If you need to return to a previously fetched row, you should save it when you first fetch it, e.g. create a list of the fetched data (or actually, just use fetchall). Then you can work with the list and go back and forth between rows as much as you want.
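
For example, here is a minimal sketch of saving the whole result set in a list so you can jump back to any row, FoxPro-style (the database file, table, and column names are made up for illustration):

import sqlite3

conn = sqlite3.connect("example.db")              # hypothetical database file
cursor = conn.cursor()
cursor.execute("SELECT id, name FROM people")     # hypothetical table and columns

rows = cursor.fetchall()      # pull every remaining row into a Python list

first_row = rows[0]           # "go top"
last_row  = rows[-1]          # "go bottom"
fifth_row = rows[4]           # "go 5" (lists are zero-indexed)

for row in rows:              # the list can be iterated as many times as you like
    print(row)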

The idea behind the design of the DB API is to support efficient execution of code where you only need to process each row once. A typical loop looks like this:

for a, b, c in cursor:   # the cursor itself is iterable, yielding one row per iteration
    ...                  # process the row here
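
The same loop written with explicit fetchone() calls looks like this (just a sketch; fetchone() returns None once every row has been consumed):

while True:
    row = cursor.fetchone()
    if row is None:           # no rows left in the result set
        break
    a, b, c = row
    ...                       # process the row here, then forget it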

This way, the cursor can discard the references to each row's data once the corresponding iteration of the loop is finished. It doesn't really lose anything because if you want to keep all the data, you can always make a list out of it, but applications dealing with large data sets still have the ability to process rows one at a time. It's kind of like the reasoning behind using generator expressions rather than lists when you need to process elements of an iterable one by one.
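
If the result set is too large to keep in memory but calling fetchone() once per row feels too fine-grained, fetchmany() is the middle ground. A rough sketch, again with a made-up query:

cursor.execute("SELECT id, name FROM people")     # hypothetical query
while True:
    batch = cursor.fetchmany(1000)    # up to 1000 rows per call
    if not batch:
        break                         # an empty list means the result set is exhausted
    for a, b in batch:
        ...                           # process each row once, then let the batch go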

David Z, answered Sep 19 '22