I have a large DBF file (~700 MB) and would like to select only a few rows from it using a Python script. I've seen that dbfpy is a nice module for opening this type of database, but so far I haven't found any querying capability in it. Iterating over all the records from Python is simply too slow.
Can I do what I want from Python in a reasonable time?
Using my dbf module, you can create temporary indexes and then search using those:
import dbf

table = dbf.Table('big.dbf')
table.open()  # the table must be opened before it can be read or indexed
index = table.create_index(lambda rec: rec.field)  # 'field' should be the actual field name
records = index.search(match=('value',))
table.close()
Creating the index may take a few seconds, but every search after that is extremely quick.
Chances are, your performance is more I/O-bound than CPU-bound. If so, the best way to speed things up is to optimize your search: build some kind of index keyed by whatever your search predicate is, so each lookup avoids a full scan of the file.
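As a minimal sketch of that idea (using plain dicts to stand in for DBF records, since the exact reader API varies by library), you pay the cost of one full pass to build the index, and every lookup afterwards is a dict access:

```python
from collections import defaultdict

# Stand-in records; in practice these would come from iterating the DBF once
records = [
    {"name": "alice", "city": "Paris"},
    {"name": "bob", "city": "London"},
    {"name": "carol", "city": "Paris"},
]

# Build the index in a single pass, keyed by the field you search on
index = defaultdict(list)
for rec in records:
    index[rec["city"]].append(rec)

# Later lookups are dictionary accesses instead of full scans
paris_records = index["Paris"]  # → alice's and carol's records
```

For a 700 MB file the index itself can be kept small by storing only record numbers (or file offsets) instead of whole records, then seeking back into the file for the few matches.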
If you are using Windows, you can use the odbc module in combination with the Visual FoxPro ODBC Driver and let the driver do the querying for you.
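A rough sketch of that approach, assuming the pyodbc package is installed (the older win32 odbc module works similarly) and that the standard DSN-less connection string for the FoxPro driver applies; the directory path, table name, and field name below are placeholders:

```python
def foxpro_conn_str(dbf_dir):
    # Assumed DSN-less string for the Visual FoxPro ODBC Driver;
    # for free tables, SourceDB points at the directory holding the .dbf files
    return (
        "Driver={Microsoft Visual FoxPro Driver};"
        "SourceType=DBF;"
        f"SourceDB={dbf_dir};"
        "Exclusive=No;"
    )

def fetch_matches(dbf_dir, value):
    import pyodbc  # assumption: pyodbc; requires Windows and the FoxPro driver
    with pyodbc.connect(foxpro_conn_str(dbf_dir)) as conn:
        cur = conn.cursor()
        # the table name is the DBF file name without its extension
        cur.execute("SELECT * FROM big WHERE field = ?", value)
        return cur.fetchall()
```

The SQL `WHERE` clause is then evaluated by the driver rather than in a Python loop, which is usually much faster than iterating the records yourself.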