I'm currently working on a search engine project. We are working with Python + MongoDB.
I have a pymongo cursor after executing a find() command against MongoDB. The cursor has around 20k results.
I have noticed that iterating over the pymongo cursor is really slow compared with a normal iteration over, say, a list of the same size.
I did a little benchmark:
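(The original benchmark snippet isn't shown here; a minimal sketch of the kind of comparison described, with an assumed connection and collection name, might look like this.)

import time
from pymongo import MongoClient

# Assumed connection details and collection name, purely for illustration.
db = MongoClient("localhost", 27017).search_engine

# Materialize the same results into a list, then time iterating
# the live cursor vs. the pre-fetched list.
docs = list(db.documents.find())
cursor = db.documents.find()

start = time.time()
for doc in cursor:
    pass
print("cursor iteration: %.3fs" % (time.time() - start))

start = time.time()
for doc in docs:
    pass
print("list iteration:   %.3fs" % (time.time() - start))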
The difference is huge. Maybe it's not a problem with this amount of results, but with millions of results the time would be unacceptable.
Does anyone have an idea of why pymongo cursors are so slow to iterate? Any idea of how I can iterate the cursor in less time?
Some extra info:
Remember that the pymongo driver is not giving you back all 20k results at once. It is making network calls to the MongoDB backend for more items as you iterate, so of course it won't be as fast as a list of strings. However, I'd suggest trying to adjust the cursor's batch_size as outlined in the API docs.
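A minimal sketch of what that looks like (the connection details and collection name are assumptions):

from pymongo import MongoClient

db = MongoClient("localhost", 27017).search_engine

# Fetch documents in larger batches so fewer network round trips
# are made while iterating.
cursor = db.documents.find().batch_size(1000)
for doc in cursor:
    pass  # replace with your own processing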
Is your pymongo installation using the included C extensions?
>>> import pymongo
>>> pymongo.has_c()
True
I spent most of last week trying to debug a moderate-sized query and its corresponding processing, which took 20 seconds to run. Once the C extensions were installed, the same process took roughly a second.
To install the C extensions on Debian, install the Python development headers before running easy_install. In my case, I also had to remove the old version of pymongo first. Note that this compiles a binary from C, so you need all the usual build tools (GCC, etc.).
# on ubuntu with pip
$ sudo pip uninstall pymongo
$ sudo apt-get install python-dev build-essential
$ sudo pip install pymongo
The default batch size is 4 MB, and the maximum it can go to is 16 MB. You can try increasing your batch size until that limit is reached and see if you get an improvement, but it also depends on what your network can handle.
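A quick way to check whether this helps on your setup is to time the same query at a few batch sizes (a sketch only; the connection and collection name are assumptions):

import time
from pymongo import MongoClient

db = MongoClient("localhost", 27017).search_engine

# Time a full iteration of the same query at several batch sizes.
for size in (100, 1000, 5000, 10000):
    start = time.time()
    for doc in db.documents.find().batch_size(size):
        pass
    print("batch_size=%5d: %.3fs" % (size, time.time() - start))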