I have a large set of files (HDF) that I need to enable search for. For Java I would use Lucene for this, as it's a file and document indexing engine. I don't know what the Python equivalent would be, though.
Can anyone recommend which library I should use for indexing a large collection of files for fast search? Or is the preferred way to roll your own?
I have looked at PyLucene and Lupy, but both projects seem rather inactive and unsupported, so I am not sure if I should rely on them.
Final notes: Whoosh and PyLucene seem promising, but Whoosh is still in alpha so I am not sure I want to rely on it, and I have had problems compiling PyLucene, and there are no actual releases of it. After looking a bit more at the data, it's mostly numbers and default text strings, so as of now an indexing engine won't help me. Hopefully these libraries will stabilize and later visitors will find some use for them.
Lupy has been retired and the developers recommend PyLucene instead. As for PyLucene, its mailing-list activity may be low, but it is definitely supported. In fact, it recently became an official Apache subproject.
You may also want to look at a new contender: Whoosh. It's similar to Lucene, but implemented in pure Python.
I haven't done indexing before, but the following may be helpful:
As far as using HDF files goes, I have heard of a module called h5py.
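To give an idea of what h5py offers (assuming h5py and NumPy are installed; the file and dataset names below are made up), reading and writing HDF5 data looks like this:

```python
# Sketch of basic h5py usage: write a dataset, then read it back.
# "example.h5" and the "readings" dataset name are illustrative only.
import numpy as np
import h5py

# Write a small array into an HDF5 file.
with h5py.File("example.h5", "w") as f:
    f.create_dataset("readings", data=np.arange(10))

# Read the dataset back into memory as a NumPy array.
with h5py.File("example.h5", "r") as f:
    data = f["readings"][:]

print(data.tolist())
```

You could walk the datasets in each file this way and feed any text fields into whatever indexing library you settle on.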
I hope this helps.