
Elasticsearch bulk index in chunks using PyEs

I have a simple python script for indexing a CSV file containing 1 million rows:

import csv
from pyes import *

reader = csv.reader(open('data.csv', 'rb'))

conn = ES('127.0.0.1:9200', timeout=20.0)

counter = 0
for row in reader:
    try:
        data = {"name": row[5]}
        conn.index(data, 'namesdb', counter, bulk=True)
        counter += 1
    except:
        pass

This works quite well, but once the count gets into the thousands it all slows down dramatically.

I'm guessing if I did the index in smaller chunks ES will perform better.

Is there a more efficient way of doing this? Would a sleep() delay help? Or is there an easy way to break the CSV into smaller chunks programmatically?

Thanks.

GivP asked Jan 17 '23 16:01

2 Answers

You can adjust the bulk size when you create the ES instance. Something like this:

conn = ES('127.0.0.1:9200', timeout=20.0, bulk_size=100)

The default bulk size is 400. That is, pyes sends the bulk contents automatically once 400 documents have accumulated in the bulk. If you want to send the bulk before bulk_size is reached (e.g.: before exiting), you can call conn.flush_bulk(forced=True)
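If you'd rather control the chunking yourself, as the question asks, the batching idea can be sketched without pyes at all. The helper name `chunked` below is my own, not a pyes API; pyes's automatic bulk handling does the equivalent internally once bulk_size documents accumulate:

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from any iterable."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch

# Applied to the question's loop it would look roughly like:
#   for batch in chunked(reader, 100):
#       for row in batch:
#           conn.index({"name": row[5]}, 'namesdb', counter, bulk=True)
#           counter += 1
#   conn.flush_bulk(forced=True)  # push any remainder at the end
print([len(b) for b in chunked(range(10), 4)])  # [4, 4, 2]
```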

I'm not sure that refreshing the index manually at every Nth document would be the best bet. Elasticsearch does it automatically by default, once per second. What you can do is increase that interval. Something like this:

curl -XPUT localhost:9200/namesdb/_settings -d '{
    "index" : {
        "refresh_interval" : "3s"
    }
}'

Or, you can refresh manually, like Dragan suggested, but in that case it might make sense to disable Elasticsearch's auto-refresh by setting the interval to "-1". You don't need to refresh every X documents, though; you can refresh once after you've finished inserting all of them.
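A sketch of that disable-then-re-enable approach, in the same curl style as above (the index name `namesdb` comes from the question; run it against your own cluster):

```shell
# Disable automatic refresh before the bulk load
curl -XPUT localhost:9200/namesdb/_settings -d '{
    "index" : { "refresh_interval" : "-1" }
}'

# ... run the indexing script here ...

# Re-enable refresh and force one so the new documents become searchable
curl -XPUT localhost:9200/namesdb/_settings -d '{
    "index" : { "refresh_interval" : "1s" }
}'
curl -XPOST localhost:9200/namesdb/_refresh
```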

More details here: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/indices-update-settings.html

Please note that refreshing is quite expensive, and in my experience you're better off either:

- letting Elasticsearch do the refreshes in the background, or
- disabling refresh altogether and re-enabling it after you've finished inserting the whole batch of documents

Radu Gheorghe answered Jan 25 '23 21:01


On every Nth count, run

es.refresh()

example here
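A minimal, self-contained sketch of that counter pattern (the value of N and the loop are illustrative; es.refresh() is the actual pyes call, stubbed out here because it needs a live cluster):

```python
N = 1000        # refresh every N documents (illustrative value)
refreshes = 0
for counter in range(5000):
    # conn.index(...) from the question's loop would go here
    if counter and counter % N == 0:
        refreshes += 1  # stand-in for es.refresh()
print(refreshes)  # 4
```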

AnalyticsBuilder answered Jan 25 '23 22:01