Elasticsearch bulk indexing using Python

I am trying to index a CSV file with 6M records into Elasticsearch using the Python pyes module. The code reads the file record by record and pushes each one to Elasticsearch individually. Any idea how I can send these as a bulk request?

import csv
from pyes import ES

header = ['col1','col2','col3','col3', 'col4', 'col5', 'col6']

conn = ES('xx.xx.xx.xx:9200')

reader = csv.reader(open('accidents.csv', 'rb'))  # placeholder path to the 6M-record csv

counter = 0

for row in reader:
    #print len(row)
    if counter >= 0:
        if counter == 0:
            pass
        else:
            colnum = 0
            data = {}
            for j in row:
                data[header[colnum]] = str(j)
                colnum += 1
            print data
            print counter
            conn.index(data, 'accidents-index', 'accidents-type', counter)
    else:
        break

    counter += 1
krisdigitx asked Oct 09 '13


1 Answer

pyelasticsearch supports bulk indexing:

bulk_index(index, doc_type, docs, id_field='id', parent_field='_parent'[, other kwargs listed below])

For example,

cities = []
for line in f:  # f is an open file of tab-separated lines
    fields = line.rstrip().split("\t")
    city = {"id": fields[0], "city": fields[1]}
    cities.append(city)  # accumulate the doc, not the list itself
    if len(cities) == 1000:  # send a full batch of 1000 docs
        es.bulk_index(es_index, "city", cities, id_field="id")
        cities = []
if len(cities) > 0:  # flush the final partial batch
    es.bulk_index(es_index, "city", cities, id_field="id")
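Applied to the CSV from the question, the same batching pattern might look like the sketch below. The `csv_to_batches` helper, the 500-doc batch size, and the `accidents.csv` path are my own assumptions, not from the question; the header list, index name, and the idea of using the row counter as the document id come from the original snippet.

```python
import csv

HEADER = ['col1', 'col2', 'col3', 'col3', 'col4', 'col5', 'col6']

def csv_to_batches(lines, batch_size=500):
    """Yield lists of at most batch_size docs built from csv lines.

    The first row is skipped, mirroring the header-skip logic in the
    question's loop; each doc carries an "id" taken from the row number
    so bulk_index(..., id_field="id") can use it.
    """
    reader = csv.reader(lines)
    batch = []
    for rownum, row in enumerate(reader):
        if rownum == 0:          # skip the header row
            continue
        doc = dict(zip(HEADER, row))
        doc["id"] = str(rownum)  # stable id, like `counter` in the question
        batch.append(doc)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:                    # flush the final partial batch
        yield batch

# Against a live cluster, each batch would then go through bulk_index:
#
#   for batch in csv_to_batches(open('accidents.csv')):
#       es.bulk_index('accidents-index', 'accidents-type', batch, id_field='id')
```

Sending a few hundred docs per request instead of one `index` call per row cuts 6M HTTP round-trips down to a few thousand, which is where most of the speedup comes from.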
kielni answered Sep 30 '22