I have to store some messages in Elasticsearch, integrated with my Python program. Currently, what I do to store a message is:
d = {"message": "this is message"}
for index_nr in range(1, 5):
    ElasticSearchAPI.addToIndex(index_nr, d)
    print d
That means if I have 10 messages, I have to repeat my code 10 times. So instead I want to make a script file or batch file. I've checked the Elasticsearch Guide, and it is possible to use the Bulk API. The format should be something like below:
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "delete" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "create" : { "_index" : "test", "_type" : "type1", "_id" : "3" } }
{ "field1" : "value3" }
{ "update" : {"_id" : "1", "_type" : "type1", "_index" : "index1"} }
{ "doc" : {"field2" : "value2"} }
What I did is:
{"index":{"_index":"test1","_type":"message","_id":"1"}}
{"message":"it is red"}
{"index":{"_index":"test2","_type":"message","_id":"2"}}
{"message":"it is green"}
I also used the curl tool to store the docs:
$ curl -s -XPOST localhost:9200/_bulk --data-binary @message.json
Now I want to use my Python code to store the file in Elasticsearch.
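One way to do that (a minimal sketch, assuming the elasticsearch-py client is installed, the cluster is reachable on localhost:9200 as in the curl example, and message.json is the newline-delimited bulk file shown above) is to read the file and hand its contents to the client's bulk method:

from elasticsearch import Elasticsearch

# connect to the local cluster (assumes localhost:9200)
es = Elasticsearch()

# read the newline-delimited bulk file prepared above
with open("message.json") as f:
    bulk_body = f.read()

# send the whole payload as a single bulk request
response = es.bulk(body=bulk_body)
print(response["errors"])  # False if every action succeeded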
Loading a CSV into Elasticsearch with Python: import the Elasticsearch client and the helpers functions from the elasticsearch package, and also import the csv module. Create the Elasticsearch client, which connects to the cluster. Then open the CSV file with the csv module's DictReader and bulk upload the rows to Elasticsearch, as in the sketch below.
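A minimal sketch of that flow, assuming a hypothetical tickets.csv file with a header row; the index and type names simply reuse the tickets-index / tickets example further down:

import csv
from elasticsearch import Elasticsearch, helpers

# client connecting to the default localhost:9200
es = Elasticsearch()

# each row from DictReader becomes one document's _source;
# doc_type can be omitted on Elasticsearch 7.0+, where types are deprecated
with open("tickets.csv") as f:
    reader = csv.DictReader(f)
    helpers.bulk(es, reader, index="tickets-index", doc_type="tickets")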
doc_type – The type of the document; deprecated and optional starting with 7.0.
_source – True or false to return the _source field or not, or a list of fields to return.
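For illustration, both parameters can be passed to the Python client's get call (a sketch assuming the test1 index, type message, and document id 1 created above; drop doc_type on a 7.0+ cluster):

from elasticsearch import Elasticsearch

es = Elasticsearch()

# fetch one document; _source limits which fields are returned
doc = es.get(index="test1", doc_type="message", id=1, _source=["message"])
print(doc["_source"])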
from datetime import datetime
from elasticsearch import Elasticsearch
from elasticsearch import helpers

es = Elasticsearch()

# one action dict per document; _source holds the document body
actions = [
    {
        "_index": "tickets-index",
        "_type": "tickets",
        "_id": j,
        "_source": {
            "any": "data" + str(j),
            "timestamp": datetime.now()
        }
    }
    for j in range(0, 10)
]

# send all actions in a single bulk call
helpers.bulk(es, actions)
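To check the result of the bulk call (a short sketch continuing with the same es client and the tickets-index example above), refresh the index so the new documents are visible to search, then count them:

# make the freshly indexed documents searchable, then count them
es.indices.refresh(index="tickets-index")
print(es.count(index="tickets-index")["count"])  # expected: 10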