I am new to Elasticsearch and have a large amount of data (more than 16k rows in a MySQL table). I need to push this data into Elasticsearch and am facing problems indexing it. Is there a way to make indexing faster? How should I deal with this much data?
You will make a POST request to the /_bulk endpoint. Your payload will follow this format, where \n is the newline character:
action_and_meta_data\n
optional_source\n
action_and_meta_data\n
optional_source\n
...
Make sure your JSON is not pretty-printed. The available actions are index, create, update and delete.
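For illustration, here is a minimal Python sketch that builds such a payload and POSTs it to the bulk endpoint. The host, index name, ids and field values are placeholder assumptions rather than values from your data, and the _type field follows the example below (it is no longer used on newer Elasticsearch versions). Recent versions expect the application/x-ndjson content type and a trailing newline at the end of the body.

import json
import requests

# Placeholder cluster URL -- adjust to your setup.
ES_URL = "http://localhost:9200/_bulk"

# One action/metadata line followed by one source line per document.
lines = [
    {"index": {"_index": "test", "_type": "type1", "_id": "1"}},
    {"field1": "value1"},
    {"index": {"_index": "test", "_type": "type1", "_id": "2"}},
    {"field1": "value2"},
]

# Join with newlines (no pretty printing) and end with a final newline,
# which the bulk endpoint requires.
payload = "\n".join(json.dumps(line) for line in lines) + "\n"

resp = requests.post(ES_URL, data=payload,
                     headers={"Content-Type": "application/x-ndjson"})
print(resp.json())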
To answer your question: if you just want to bulk load data into your index, a single action/source pair looks like this.
{ "create" : { "_index" : "test", "_type" : "type1", "_id" : "3" } }
{ "field1" : "value3" }
The first line contains the action and metadata. In this case, we are calling create. We will be inserting a document of type type1 into the index named test, with a manually assigned id of 3 (instead of letting Elasticsearch auto-generate one). The second line contains all the fields in your mapping, which in this example is just field1 with a value of value3.
You will just concatenate as many of these pairs as you'd like to insert into your index.
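Since your rows live in MySQL, one way to push all of them is to stream the result set through the bulk helper of the official Python client, which batches the documents into /_bulk requests for you instead of sending one request per row. This is only a sketch: the pymysql and elasticsearch libraries, connection settings, table and column names below are assumptions you would replace with your own.

import pymysql
from elasticsearch import Elasticsearch, helpers

# Placeholder connection settings -- replace with your own.
conn = pymysql.connect(host="localhost", user="root", password="secret",
                       database="mydb", cursorclass=pymysql.cursors.DictCursor)
es = Elasticsearch("http://localhost:9200")

def generate_actions():
    # Stream rows from the (hypothetical) my_table and turn each one
    # into a bulk action dict.
    with conn.cursor() as cursor:
        cursor.execute("SELECT id, field1 FROM my_table")
        for row in cursor:
            yield {
                "_index": "test",
                "_id": row["id"],
                "_source": {"field1": row["field1"]},
            }

# helpers.bulk sends the documents in batches, which is what makes
# indexing 16k+ rows reasonably fast.
helpers.bulk(es, generate_actions())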