Is there any way to import a JSON file (containing 100 documents) into an Elasticsearch server? I want to import a big JSON file into es-server.
Elasticsearch only supports JSON, and for loading many documents at once you should use the bulk API. If your file is in some other shape, you need to transform it first. You can use Logstash or whatever other system (even your own code).
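For reference, the bulk API expects newline-delimited JSON: a metadata (action) line before each document, and a trailing newline at the end of the body. A minimal sketch of what a bulk request looks like (the index name myindex and type mytype are placeholders; newer Elasticsearch versions no longer use _type):

    POST /_bulk
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"foo":"bar"}
    {"index":{"_index":"myindex","_type":"mytype"}}
    {"baz":"qux"}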
As dadoonet already mentioned, the bulk API is probably the way to go. To transform your file for the bulk protocol, you can use jq.
Assuming the file contains just the documents themselves:
$ echo '{"foo":"bar"}{"baz":"qux"}' | jq -c ' { index: { _index: "myindex", _type: "mytype" } }, . ' {"index":{"_index":"myindex","_type":"mytype"}} {"foo":"bar"} {"index":{"_index":"myindex","_type":"mytype"}} {"baz":"qux"}
And if the file contains the documents in a top-level list, they have to be unwrapped first:
$ echo '[{"foo":"bar"},{"baz":"qux"}]' | jq -c ' .[] | { index: { _index: "myindex", _type: "mytype" } }, . ' {"index":{"_index":"myindex","_type":"mytype"}} {"foo":"bar"} {"index":{"_index":"myindex","_type":"mytype"}} {"baz":"qux"}
jq's -c flag makes sure that each document ends up on a line by itself.

If you want to pipe straight to curl, you'll want to use --data-binary @-, and not just -d, otherwise curl will strip the newlines again.
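Putting the two together, a pipeline along these lines should do it, assuming Elasticsearch is listening on localhost:9200 and your documents live in a top-level list in a file called documents.json (both names are just placeholders):

    $ jq -c '.[] | { index: { _index: "myindex", _type: "mytype" } }, .' documents.json |
      curl -s -XPOST 'http://localhost:9200/_bulk' \
           -H 'Content-Type: application/x-ndjson' \
           --data-binary @-

The Content-Type header is required on more recent Elasticsearch versions; older ones accept the request without it.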