I have been trying to put data into Elasticsearch through Java using the following code:
String url = "http://localhost:9200/testindex2/test/2";
HttpClient client = new DefaultHttpClient();
HttpPut put = new HttpPut(url);
JSONObject json = new JSONObject();
json.put("email", "[email protected]");
json.put("first_name", "abc");
StringEntity se = new StringEntity("JSON: " + json.toString());
se.setContentEncoding(new BasicHeader(HTTP.CONTENT_TYPE,"Text"));
put.setEntity(se);
HttpResponse response = client.execute(put);
System.out.println("\nSending 'PUT' request to URL : " + url);
System.out.println("Put parameters : " + put.getEntity());
System.out.println("Response Code : " + response.getStatusLine().getStatusCode());
BufferedReader rd = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
StringBuffer result = new StringBuffer();
String line = "";
while ((line = rd.readLine()) != null) {
    result.append(line);
}
System.out.println(result.toString());
And I am getting the following error:
Sending 'PUT' request to URL : http://localhost:9200/testindex2/test/2
Put parameters : [Content-Type: text/plain; charset=ISO-8859-1,Content-Encoding: Text,Content-Length: 52,Chunked: false]
Response Code : 400
{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"failed to parse"}],"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"not_x_content_exception","reason":"Compressor detection can only be called on some xcontent bytes or compressed xcontent bytes"}},"status":400}
Also, when I send the same request from a REST client it works just fine; I'm not sure why this problem is happening.
Starting from Elasticsearch 6.0, all REST requests that include a body must also provide the correct content type for that body; strict content-type checking is one of the changes the Elasticsearch engineering team shipped in 6.0. The Index API used in this example does not support a content type of text/plain, so Elasticsearch 6.0 will reject the request without indexing anything. The fix is to send a valid content type such as application/json.

The failure surfaces as a mapper_parsing_exception, one of the recurring errors most Elasticsearch users encounter at one point or another. Elasticsearch relies on mappings, also known as schema definitions, to handle data properly according to its correct data type. In this case the body being sent is literally "JSON: {...}"; the "JSON: " prefix means the bytes are not valid JSON (or any other supported x-content format), which is exactly what the nested not_x_content_exception is complaining about.

Strict content-type checking also closes a security gap. By default, an Elasticsearch server does not include any CORS headers in its responses, so a cross-origin request from a web page fails and the page is prevented from seeing the result of its POST. But by then the damage has already been done: the request was still sent to the Elasticsearch cluster and the document has been stored. Rejecting bodies with the wrong content type stops a plain HTML form, which can only declare a handful of content types, from triggering such writes.
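To illustrate, here is a minimal sketch that sends the same document with the correct content type. It uses only java.net.HttpURLConnection from the standard library, so no extra dependencies are assumed; the URL and field values mirror the question:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Inside a method that declares "throws Exception":
URL url = new URL("http://localhost:9200/testindex2/test/2");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("PUT");
// The crucial part: declare the body as JSON so Elasticsearch will accept it.
conn.setRequestProperty("Content-Type", "application/json");
conn.setDoOutput(true);
byte[] body = "{\"email\":\"[email protected]\",\"first_name\":\"abc\"}"
        .getBytes(StandardCharsets.UTF_8);
try (OutputStream os = conn.getOutputStream()) {
    os.write(body);
}
System.out.println("Response Code : " + conn.getResponseCode());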
I replaced
StringEntity se = new StringEntity("JSON: " + json.toString());
se.setContentEncoding(new BasicHeader(HTTP.CONTENT_TYPE,"Text"));
with this:
StringEntity se = new StringEntity(json.toString(),ContentType.APPLICATION_JSON);
and it's working now. Note that the new version both drops the "JSON: " prefix (which made the body invalid JSON) and declares the application/json content type.
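For reference, here is the complete corrected version with the imports it needs. This is a sketch assuming Apache HttpClient 4.x and org.json, which match the class names in the question (DefaultHttpClient is deprecated since HttpClient 4.3; HttpClients.createDefault() is the modern replacement):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPut;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.DefaultHttpClient;
import org.json.JSONObject;

String url = "http://localhost:9200/testindex2/test/2";
HttpClient client = new DefaultHttpClient();
HttpPut put = new HttpPut(url);

JSONObject json = new JSONObject();
json.put("email", "[email protected]");
json.put("first_name", "abc");

// Send the raw JSON body (no "JSON: " prefix) with an application/json content type.
StringEntity se = new StringEntity(json.toString(), ContentType.APPLICATION_JSON);
put.setEntity(se);

HttpResponse response = client.execute(put);
System.out.println("Response Code : " + response.getStatusLine().getStatusCode());

BufferedReader rd = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
StringBuilder result = new StringBuilder();
String line;
while ((line = rd.readLine()) != null) {
    result.append(line);
}
System.out.println(result.toString());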
Elasticsearch has a dedicated client for working with Java, so you don't need to generate the JSON and assemble the HTTP request manually. Also, you didn't show your import section, which makes it a bit hard to tell which libraries you are using.
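For example, with the REST high-level client that shipped alongside Elasticsearch 6.x, indexing the question's document looks roughly like this (a sketch; the index, type, and id mirror the question, and client.index(IndexRequest) is the 6.x signature, deprecated in later versions in favour of a variant taking RequestOptions):

import org.apache.http.HttpHost;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;

RestHighLevelClient client = new RestHighLevelClient(
        RestClient.builder(new HttpHost("localhost", 9200, "http")));

// Field/value pairs; the client serializes them to JSON and sets
// the correct Content-Type header for you.
IndexRequest request = new IndexRequest("testindex2", "test", "2")
        .source("email", "[email protected]", "first_name", "abc");
IndexResponse response = client.index(request);
System.out.println(response.getResult());

client.close();

Because the client builds and labels the request body itself, the content-type problem above cannot occur.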