I am trying to run a delete-by-query to remove documents between two timestamps in an index, and I am getting a very strange result.
Here is my code:
// how the index is created
if (!es.IndexExists(indexName).Exists)
{
    es.CreateIndex(descriptor => descriptor
        .Index(indexName)
        .AddMapping<MyDocument>(m => m
            .MapFromAttributes()));
}
// event object that is mapped
public class MyDocument
{
    public long Id { get; set; }
    public long EventTime { get; set; }

    [ElasticProperty(Index = FieldIndexOption.NotAnalyzed)]
    public string EventType { get; set; }

    public DateTime CreatedAt { get; set; }
    public IDictionary<string, object> Values { get; set; }
    // snip
}
// delete call
IElasticClient es; // assume this client is initialized elsewhere
es.DeleteByQuery<MyDocument>(q => q
    .Query(rq => rq
        .Range(t => t
            .OnField("eventTime")
            .GreaterOrEquals(startTimestamp)
            .LowerOrEquals(endTimestamp))));
This throws an exception saying "An item with the same key has already been added". What am I doing wrong in this delete query that would cause this exception?
Here is a sample document from a search I ran against Elasticsearch:
{
  "took" : 8,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 96,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "testing2",
      "_type" : "mydocument",
      "_id" : "112",
      "_score" : 1.0,
      "_source":{"id":112,"eventTime":12345690,"eventDate":"1970-05-23T17:21:30-04:00","eventTypeId":0,"ready":false,"name":"","doccount":0,"wordcount":0,"createdAt":"2015-06-25T09:29:33.8996707-04:00","values":{"internal_timestamp":76890.0},"childDocuments":[],"parentDocuments":[]}
    }, /* snip */ ]
  }
}
Rob and I just worked this out.
The problem here was that my project is using Newtonsoft.Json version 7.0.0 with Nest version 1.5.1, whereas Nest 1.5.1 requires Newtonsoft.Json 6.0.1. This mismatch was causing the serialization of queries to throw exceptions.
This can be solved either by upgrading Nest to version 1.6.1 or by downgrading Newtonsoft.Json to version 6.0.1.
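For reference, this is roughly what either fix looks like in the NuGet Package Manager Console (a sketch assuming the project is managed through NuGet; exact versions should match whatever your project targets):

# Option 1: upgrade NEST to a build compatible with Json.NET 7.x
Update-Package NEST -Version 1.6.1

# Option 2: downgrade Newtonsoft.Json to the version NEST 1.5.1 was built against
Install-Package Newtonsoft.Json -Version 6.0.1

Either way, the point is that the Newtonsoft.Json assembly loaded at runtime has to be one the installed NEST version actually supports.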