I use the Mongoose driver in Node.js. My schema:
let sendResultSchema = mongoose.Schema({
emailId: String, email: String,
letterId: String, sendedFrom: String,
resultMsg: String, owner: String,
created: Date, result: Boolean,
tag: String, tryNum: Number,
clickHash: String, links: [String]
})
sendResultSchema.index({emailId: 1, letterId: 1, result: 1, owner: 1, tag: 1, clickHash: 1})
let sendResultModel = mongoose.model('sendresult', sendResultSchema)
The sendresult collection has about 641,000 documents.
This query executes in ~0.5 seconds:
db.sendresults.find({"tag" : "tagValue", "letterId" : "5ad630b5949bb02ea07d15d1"}).sort({emailId: -1}).limit(1)
I think it should execute faster. You can see the explain output of this query here.
How can I make this query faster?
The index would need to cover all parts of the query (the equality part, the sort part, and the range part). This is because in a typical find()
query, MongoDB uses only one index: it generally won't use one index for the equality part and another index for the sort part.
In general, the sequence of fields in the index needs to follow the pattern of equality -> sort -> range.
This is described in detail in Optimizing MongoDB Compound Indexes.
For your query, the equality part is tag: ..., letterId: ... and the sort part is emailId: -1. There is no range part in your query.
Using this pattern, the compound index you need is:
db.test.createIndex({tag:1, letterId:1, emailId:-1})
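The equality -> sort -> range ordering can be sketched as a tiny helper. Note that buildEsrIndex is purely illustrative, not a MongoDB or Mongoose API:

```javascript
// Illustration of the equality -> sort -> range index ordering.
// buildEsrIndex is a hypothetical helper, not part of any driver.
function buildEsrIndex(equalityFields, sortSpec, rangeFields) {
  const index = {};
  for (const f of equalityFields) index[f] = 1;                     // equality first
  for (const [f, dir] of Object.entries(sortSpec)) index[f] = dir;  // then sort
  for (const f of rangeFields) index[f] = 1;                        // range last
  return index;
}

// For the query in the question: equality on tag and letterId,
// sort on emailId descending, no range predicate.
console.log(buildEsrIndex(['tag', 'letterId'], { emailId: -1 }, []));
// { tag: 1, letterId: 1, emailId: -1 }
```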
Let's try to confirm how much performance improvement we can get using this index.
To confirm the suitability of the index, I inserted 1 million records into a test database using mgeneratejs, a tool that generates random documents from a template. Based on your example, the mgeneratejs template I'm using is:
$ cat template.json
{
"emailId": "$hash",
"email": "$email",
"letterId": "$hash",
"sendedFrom": "$email",
"resultMsg": "$word",
"owner": "$name",
"created": "$date",
"result": "$bool",
"tag": "$word",
"tryNum": {"$integer": {"min": 0, "max": 1e3}},
"clickHash": "$word",
"links": {"$array": {"of": "$url", "number": {"$integer": {"min": 1, "max": 5}}}}
}
and imported 1 million random documents into MongoDB:
$ mgeneratejs template.json -n 1000000 | mongoimport -d test -c test
Then I created the index you have, tried to find a non-existent document, and gathered 10 runs of the query with the collection containing only this index:
> db.test.createIndex({emailId: 1, letterId: 1, result: 1, owner: 1, tag: 1, clickHash: 1})
> db.test.find({"tag" : "xyz", "letterId" : "abc"}).sort({emailId: -1}).limit(1)
Fetched 0 record(s) in 3069ms
Fetched 0 record(s) in 2924ms
Fetched 0 record(s) in 2923ms
Fetched 0 record(s) in 3013ms
Fetched 0 record(s) in 2917ms
Fetched 0 record(s) in 2961ms
Fetched 0 record(s) in 2882ms
Fetched 0 record(s) in 2870ms
Fetched 0 record(s) in 2969ms
Fetched 0 record(s) in 2863ms
So using that index, the response times of the query are not great, with most executions taking close to 3 seconds.
By adding the optimal equality -> sort -> range index:
> db.test.createIndex({tag:1, letterId:1, emailId:-1})
> db.test.find({"tag" : "xyz", "letterId" : "abc"}).sort({emailId: -1}).limit(1)
Fetched 0 record(s) in 2ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 1ms
Fetched 0 record(s) in 3ms
In contrast, using the optimal index, performance was markedly improved: no query took more than 3 ms, and the great majority returned in 1 ms.
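Since the question uses Mongoose, the same index can be declared next to the schema instead of in the shell. This is a sketch assuming the sendResultSchema from the question; by default Mongoose builds declared indexes when the model is compiled, unless autoIndex is disabled:

```javascript
// Replace the original six-field index declaration with the
// equality -> sort -> range compound index:
sendResultSchema.index({ tag: 1, letterId: 1, emailId: -1 })
```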
If you don't need all the fields in the document, you can make the query faster by returning only the necessary fields, like:
db.sendresults.find({"tag" : "tagValue", "letterId" : "5ad630b5949bb02ea07d15d1"}, {"_id": 0, "tag": 1, "email": 1}).sort({emailId: -1}).limit(1)
Setting a field to 1 in the projection means that field is returned. I added "_id": 0 because I didn't want the document's id; if you don't add "_id": 0 to the query, the _id field is returned automatically.
When working with documents that have many or deeply nested fields, this saved me a lot of time.
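The projection behaviour described above can be illustrated with a toy helper. Note that applyProjection is hypothetical and only mimics what the server does; real projection happens server-side before documents are sent to the client:

```javascript
// Toy illustration of MongoDB inclusion-projection semantics, not a driver API:
// fields set to 1 are included, and _id is included by default
// unless explicitly excluded with _id: 0.
function applyProjection(doc, projection) {
  const out = {};
  if (projection._id !== 0 && '_id' in doc) out._id = doc._id;
  for (const [field, flag] of Object.entries(projection)) {
    if (field !== '_id' && flag === 1 && field in doc) out[field] = doc[field];
  }
  return out;
}

const doc = { _id: 'abc', tag: 'tagValue', email: 'a@b.c', resultMsg: 'ok' };
console.log(applyProjection(doc, { _id: 0, tag: 1, email: 1 }));
// { tag: 'tagValue', email: 'a@b.c' }
console.log(applyProjection(doc, { tag: 1 }));
// { _id: 'abc', tag: 'tagValue' }  (_id included by default)
```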