I'm trying to store the following link:
URL = {
hostname: 'i.imgur.com',
webid: 'qkELz.jpg'
}
I want a unique and sparse compound index on these two fields because:
- hostname and webid should be unique together.
- webid will always be queried with hostname.
- webid need not be globally unique.
- A URL need not have a webid.
However, when I create this index, I get the following error:
MongoError: E11000 duplicate key error index: db.urls.$hostname_1_webid_1 dup key: { : "imgur.com", : null }
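For reference, the index is created along these lines (a sketch; the collection name db.urls is taken from the error message):

// Sketch: unique + sparse compound index on hostname and webid.
db.urls.createIndex(
  { hostname: 1, webid: 1 },
  { unique: true, sparse: true }
)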
I guess that in the case of sparse compound indexes, missing fields are still counted (indexed as null), whereas in regular single-field sparse indexes they are not.
Any way out of this problem? For now I'm just going to index hostname and webid separately.
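Roughly like this, as a sketch (assuming neither single-field index needs to be unique, since hostname alone repeats across documents):

// Sketch: interim workaround with two single-field indexes.
// Neither index can enforce the compound uniqueness constraint on its own.
db.urls.createIndex({ hostname: 1 })
db.urls.createIndex({ webid: 1 }, { sparse: true })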
Keep in mind that MongoDB can only use one index per query (it won't join indexes together to make a query on two fields that have separate indexes faster).
That said, if you want to try to check for uniqueness, you could do a query from the app before inserting (which only partially solves the problem, because there's a gap between when you query and when you insert).
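A minimal sketch of that check-then-insert from the shell (assuming the db.urls collection from the question); the window between the find and the insert is exactly the gap mentioned above:

// Sketch: application-side uniqueness check before insert.
// Another client can still insert between the findOne and the insert,
// so this only reduces duplicates rather than eliminating them.
var doc = { hostname: 'i.imgur.com', webid: 'qkELz.jpg' };
if (db.urls.findOne({ hostname: doc.hostname, webid: doc.webid }) === null) {
  db.urls.insert(doc);
}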
You might want to vote on this JIRA issue for filtered indexes, which will probably help your use case: https://jira.mongodb.org/browse/SERVER-785
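For what it's worth, that ticket was eventually resolved: MongoDB 3.2 added partial indexes via partialFilterExpression. A sketch of how that could express this constraint, assuming uniqueness should only be enforced when webid exists:

// Sketch: unique partial index (MongoDB 3.2+) limited to documents
// that actually have a webid, so webid-less URLs don't collide on null.
db.urls.createIndex(
  { hostname: 1, webid: 1 },
  { unique: true, partialFilterExpression: { webid: { $exists: true } } }
)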