
How to store an ordered set of documents in MongoDB without using a capped collection

What's a good way to store a set of documents in MongoDB where order is important? I need to easily insert documents at an arbitrary position and possibly reorder them later.

I could assign each item an increasing number and sort by that, or I could sort by _id, but I don't know how I could then insert another document in between other documents. Say I want to insert something between an element with a sequence of 5 and an element with a sequence of 6?

My first guess would be to increment the sequence of all of the following elements so that there would be space for the new element using a query something like db.items.update({"sequence":{$gte:6}}, {$inc:{"sequence":1}}). My limited understanding of Database Administration tells me that a query like that would be slow and generally a bad idea, but I'm happy to be corrected.

I guess I could set the new element's sequence to 5.5, but I think that would get messy rather quickly. (Again, correct me if I'm wrong.)

I could use a capped collection, which has a guaranteed order, but then I'd run into issues if I needed to grow the collection. (Yet again, I might be wrong about that one too.)

I could have each document contain a reference to the next document, but that would require a query for each item in the list. (You'd get an item, push it onto the results array, and get another item based on the next field of the current item.) Aside from the obvious performance issues, I would also not be able to pass a sorted mongo cursor to my {#each} spacebars block expression and let it live update as the database changed. (I'm using the Meteor full-stack javascript framework.)
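Roughly, the traversal I have in mind would look something like this (the field names are just made up for illustration):

var results = [];
var current = db.items.findOne({ "first" : true });       // hypothetical flag marking the head of the list
while (current != null) {
    results.push(current);
    current = db.items.findOne({ "_id" : current.next }); // hypothetical "next" reference
}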

I know that everything has its advantages and disadvantages, and I might just have to use one of the options listed above, but I'd like to know if there is a better way to do things.

BonsaiOak asked Oct 06 '14

1 Answer

Based on your requirements, one approach is to design your schema so that each document can hold more than one item and itself acts as a capped container.

{
  "_id":Number,
  "doc":Array
}

Each document in the collection acts as a capped container, and the items are stored as an array in the doc field. Since doc is an array, it maintains insertion order. You can limit the number of items per container to n, so the _id of each container document increases in steps of n, n being the number of items a container can hold.

This way you avoid adding extra fields to each document, extra indexes, and unnecessary sorts.
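For example, with n = 5 the collection might look something like this (the names are made up purely for illustration):

{ "_id" : 0, "doc" : [ {"name" : "a"}, {"name" : "b"}, {"name" : "c"}, {"name" : "d"}, {"name" : "e"} ] }
{ "_id" : 5, "doc" : [ {"name" : "f"}, {"name" : "g"} ] }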

Inserting the very first record

i.e when the collection is empty.

var record = {"name" : "first"};
db.col.insert({"_id":0,"doc":[record]});

Inserting subsequent records

  • Identify the last container document's _id, and the number of documents it holds.
  • If the number of documents it holds is less than n, then update the container document with the new document, else create a new container document.

Say each container document can hold at most 5 documents, and we want to insert a new document.

var record = {"name" : "newlyAdded"};

// using aggregation, get the _id of the last inserted container and the
// number of records it currently holds. The $sort guarantees that $last
// picks the doc array of the container with the highest _id.
db.col.aggregate( [ {
    $sort : {
        "_id" : 1
    }
}, {
    $group : {
        "_id" : null,
        "max" : {
            $max : "$_id"
        },
        "lastDocSize" : {
            $last : "$doc"
        }
    }
}, {
    $project : {
        "currentMaxId" : "$max",
        "capSize" : {
            $size : "$lastDocSize"
        },
        "_id" : 0
    }
// once obtained, check if you need to update the last container or 
// create a new container and insert the document in it.
} ]).forEach( function(check) {
    if (check.capSize < 5) {
        print("updating");
        // UPDATE
        db.col.update( {
            "_id" : check.currentMaxId
        }, {
            $push : {
                "doc" : record
            }
        });
    } else {
        print("inserting");
        //insert
        db.col.insert( {
            "_id" : check.currentMaxId + 5,
            "doc" : [ record ]
        });
    }
})

Note that the aggregation runs on the server side and is very efficient. Also note that in versions prior to 2.6 the aggregation returns a single document rather than a cursor, so you would need to modify the above code to read from that document instead of iterating a cursor.
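For instance, on a pre-2.6 shell the same check could be written roughly like this, reading the single result document instead of iterating (just a sketch; the field names match the pipeline above):

var res = db.col.aggregate( [ /* same $sort, $group and $project stages as above */ ] );
var check = res.result[0];   // pre-2.6: the pipeline output lives in the "result" array
if (check.capSize < 5) {
    db.col.update({ "_id" : check.currentMaxId }, { $push : { "doc" : record } });
} else {
    db.col.insert({ "_id" : check.currentMaxId + 5, "doc" : [ record ] });
}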

Inserting a new document in between documents

Now, if you would like to insert a new document between documents 1 and 2, we know that the document should fall inside the container with _id=0 and should be placed in the second position in the doc array of that container.

So we make use of the $each and $position operators to insert at a specific position.

var record = {"name" : "insertInMiddle"};

db.col.update(
{
    "_id" : 0
}, {
    $push : {
        "doc" : {
            $each : [record],
            $position : 1
        }
    }
}
);

Handling Overflow

Now we need to take care of a container overflowing. Say we insert a new document in between, in the container with _id=0. If that container already has 5 documents, we need to move its last document to the next container, and repeat until all containers hold no more than their capacity; if needed, we create a new container at the end to hold the overflowing document.

This complex operation should be done on the server side. To handle this, we can create a script such as the one below and register it with MongoDB.

db.system.js.save( {
    "_id" : "handleOverFlow",
    "value" : function handleOverFlow(id) {
        var currDocArr = db.col.findOne( {
            "_id" : id
        }).doc;
        print(currDocArr);
        var count = currDocArr.length;
        var nextColId = id + 5;
        // check if the container size has been exceeded
        if (count <= 5)
            return;
        else {
            // need to take the last doc and push it to the next capped
            // container's array
            print("updating container: " + id);
            var record = currDocArr.splice(currDocArr.length - 1, 1);
            // update the next container; upsert creates it if it does
            // not exist yet
            db.col.update( {
                "_id" : nextColId
            }, {
                $push : {
                    "doc" : {
                        $each : record,
                        $position : 0
                    }
                }
            }, {
                upsert : true
            });
            // write the trimmed array back to the original container
            db.col.update( {
                "_id" : id
            }, {
                $set : {
                    "doc" : currDocArr
                }
            });
            // check overflow for the subsequent containers, recursively.
            handleOverFlow(nextColId);
        }
    }
});

So after every in-between insertion, we can invoke this function by passing the container id: handleOverFlow(containerId).
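For example, from the legacy mongo shell you could load the stored functions and call it like this (container id 0 is just an example):

db.loadServerScripts();   // makes functions saved in db.system.js callable from the shell
handleOverFlow(0);

// or, on versions that still support it:
// db.eval("handleOverFlow(0)");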

Fetching all the records in order

Sort the containers by _id, then use the $unwind operator in the aggregate pipeline to flatten the doc arrays back into individual documents in order.

db.col.aggregate([{$sort:{"_id":1}},{$unwind:"$doc"},{$project:{"_id":0,"doc":1}}]);

Re-Ordering Documents

You can store each item inside a capped container with its own "_id" field:

.."doc":[{"_id":0,"name":"xyz",...}..]..

Get hold of the doc array of the capped container whose items you want to reorder.

var docArray = db.col.findOne({"_id":0}).doc;

Update their inner _ids so that, after sorting, the items end up in the order you want.
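For example, to swap the first two items you could exchange their inner _ids (purely illustrative):

var tmp = docArray[0]._id;
docArray[0]._id = docArray[1]._id;
docArray[1]._id = tmp;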

Sort the array based on their _ids.

docArray.sort( function(a, b) {
    return a._id - b._id;
});

Update the capped container with the new doc array.
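A sketch of that write-back, for the container with _id:0 used above:

db.col.update({"_id":0}, {$set:{"doc":docArray}});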

But then again, everything boils down to which approach is feasible and suits your requirement best.

Coming to your questions:

What's a good way to store a set of documents in MongoDB where order is important? I need to easily insert documents at an arbitrary position and possibly reorder them later.

Documents as Arrays.

Say I want to insert something between an element with a sequence of 5 and an element with a sequence of 6?

Use the $each and $position operators with db.collection.update(), as shown in my answer.

My limited understanding of Database Administration tells me that a query like that would be slow and generally a bad idea, but I'm happy to be corrected.

Yes. It would impact performance unless the collection holds very little data.

I could use a capped collection, which has a guaranteed order, but then I'd run into issues if I needed to grow the collection. (Yet again, I might be wrong about that one too.)

Yes. Capped collections overwrite their oldest documents once they reach their size limit, so you may lose data.

BatScream answered Nov 15 '22