I have an existing Mongo database that has accumulated entries with fields like the following:
{
    name: "Fred Flintstone",
    age: 34,
    timeStamp: NumberLong(14283454353543)
}
{
    name: "Wilma Flintstone",
    age: 33,
    timeStamp: NumberLong(14283454359453)
}
And so on...
Question: I want to convert the timeStamp field of every entry in the database to its corresponding ISODate instead. How does one do this?
Desired Result :
{
    name: "Fred Flintstone",
    age: 34,
    timeStamp: ISODate("2015-07-20T14:50:32.389Z")
}
{
    name: "Wilma Flintstone",
    age: 33,
    timeStamp: ISODate("2015-07-20T14:50:32.389Z")
}
Things I've tried
>db.myCollection.find().forEach(function (document) {
document["timestamp"] = new Date(document["timestamp"])
//Not sure how to update this document from here
db.myCollection.update(document) //?
})
You can specify a particular date by passing an ISO-8601 date string with a year within the inclusive range 0 through 9999 to the new Date() constructor or the ISODate() function. For example, new Date("<YYYY-mm-dd>") returns the ISODate with the specified date.
BSON also has a special timestamp type for internal MongoDB use which is not associated with the regular Date type. That internal timestamp type is a 64-bit value where the most significant 32 bits are a time_t value (seconds since the Unix epoch).
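For reference, new Date() also accepts a count of milliseconds since the Unix epoch, which is what the conversions below assume the NumberLong values hold. A minimal illustration in the shell (the values here are examples, not taken from the question's data):

// From an ISO-8601 string
new Date("2015-07-20T14:50:32.389Z")   // ISODate("2015-07-20T14:50:32.389Z")
ISODate("2015-07-20T14:50:32.389Z")    // equivalent shell helper

// From milliseconds since the Unix epoch
new Date(1437403832389)                // ISODate("2015-07-20T14:50:32.389Z")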
Using updates with an aggregation pipeline (MongoDB 4.2 or newer), simply run the following update operation:
db.myCollection.updateMany(
   { },
   [
      { "$set": {
         "timeStamp": {
            "$toDate": "$timeStamp"
         }
      } }
   ]
)
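If only some documents still need converting, you could narrow this update with a filter on the BSON type, along the same lines as the $type filter used in the bulk examples further down (a sketch, not a separate method):

db.myCollection.updateMany(
   { "timeStamp": { "$not": { "$type": "date" } } },
   [
      { "$set": { "timeStamp": { "$toDate": "$timeStamp" } } }
   ]
)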
With your initial attempt, you were almost there; you just need to call the save() method on the modified document to update it, since the method uses either the insert or the update command. In the above instance, the document contains an _id field and thus the save() method is equivalent to an update() operation with the upsert option set to true and the query predicate on the _id field:
db.myCollection.find().snapshot().forEach(function (document) {
    document["timeStamp"] = new Date(document["timeStamp"]);
    db.myCollection.save(document);
})
The above is similar to explicitly calling the update() method as you had previously attempted:
db.myCollection.find().snapshot().forEach(function (document) {
    var date = new Date(document["timeStamp"]);
    var query = { "_id": document["_id"] },   /* query predicate */
        update = {                            /* update document */
            "$set": { "timeStamp": date }
        },
        options = { "upsert": true };
    db.myCollection.update(query, update, options);
})
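Note that cursor.snapshot() was removed in MongoDB 4.0 and db.collection.save() is not available in newer shells, so on a current deployment the same per-document loop would look something like this (a rough equivalent sketch, assuming the timeStamp field as in the question):

db.myCollection.find({ "timeStamp": { "$not": { "$type": "date" } } }).forEach(function (doc) {
    db.myCollection.updateOne(
        { "_id": doc._id },
        { "$set": { "timeStamp": new Date(doc.timeStamp) } }
    );
})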
For relatively large collections, updating documents one at a time like this will be slow; it's recommended to use MongoDB bulk updates instead:
MongoDB versions >= 2.6 and < 3.2:
var bulk = db.myCollection.initializeUnorderedBulkOp(),
    counter = 0;

// Only convert documents whose timeStamp is not already a BSON Date (type 9)
db.myCollection.find({ "timeStamp": { "$not": { "$type": 9 } } }).forEach(function (doc) {
    bulk.find({ "_id": doc._id }).updateOne({
        "$set": { "timeStamp": new Date(doc.timeStamp) }
    });
    counter++;

    if (counter % 1000 === 0) {
        // Execute per 1000 operations
        bulk.execute();
        // re-initialize every 1000 update statements
        bulk = db.myCollection.initializeUnorderedBulkOp();
    }
})

// Clean up remaining operations in the queue
if (counter % 1000 !== 0) bulk.execute();
MongoDB version 3.2 and newer:
var ops = [],
    cursor = db.myCollection.find({ "timeStamp": { "$not": { "$type": 9 } } });

cursor.forEach(function (doc) {
    ops.push({
        "updateOne": {
            "filter": { "_id": doc._id },
            "update": { "$set": { "timeStamp": new Date(doc.timeStamp) } }
        }
    });

    // Flush every 1000 queued operations
    if (ops.length === 1000) {
        db.myCollection.bulkWrite(ops);
        ops = [];
    }
});

if (ops.length > 0) db.myCollection.bulkWrite(ops);
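Whichever approach you use, you can verify the migration afterwards by checking that no documents remain with a non-Date timeStamp (an illustrative check, not part of the original answer):

// Should return 0 once all timeStamp values have been converted
db.myCollection.find({ "timeStamp": { "$not": { "$type": 9 } } }).count()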