Does mongo provide functionality for deconstructing document arrays for large datasets?

Tags:

mongodb

Similar to map/reduce, but in reverse: does MongoDB have a way of reformatting data? I have a collection in the following format.

{
  "token-id": "LKJ8_lkjsd",
  "data": [
             {"views": 100, "Date": "2015-01-01"},
             {"views": 200, "Date": "2015-01-02"},
             {"views": 300, "Date": "2015-01-03"},
             {"views": 300, "Date": "2015-01-03"}
          ]
}

I would like to process the entire collection into a new format, where every time-series data point is its own document mapped to the token ID, hopefully using some built-in MongoDB functionality similar to map/reduce. If there isn't any, I'd appreciate a strategy for doing this.

{ "token-id": "LKJ8_lkjsd", "views": 100, "Date": "2015-01-01" }
{ "token-id": "LKJ8_lkjsd", "views": 200, "Date": "2015-01-02" }
{ "token-id": "LKJ8_lkjsd", "views": 300, "Date": "2015-01-03" }
Dap asked Jan 08 '23 12:01

1 Answer

The aggregate command can return its results as a cursor or write them to a collection, neither of which is subject to the 16 MB BSON document size limit. db.collection.aggregate() returns a cursor, so it can handle result sets of any size.

// Unwind the data array so each element becomes its own document,
// then project the flattened shape requested in the question.
var result = db.test.aggregate([
    { $unwind: "$data" },
    { $project: { _id: 0, "token-id": 1, "views": "$data.views", "Date": "$data.Date" } }
]);

while (result.hasNext()) {                 // iterate the cursor
    db.collection.insert(result.next());   // insert into the target collection
}
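If you don't need to iterate the cursor in the shell, the same pipeline can have the server write its output straight to a collection with a $out stage. The sketch below assumes an output collection called flattened_views, which is only an example name.

db.test.aggregate([
    { $unwind: "$data" },
    { $project: { _id: 0, "token-id": 1, "views": "$data.views", "Date": "$data.Date" } },
    // $out must be the last stage; it writes the pipeline results to the named collection.
    { $out: "flattened_views" }
]);

Note that $out replaces the contents of the target collection on each run, so point it at a collection you are happy to overwrite.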
Rohit Jain answered May 06 '23 08:05