Mongo aggregation and MongoError: exception: BufBuilder attempted to grow() to 134217728 bytes, past the 64MB limit

Tags:

mongodb

I'm trying to aggregate data from my Mongo collection to produce some statistics for FreeCodeCamp, building a large JSON file of the data to use later.

I'm running into the error in the title. There doesn't seem to be a lot of information about this, and the other posts here on SO don't have an answer. I'm using the latest version of MongoDB and drivers.

I suspect there is probably a better way to run this aggregation, but it runs fine on a subset of my collection. My full collection is ~7GB.

I'm running the script via node aggScript.js > ~/Desktop/output.json. Here is the relevant code:

var MongoClient = require('mongodb').MongoClient;
var _ = require('lodash');
// assumed: a local module exporting the connection string as `db`
var secrets = require('./secrets');

MongoClient.connect(secrets.db, function(err, database) {
  if (err) {
    throw err;
  }

  database.collection('user').aggregate([
    {
      $match: {
        'completedChallenges': {
          $exists: true
        }
      }
    },
    {
      $match: {
        'completedChallenges': {
          $ne: ''
        }
      }
    },
    {
      $match: {
        'completedChallenges': {
          $ne: null
        }
      }
    },
    {
      $group: {
        '_id': 1, 'completedChallenges': {
          $addToSet: '$completedChallenges'
        }
      }
    }
  ], {
    allowDiskUse: true
  }, function(err, results) {
    if (err) { throw err; }
    var aggData = results.map(function(camper) {
      return _.flatten(camper.completedChallenges.map(function(challenges) {
        return challenges.map(function(challenge) {
          return {
            name: challenge.name,
            completedDate: challenge.completedDate,
            solution: challenge.solution
          };
        });
      }), true);
    });
    console.log(JSON.stringify(aggData));
    process.exit(0);
  });
});
tkbyte asked Dec 25 '15
2 Answers

By default, aggregate returns a single document containing all the result data, which limits how much data can be returned to the maximum BSON document size.

Assuming that you do actually want all this data, there are two options:

  • Request a cursor instead of the full result: in the Node.js driver this means passing the cursor option to aggregate (for example { cursor: { batchSize: 100 } }), which returns an AggregationCursor you can iterate over rather than a single document.
  • Add a $out stage as the last stage of your pipeline. This tells MongoDB to write your aggregation output to the specified collection. The aggregate command itself then returns no data, and you query that collection as you would any other.
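Both options can be sketched as follows. This is a minimal sketch assuming the Node.js driver's 2.x API; the output collection name 'user_challenges' is hypothetical, and the pipeline is a condensed version of the one in the question.

```javascript
// Condensed version of the question's pipeline.
var pipeline = [
  { $match: { completedChallenges: { $exists: true, $nin: [null, ''] } } },
  { $group: { _id: 1, completedChallenges: { $addToSet: '$completedChallenges' } } }
];

// Option 1: ask the driver for a cursor instead of a single result array.
function streamResults(collection, onDoc, onDone) {
  var cursor = collection.aggregate(pipeline, {
    cursor: { batchSize: 100 }, // makes aggregate return an AggregationCursor
    allowDiskUse: true
  });
  cursor.each(function (err, doc) {
    if (err) { throw err; }
    if (doc === null) { return onDone(); } // null signals the cursor is exhausted
    onDoc(doc);
  });
}

// Option 2: append a $out stage so MongoDB writes the results to a
// collection ('user_challenges' is an assumed name), then query it later.
var pipelineWithOut = pipeline.concat([{ $out: 'user_challenges' }]);
```

With option 1, each grouped document arrives individually, so nothing has to fit in one 16MB result document; with option 2, the server never sends the result over the wire at all.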
Frederick Cheung answered Sep 18 '22


It just means that the result object you are building became too large. This kind of issue is not specific to your version; the check added in 2.5.0 only turns what used to be a crash into this error.

You need to filter ($match) properly so that only the data you need ends up in the result, and group on the proper fields. The result is built in a buffer capped at 64MB, so reduce your data: $project only the fields you require in the result, not whole documents.
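For example, a $project stage along these lines (a hypothetical sketch, assuming the three sub-fields the question's post-processing reads are all that is needed) keeps each grouped document as small as possible:

```javascript
// Keep only the sub-fields the post-processing step actually uses.
var projectStage = {
  $project: {
    _id: 0,
    'completedChallenges.name': 1,
    'completedChallenges.completedDate': 1,
    'completedChallenges.solution': 1
  }
};
```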

You can combine your three $match stages into a single stage to reduce the number of pipeline passes:

{
  $match: {
    'completedChallenges': {
      // duplicate $ne keys in one object literal silently collapse
      // in JavaScript, so use $nin for multiple exclusions
      $exists: true,
      $nin: [null, '']
    }
  }
}
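One caveat when combining the conditions: in a JavaScript object literal a repeated key silently overwrites the earlier one, so writing $ne twice keeps only the last value. A minimal demonstration, plus the $nin form that expresses both exclusions under one key:

```javascript
// The second $ne overwrites the first; the null check is silently lost.
var query = { $exists: true, $ne: null, $ne: '' };
var keys = Object.keys(query); // only two keys survive

// $nin expresses "not any of these values" under a single key instead.
var fixed = { $exists: true, $nin: [null, ''] };
```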
Somnath Muluk answered Sep 16 '22