I'm evaluating how well the MongoDB aggregation framework suits our needs; we are currently running on top of SQL Server. I'm having a hard time performing a specific query:
Say I have the following pseudo-records (modeled as rows in a SQL table and as full documents in a MongoDB collection):
{
name: 'A',
timespent: 100,
},
{
name: 'B',
timespent: 200,
},
{
name: 'C',
timespent: 300,
},
{
name: 'D',
timespent: 400,
},
{
name: 'E',
timespent: 500,
}
I want to group the timespent field into ranges and count the occurrences, so that I get e.g. the following pseudo-result:
results{
0-250: 2,
250-450: 2,
450-650: 1
}
Note that these ranges (250, 450 and 650) are dynamic and will likely be altered over time by the user. In SQL we extracted the results with something like this:
select range, COUNT(*) as total from (
select case when Timespent <= 250 then '0-250'
when Timespent <= 450 then '250-450'
else '450-650' end as range
from TestTable) as r
group by r.range
Again, note that this SQL is constructed dynamically by our app to fit the specific ranges available at any one time.
I'm struggling to find the appropriate constructs in the MongoDB aggregation framework to perform such queries. I can query for the results of a single range by inserting a $match into the pipeline, but I cannot grok how to extract all the ranges and their counts in a single pipeline query.
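For example, counting the documents in one range works with something like this (a sketch; the collection name xx is just a placeholder):
// count the documents in a single range, e.g. [250, 450)
db.xx.aggregate([
    { $match: { timespent: { $gte: 250, $lt: 450 } } },
    { $group: { _id: null, count: { $sum: 1 } } }
]);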
In MongoDB, aggregation operations process documents and return computed results: values are collected from multiple documents and grouped together, and operations such as sum, average, minimum, and maximum are then applied to each group to produce a computed result.
For aggregation in MongoDB, you use the aggregate() method.
For a plain lookup, the aggregate command can be slower than the equivalent find command; once the results are actually materialized (e.g. via ToList() in the C# driver), the aggregation can come out faster.
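As a minimal sketch of the method's shape, here is a pipeline that groups all documents of the sample collection below into a single group and computes a sum and an average:
// one group over the whole collection: total and average of timespent
db.xx.aggregate([
    { $group: { _id: null, total: { $sum: "$timespent" }, average: { $avg: "$timespent" } } }
]);
// -> { "_id" : null, "total" : 1500, "average" : 300 }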
What corresponds to the SQL CASE expression in the aggregation framework is the $cond operator (see the manual). $cond expressions can be nested to simulate "when-then" and "else", but I have chosen another approach because it is easier to read (and to generate, see below): I'll use the $concat operator to build the range string, which then serves as the grouping key.
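For comparison, the nested-$cond version of the "range" expression would look something like this sketch, with each "else" branch holding the next $cond:
// nested $cond mirroring the SQL CASE expression
{ $cond: [
    { $lte: [ "$timespent", 250 ] },
    "0-250",
    { $cond: [
        { $lte: [ "$timespent", 450 ] },
        "250-450",
        "450-650"
    ] }
] }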
So for the given collection:
db.xx.find()
{ "_id" : ObjectId("514919fb23700b41723f94dc"), "name" : "A", "timespent" : 100 }
{ "_id" : ObjectId("514919fb23700b41723f94dd"), "name" : "B", "timespent" : 200 }
{ "_id" : ObjectId("514919fb23700b41723f94de"), "name" : "C", "timespent" : 300 }
{ "_id" : ObjectId("514919fb23700b41723f94df"), "name" : "D", "timespent" : 400 }
{ "_id" : ObjectId("514919fb23700b41723f94e0"), "name" : "E", "timespent" : 500 }
the aggregate (hardcoded) looks like this; each bucket is a half-open interval [lo, hi), so the buckets are gap-free:
db.xx.aggregate([
    { $project: {
        "_id": 0,
        "range": {
            $concat: [{
                // bucket [0, 250)
                $cond: [ { $and: [
                    { $gte: ["$timespent", 0] },
                    { $lt: ["$timespent", 250] }
                ] }, "range 0-250", "" ]
            }, {
                // bucket [250, 450)
                $cond: [ { $and: [
                    { $gte: ["$timespent", 250] },
                    { $lt: ["$timespent", 450] }
                ] }, "range 250-450", "" ]
            }, {
                // bucket [450, 650)
                $cond: [ { $and: [
                    { $gte: ["$timespent", 450] },
                    { $lt: ["$timespent", 650] }
                ] }, "range 450-650", "" ]
            }]
        }
    }},
    { $group: { _id: "$range", count: { $sum: 1 } } },
    { $sort: { "_id": 1 } }
]);
and the result is:
{
"result" : [
{
"_id" : "range 0-250",
"count" : 2
},
{
"_id" : "range 251-450",
"count" : 2
},
{
"_id" : "range 450-650",
"count" : 1
}
],
"ok" : 1
}
In order to generate the aggregate command, you have to build the "range" projection as a JSON object (or you could generate a string and then use JSON.parse(string)).
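A sketch of that string-based variant, for illustration (the object-based generator below is cleaner):
// build the $concat expression as a JSON string, then parse it
var ranges = [ 0, 250, 450, 650 ];
var parts = [];
for (var i = 1; i < ranges.length; i++) {
    parts.push('{ "$cond": [ { "$and": [ { "$gte": [ "$timespent", ' + ranges[i-1] + ' ] }, ' +
               '{ "$lt": [ "$timespent", ' + ranges[i] + ' ] } ] }, ' +
               '"range ' + ranges[i-1] + '-' + ranges[i] + '", "" ] }');
}
var rangeProj = JSON.parse('{ "$concat": [ ' + parts.join(', ') + ' ] }');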
The object-based generator looks like this:
var ranges = [ 0, 250, 450, 650 ];
var rangeProj = {
    "$concat": []
};
// build one $cond per consecutive pair of boundaries
for (var i = 1; i < ranges.length; i++) {
    rangeProj.$concat.push({
        $cond: {
            if: {
                $and: [{
                    $gte: [ "$timespent", ranges[i-1] ]   // lower bound, inclusive
                }, {
                    $lt: [ "$timespent", ranges[i] ]      // upper bound, exclusive
                }]
            },
            then: "range " + ranges[i-1] + "-" + ranges[i],
            else: ""
        }
    });
}
db.xx.aggregate([{
$project: { "_id": 0, "range": rangeProj }
}, {
$group: { _id: "$range", count: { $sum: 1 } }
}, {
$sort: { "_id": 1 }
}]);
which will return the same result as above.
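As a side note: on MongoDB 3.4 and newer, the built-in $bucket stage covers this kind of range grouping directly, and the dynamic boundaries map straight onto its boundaries array:
// $bucket groups by half-open intervals [0,250), [250,450), [450,650)
db.xx.aggregate([
    { $bucket: {
        groupBy: "$timespent",
        boundaries: [ 0, 250, 450, 650 ],
        default: "other",                 // catch-all for values outside the boundaries
        output: { count: { $sum: 1 } }
    }}
]);
// -> { "_id" : 0, "count" : 2 }, { "_id" : 250, "count" : 2 }, { "_id" : 450, "count" : 1 }
Each _id is the inclusive lower boundary of its bucket.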