First of all, I've gone through several similar questions before posting this one, and none seemed to match my needs, or at least I couldn't adapt them to my case.
I am working with aggregation in MongoDB (with Node.js and Mongoose) to perform pagination and also return some summary data, like a total count and an average.
This is my pipeline so far:
[
  {
    $match: {
      // Some filtering criteria here
    }
  },
  {
    $facet: {
      metadata: [
        { $count: 'total' }
      ],
      avg: [
        {
          $group: {
            _id: null,
            avg_price: { $avg: "$price" }
          }
        }
      ],
      data: [
        { $sort: { createdDate: -1 } },
        { $skip: skip || 0 },
        { $limit: limit }
      ]
    }
  }
]
This gives me output with the following structure:
[
{
"metadata": [
{
"total": 14
}
],
"avg": [
{
"_id": null,
"avg_price": 936711.3571428572
}
],
"data": [
// the returned data according to $match, $sort, $skip and $limit
]
}
]
I have to send that data to the front end, but that structure is not suitable for my needs. I am using GraphQL, and I would prefer to send something like the following (without that array-object-array kind of nesting):
{
total: 14,
avg_price: 936711.3571428572,
data: [
// the returned data according to $match, $sort, $skip and $limit
]
}
I could indeed write some JavaScript logic to extract that data from the aggregation result and build the expected output, but it would require dirty code like:
avg_price: aggr_result[0].avg[0].avg_price
And I want to avoid that.
I was wondering what the MongoDB way would be to do this kind of formatting in the pipeline.
Thanks for your time.
Just use one $project stage at the end of the pipeline:
[
  { "$match": { ... } },
  { "$facet": {
    "metadata": [
      { "$count": "total" }
    ],
    "avg": [
      { "$group": {
        "_id": null,
        "avg_price": { "$avg": "$price" }
      }}
    ],
    "data": [
      { "$sort": { "createdDate": -1 } },
      { "$skip": skip || 0 },
      { "$limit": limit }
    ]
  }},
  { "$project": {
    "total": { "$arrayElemAt": ["$metadata.total", 0] },
    "avg_price": { "$arrayElemAt": ["$avg.avg_price", 0] },
    "data": 1
  }}
]
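As a sketch, the whole pipeline can be wrapped in a reusable helper. The names buildListingPipeline, Listing, and the match argument below are illustrative, not from the question:

```javascript
// Builds the $match → $facet → $project pipeline described above.
// $arrayElemAt pulls the single element out of each facet array,
// which is what flattens the array-object-array nesting.
function buildListingPipeline(match, skip = 0, limit = 10) {
  return [
    { $match: match },
    {
      $facet: {
        metadata: [{ $count: "total" }],
        avg: [{ $group: { _id: null, avg_price: { $avg: "$price" } } }],
        data: [
          { $sort: { createdDate: -1 } },
          { $skip: skip },
          { $limit: limit },
        ],
      },
    },
    {
      $project: {
        total: { $arrayElemAt: ["$metadata.total", 0] },
        avg_price: { $arrayElemAt: ["$avg.avg_price", 0] },
        data: 1,
      },
    },
  ];
}

// $facet always emits exactly one document, so destructuring the first
// element of the aggregation result gives the flat object directly:
// const [result] = await Listing.aggregate(buildListingPipeline({ status: "active" }, 0, 20));
```

One caveat: if $match matches no documents, the metadata and avg facets are empty arrays, so $arrayElemAt produces a missing field rather than 0 or null. If the front end needs defaults, wrap the expression in $ifNull, e.g. { $ifNull: [{ $arrayElemAt: ["$metadata.total", 0] }, 0] }.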