I want to group records by tag_id and build a single string by combining the client_id values.
Here are examples of my documents:
{
"_id" : ObjectId("59e955e633d64c81875bfd2f"),
"tag_id" : 1,
"client_id" : "10001"
}
{
"_id" : ObjectId("59e955e633d64c81875bfd30"),
"tag_id" : 1,
"client_id" : "10002"
}
I'd like to have this output:
{
"_id" : 1
"client_id" : "10001,10002"
}
You can do this with the aggregation framework as a "two step" operation: first accumulate the items into an array via $push within a $group pipeline stage, and then use $concat with $reduce on the produced array in a final projection:
db.collection.aggregate([
  // first accumulate all client_id values per tag_id into an array
  { "$group": {
    "_id": "$tag_id",
    "client_id": { "$push": "$client_id" }
  }},
  // then fold the array into a single comma-delimited string
  { "$addFields": {
    "client_id": {
      "$reduce": {
        "input": "$client_id",
        "initialValue": "",
        "in": {
          "$cond": {
            "if": { "$eq": [ "$$value", "" ] },
            "then": "$$this",
            "else": { "$concat": [ "$$value", ",", "$$this" ] }
          }
        }
      }
    }
  }}
])
We also apply $cond
here to avoid concatenating an empty string with a comma in the results, so it looks more like a delimited list.
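Run against the two sample documents from the question, the stages should produce something like this, first the array form from $group and then the joined string from $addFields:

// after the $group stage
{ "_id" : 1, "client_id" : [ "10001", "10002" ] }

// after the $addFields stage
{ "_id" : 1, "client_id" : "10001,10002" }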
FYI, there is a JIRA issue, SERVER-29339, which asks for $reduce to be implemented as an accumulator expression, allowing its use directly in a $group pipeline stage. That is not likely to happen any time soon, but it would theoretically replace $push above and make the operation a single pipeline stage. The sample proposed syntax is on the JIRA issue.
If you don't have $reduce (it requires MongoDB 3.4), then just post-process the cursor:
db.collection.aggregate([
  { "$group": {
    "_id": "$tag_id",
    "client_id": { "$push": "$client_id" }
  }}
]).map( doc =>
  // join each accumulated array client-side instead
  Object.assign(
    doc,
    { "client_id": doc.client_id.join(",") }
  )
)
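The same post-processing works outside the shell too. As a rough sketch assuming the Node.js driver (the collection handle and variable names here are illustrative), inside an async function:

const docs = await db.collection("collection")
  .aggregate([
    { "$group": {
      "_id": "$tag_id",
      "client_id": { "$push": "$client_id" }
    }}
  ])
  .toArray();

// join each accumulated array client-side
const result = docs.map(doc =>
  Object.assign(doc, { "client_id": doc.client_id.join(",") })
);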
That leads to the other alternative of doing this with mapReduce, if you really must:
db.collection.mapReduce(
  function() {
    // emit each client_id under its tag_id key
    emit(this.tag_id, this.client_id);
  },
  function(key, values) {
    // values may already be delimited strings from a previous reduce pass,
    // so split and flatten before joining again
    return [].concat.apply([], values.map(v => v.split(","))).join(",");
  },
  { "out": { "inline": 1 } }
)
This of course outputs in the mapReduce-specific form of _id and value as the keys, but it is essentially the same output.
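So with the sample documents, the inline response should contain an entry shaped something like:

{ "_id" : 1, "value" : "10001,10002" }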
We use [].concat.apply([], values.map(...)) because the output of the "reducer" can itself be a "delimited string": mapReduce works incrementally with large result sets, so the output of one reduce pass can become the "input" of another pass. We need to expect that this can happen and handle it accordingly.
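To illustrate, here is a hedged sketch of a second reduce pass, where "10001,10002" stands in for a previous reducer's output and "10003" is a hypothetical raw value added just for the example:

var reduce = function(key, values) {
  return [].concat.apply([], values.map(v => v.split(","))).join(",");
};

// the previously joined string is split back out and flattened with the new value
reduce(1, [ "10001,10002", "10003" ]);   // => "10001,10002,10003"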