I am using itertools to group by a dictionary key using the below:
host_data = []
for k, v in itertools.groupby(temp_data, key=lambda x: x['device_id']):
    d = {}
    for dct in v:
        d.update(dct)
    host_data.append(d) 
However, I would like to group by both 'device_id' and 'port_id' if possible. How can I add an additional key to the grouping?
Just use a tuple as key:
itertools.groupby(temp_data, key=lambda x:(x['device_id'], x['port_id']))
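Keep in mind that itertools.groupby only groups consecutive items, so the data usually needs to be sorted by the same key first. A minimal sketch adapting your loop (assuming temp_data is the list of dicts from your question):

import itertools

def group_key(x):
    return (x['device_id'], x['port_id'])

host_data = []
# groupby only merges consecutive items, so sort by the same key first
for (device_id, port_id), v in itertools.groupby(sorted(temp_data, key=group_key), key=group_key):
    d = {}
    for dct in v:
        d.update(dct)
    host_data.append(d)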
You can use itemgetter to make a 2-tuple grouping key:
from operator import itemgetter
...
itertools.groupby(temp_data, key=itemgetter('device_id', 'port_id'))
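When called with several keys, itemgetter returns a tuple of the corresponding values, so it behaves exactly like the lambda in the accepted answer. A quick check (the sample dict here is just illustrative):

from operator import itemgetter

get_key = itemgetter('device_id', 'port_id')
# with multiple keys, itemgetter returns a tuple of the looked-up values
print(get_key({'device_id': 1, 'port_id': 2, 'some_value': 3}))  # (1, 2)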
Just to expand a little bit on the accepted answer: once you use a tuple as the grouping key, you can unpack its values while iterating over the grouped data, in case you need them, e.g.:
from itertools import groupby
data = [
    {"device_id": 1, "port_id": 1, "some_value": 1},
    {"device_id": 1, "port_id": 2, "some_value": 2},
    {"device_id": 1, "port_id": 2, "some_value": 3},
    {"device_id": 2, "port_id": 1, "some_value": 1},
    {"device_id": 2, "port_id": 1, "some_value": 2},
    {"device_id": 3, "port_id": 1, "some_value": 1},
]
grouped_data = groupby(data, key=lambda x: (x["device_id"], x["port_id"]))
for (device_id, port_id), group in grouped_data:
    print(f"device_id, port_id: {device_id}, {port_id}")
    print(f"  -> group: {list(group)}")
which would print:
device_id, port_id: 1, 1
  -> group: [{'device_id': 1, 'port_id': 1, 'some_value': 1}]
device_id, port_id: 1, 2
  -> group: [{'device_id': 1, 'port_id': 2, 'some_value': 2}, {'device_id': 1, 'port_id': 2, 'some_value': 3}]
device_id, port_id: 2, 1
  -> group: [{'device_id': 2, 'port_id': 1, 'some_value': 1}, {'device_id': 2, 'port_id': 1, 'some_value': 2}]
device_id, port_id: 3, 1
  -> group: [{'device_id': 3, 'port_id': 1, 'some_value': 1}]
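If you need random access to the groups afterwards, you can also materialize them into a dict keyed by the tuple. A sketch, not part of the original answer, reusing the data list defined above (which is already sorted by the grouping key):

from itertools import groupby

groups = {
    key: list(group)
    for key, group in groupby(data, key=lambda x: (x["device_id"], x["port_id"]))
}

print(groups[(1, 2)])
# [{'device_id': 1, 'port_id': 2, 'some_value': 2}, {'device_id': 1, 'port_id': 2, 'some_value': 3}]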