I have a table counters with two fields: date and value.
I have a big list of objects which need to be inserted into the table counters.
But using serializer.save() for every row in the list produces a lot of inserts, and if I have a lot of data it takes quite some time until everything has finished inserting.
To solve this I looked into Django's documentation and found that there is a function named bulk_create which can insert a list of objects into the table in just one query.
Now, here's my code:
models.py:
class CounterFileData(models.Model):
    date = models.DateTimeField()
    value = models.FloatField()
serializers.py:
class CounterFileDataSerializer(serializers.ModelSerializer):
    class Meta:
        model = CounterFileData
        fields = ['date', 'value']
and the code where I use bulk_create:
objs = (CounterFileData(date=row.date, value=row.value) for row in parsed_data)
batch = list(parsed_data)
CounterFileData.objects.bulk_create(batch)
Each row has the following schema. For example:
{
    "date": "2018-12-31T22:00:00",
    "value": 9.23129792740622e-05
}
When I try to do CounterFileData.objects.bulk_create(batch), I get the following error:
AttributeError: 'dict' object has no attribute 'pk'
Can somebody tell me why it says no attribute 'pk'? I have been struggling with this for some good hours and I still can't find a fix.
Thanks in advance.
You obtain the value for a key in a dictionary by subscripting, like:
objs = [CounterFileData(date=row['date'], value=row['value']) for row in parsed_data]
Furthermore, you passed parsed_data to the list(..) constructor, whereas it should have been objs. By using a list comprehension, however, we can omit that step altogether:
batch = [CounterFileData(date=row['date'], value=row['value']) for row in parsed_data]
CounterFileData.objects.bulk_create(batch)
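For what it's worth, the AttributeError itself can be reproduced without Django at all: bulk_create reads the pk attribute of each object it is given, and a plain dict only exposes its keys through subscripting, not as attributes. A minimal sketch (plain Python, no Django required, using the sample row from the question):

```python
# A dict exposes its keys via subscripting, not as attributes; bulk_create
# accesses obj.pk on every element, so passing raw dicts raises AttributeError.
row = {"date": "2018-12-31T22:00:00", "value": 9.23129792740622e-05}

print(row["date"])  # subscripting works: 2018-12-31T22:00:00

try:
    row.pk  # attribute access fails, just like it does inside bulk_create
except AttributeError as exc:
    print(exc)  # 'dict' object has no attribute 'pk'
```

This is why the fix is to construct actual CounterFileData instances from each dict before handing the list to bulk_create.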