I would like to batch upload a JSON file to DynamoDB. At the moment I can manually put items in a Python script (as below) and upload them to a table. How can I amend the script to read an external JSON file (containing 200 items) and batch-write all 200 items to the table?
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

with table.batch_writer() as batch:
    batch.put_item(
        Item={
            'ID': '2',
            'DateTime': '21/12/2017 13:16',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
    batch.put_item(
        Item={
            'ID': '3',
            'DateTime': '21/12/2017 13:40',
            'SourceDevice': '10',
            'DestinationDevice': '20',
            'DataType': 'full',
            'Activity': 'unusual'
        }
    )
The JSON file contents are as below:
[{
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "10",
    "DestinationDevice": "20",
    "DataType": "part",
    "Activity": "normal"
}, {
    "ID": "1",
    "DateTime": "21/12/2017 13:16",
    "SourceDevice": "40",
    "DestinationDevice": "25",
    "DataType": "full",
    "Activity": "unusual"
}]
You would simply break that down into two tasks: read the JSON file into a Python list, then loop over that list, calling batch.put_item for each element. There are lots of results when you search for the first task, and the second task is literally just writing a loop.
A full solution would look something like this:
import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('exampletable')

# Read the JSON file into a Python list
with open('items.json') as json_data:
    items = json.load(json_data)

with table.batch_writer() as batch:
    # Loop through the JSON objects and write each one
    for item in items:
        batch.put_item(Item=item)
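One caveat worth knowing: the boto3 resource layer rejects Python floats, so if your JSON file ever contains numeric values (yours above are all strings, so this does not affect you yet), load them as Decimal. A minimal sketch of the parsing step, using a hypothetical "Reading" attribute to illustrate:

```python
import json
from decimal import Decimal

# Hypothetical item with a floating-point number; DynamoDB via boto3
# requires Decimal for such values, not float.
raw = '[{"ID": "4", "Reading": 3.7}]'

# parse_float=Decimal makes json convert every float as it is parsed,
# so the resulting items are safe to pass straight to batch.put_item.
items = json.loads(raw, parse_float=Decimal)
```

The same keyword works with `json.load(json_data, parse_float=Decimal)` when reading from the file.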