I have multiple tables in Amazon DynamoDB. JSON data is currently uploaded into the tables using the batch-write-item
command that is available as part of the AWS CLI - this works well.
However, I would like to use just Python + Boto3, but I have not been able to execute the Boto3 BatchWriteItem
request with an external data file as input. I envision a Boto3 script would look something like the example below, but I have not been able to find documentation/examples for it.
Example (Pseudo Code)
table = dynamodb.Table('my_table')
table.BatchWriteItem(RequestItems=file://MyData.json)
Reference: http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchWriteItem.html
Pointers appreciated.
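For context, the CLI call I use today looks roughly like this (the file name is just an example):

aws dynamodb batch-write-item --request-items file://MyData.json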
The best place to look would be Boto3's readthedocs here: https://boto3.readthedocs.org/en/latest/reference/services/dynamodb.html#DynamoDB.Client.batch_write_item
As long as your JSON is formatted correctly for the request (as in the example in the docs), you could use:

import json
import boto3

with open('MyData.json') as f:
    request_items = json.load(f)

client = boto3.client('dynamodb')
response = client.batch_write_item(RequestItems=request_items)
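For reference, the RequestItems payload in MyData.json has to follow the BatchWriteItem request format: a map keyed by table name, with each item written in DynamoDB's typed attribute notation. A minimal sketch, assuming a table named my_table with hypothetical attributes id (string) and value (number):

{
    "my_table": [
        {
            "PutRequest": {
                "Item": {
                    "id": {"S": "item-1"},
                    "value": {"N": "42"}
                }
            }
        }
    ]
}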
I loaded the JSON this way:

import boto3
import json

dynamodbclient = boto3.resource('dynamodb')
sample_table = dynamodbclient.Table('ec2metadata')

with open('/samplepath/spotec2interruptionevent.json', 'r') as myfile:
    data = myfile.read()

# parse file
obj = json.loads(data)

# instance_id and cluster_id are the key attributes of the DynamoDB table;
# here they are assumed to be present in the parsed event
instanceId = obj['instance_id']
clusterId = obj['cluster_id']

response = sample_table.put_item(
    Item={
        'instance_id': instanceId,
        'cluster_id': clusterId,
        'event': obj
    }
)
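If the goal is to load many items from a file rather than a single one, the resource API's batch_writer handles the batching into BatchWriteItem calls for you. A minimal sketch, assuming a hypothetical items.json that holds a JSON array of plain (untyped) item dictionaries matching the table's schema:

import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('ec2metadata')

# assumed file: a JSON array of dicts, e.g. [{"instance_id": "...", "cluster_id": "...", ...}, ...]
with open('items.json') as f:
    items = json.load(f)

# batch_writer buffers put_item calls into BatchWriteItem requests
# and automatically resends any unprocessed items
with table.batch_writer() as batch:
    for item in items:
        batch.put_item(Item=item)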