I'm working on a Python 3 script designed to get S3 storage utilization statistics from AWS CloudWatch using the Boto3 library.
I started with the AWS CLI and found I could get what I'm after with a command like this:
aws cloudwatch get-metric-statistics --metric-name BucketSizeBytes --namespace AWS/S3 --start-time 2017-03-06T00:00:00Z --end-time 2017-03-07T00:00:00Z --statistics Average --unit Bytes --region us-west-2 --dimensions Name=BucketName,Value=foo-bar Name=StorageType,Value=StandardStorage --period 86400 --output json
This returns the data I would expect. Now I'd like to do the same thing in Python 3 / Boto3. My code thus far is:
from datetime import datetime, timedelta

import boto3

seconds_in_one_day = 86400  # used for granularity

cloudwatch = boto3.client('cloudwatch')

response = cloudwatch.get_metric_statistics(
    Namespace='AWS/S3',
    Dimensions=[
        {
            'Name': 'BucketName',
            'Value': 'foo-bar'
        },
        {
            'Name': 'StorageType',
            'Value': 'StandardStorage'
        }
    ],
    MetricName='BucketSizeBytes',
    StartTime=datetime.now() - timedelta(days=7),
    EndTime=datetime.now(),
    Period=seconds_in_one_day,
    Statistics=[
        'Average'
    ],
    Unit='Bytes'
)

print(response)
When I run this, I get a valid response but no datapoints (Datapoints is an empty array). The two calls seem to be identical, except that the Python method doesn't seem to have a place for the region, which the command line requires.
One more thing I tried: my code computes the dates for the last week, whereas they are hard-coded on the command line. I did try hard-coding the dates just to see if I would get data back, and the result was the same.
So my questions are these:
Is the method I'm using in Boto3 / Python equivalent to the command line? Assuming it is, what could I be missing?
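One more variable worth eliminating: datetime.now() returns a naive local time, while CloudWatch timestamps are UTC. A minimal sketch (standard library only) of building timezone-aware UTC values for StartTime/EndTime:

```python
from datetime import datetime, timedelta, timezone

# CloudWatch works in UTC; timezone-aware datetimes make that explicit
# instead of relying on the local clock via datetime.now().
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(days=7)

print(start_time.isoformat())
print(end_time.isoformat())
```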
Here is one good example of getting data from CloudWatch in Python using Boto3. I had to spend a few hours to get it working, but it should be easy to refer to now.
from datetime import datetime, timedelta

import boto3

def get_req_count(region, lb_name):
    client = boto3.client('cloudwatch', region_name=region)

    # CloudWatch expects UTC timestamps
    str_today = datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%SZ')
    str_yesterday = (datetime.utcnow() - timedelta(days=1)).strftime('%Y-%m-%dT%H:%M:%SZ')

    count = 0
    response = client.get_metric_statistics(
        Namespace="AWS/ApplicationELB",
        MetricName="RequestCount",
        Dimensions=[
            {
                "Name": "LoadBalancer",
                "Value": lb_name
            },
        ],
        StartTime=str_yesterday,
        EndTime=str_today,
        Period=86460,
        Statistics=[
            "Sum",
        ]
    )
    # print(response)
    for r in response['Datapoints']:
        count = r['Sum']
    return count
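The Datapoints loop above can be checked offline against a hand-built response of the same shape (the payload below is made up; a real call returns richer datapoints):

```python
# Made-up response matching the shape returned by get_metric_statistics.
sample_response = {
    "Datapoints": [
        {"Timestamp": "2017-03-06T00:00:00Z", "Sum": 1234.0, "Unit": "Count"},
    ]
}

count = 0
for r in sample_response["Datapoints"]:
    count = r["Sum"]  # a single datapoint when the period covers the whole window

print(count)  # → 1234.0
```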
This is what I've done:
client = boto3.client(service_name='cloudwatch', region_name='us-east-1')

response = client.get_metric_statistics(
    Namespace='AWS/EC2',
    Period=300,
    StartTime=datetime.utcnow() - timedelta(seconds=600),
    EndTime=datetime.utcnow(),
    MetricName=metricVar,
    Statistics=['Average'],
    Unit='Percent',
    Dimensions=[
        {'Name': 'InstanceId', 'Value': asgName}
    ]
)
I don't see anything obviously wrong with your code, so the region looks like a prime suspect here.
You can set it when creating the client with:
cloudwatch = boto3.client('cloudwatch', region_name='us-west-2')
If this is not set, boto3 will try to get the region from the AWS_DEFAULT_REGION environment variable first, and then from the ~/.aws/config configuration file. Try checking those to see which default region is set.
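If the region is coming from the config file, the relevant section of ~/.aws/config looks like this (us-west-2 here to match the CLI example above):

```ini
[default]
region = us-west-2
```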