I am trying to access Amazon S3 with the boto library to read the Common Crawl data available in the 'aws-publicdatasets' bucket.
I created a credentials config file at ~/.boto:
[Credentials]
aws_access_key_id = "my key"
aws_secret_access_key = "my_secret"
While creating the connection to S3, I see the error below in the logs:
2014-01-23 16:28:16,318 boto [DEBUG]:Retrieving credentials from metadata server.
2014-01-23 16:28:17,321 boto [ERROR]:Caught exception reading instance data
Traceback (most recent call last):
File "/usr/lib/python2.6/site-packages/boto-2.13.3-py2.6.egg/boto/utils.py", line 211, in retry_url
r = opener.open(req)
File "/usr/lib64/python2.6/urllib2.py", line 391, in open
response = self._open(req, data)
File "/usr/lib64/python2.6/urllib2.py", line 409, in _open
'_open', req)
File "/usr/lib64/python2.6/urllib2.py", line 369, in _call_chain
result = func(*args)
File "/usr/lib64/python2.6/urllib2.py", line 1190, in http_open
return self.do_open(httplib.HTTPConnection, req)
File "/usr/lib64/python2.6/urllib2.py", line 1165, in do_open
raise URLError(err)
URLError: <urlopen error timed out>
2014-01-23 16:28:17,323 boto [ERROR]:Unable to read instance data, giving up
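A note on what this error means: boto only queries the EC2 instance metadata server when it fails to find credentials in its config files, and off EC2 that request times out. One likely reason the ~/.boto file is ignored here is the quotes around the values: boto's config reader is built on the standard library's ConfigParser, which (as far as I know) does not strip surrounding quotes, so the quote characters become part of the key. A minimal sketch of that behaviour, using plain ConfigParser (Python 3 shown; the hypothetical config_text mirrors the file above):

```python
from configparser import ConfigParser

# Same content as the ~/.boto file in the question, quotes included.
config_text = """
[Credentials]
aws_access_key_id = "my key"
aws_secret_access_key = "my_secret"
"""

parser = ConfigParser()
parser.read_string(config_text)

# The quotes survive parsing, so the literal string '"my key"' would be
# treated as the access key -- an invalid credential, and boto falls back
# to the metadata server.
print(parser.get('Credentials', 'aws_access_key_id'))
```

If that is the cause, writing the values without quotes (aws_access_key_id = my key) should let boto pick them up.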
Alternatively, I tried passing the credentials directly when creating the connection object, as shown below:
import boto
from boto.s3.connection import S3Connection

boto.set_stream_logger('boto')
connection = S3Connection('______', '__________')
# get_bucket() already returns a Bucket; no need to wrap it again.
bucket = connection.get_bucket('aws-publicdatasets')
I still see the same error in the logs.
I had the same error when using a .boto file in the same folder as the script. It has to be in the root of the home folder.
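A quick sanity check for this: boto reads its config from a fixed set of locations (to my knowledge, /etc/boto.cfg and then ~/.boto), so you can verify the file is actually where boto expects it. A small stdlib-only sketch:

```python
import os

# Locations boto is expected to search for configuration, in order
# (assumption based on boto's documented defaults).
candidates = ['/etc/boto.cfg', os.path.expanduser('~/.boto')]

for path in candidates:
    status = 'exists' if os.path.exists(path) else 'missing'
    print('{}: {}'.format(path, status))
```

If ~/.boto prints as missing while a .boto file sits next to your script, that confirms the problem described above.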