Boto provides a simple way to upload a file to Amazon S3:
import boto

conn = boto.connect_s3(settings.AWS_ACCESS_KEY, settings.AWS_SECRET_KEY)
bucket = conn.get_bucket(bucket_name)  # my 'folder'
key = bucket.new_key(s3_filename)      # my 'filename'
key.set_contents_from_filename('myfile.txt')
Is there an equivalent way to do the same for Google Cloud Storage with Python? All the examples I've seen involve 50+ lines of code to do a 'simple' operation, like uploading a local file.
How could this be done (from the CLI)?
Yes, it's available; see these docs: https://cloud.google.com/storage/docs/gsutil_install#sdk-install, https://cloud.google.com/compute/docs/tutorials/python-guide, https://cloud.google.com/storage/docs/boto-gsutil.
For programmatic use from Python, the boto library and the gcs-oauth2-boto-plugin let you use essentially the same code for interacting with GCS as you use for S3 (or presumably other cloud storage services, given the right plugins).
See https://cloud.google.com/storage/docs/gspythonlibrary for instructions on how to download and install the library and plugin, along with several short Python snippets for various operations. More detailed instructions are at https://cloud.google.com/storage/docs/gsutil_install#boto.
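As a rough illustration of that style, here's a minimal upload sketch (it assumes you've already installed both packages, created the bucket, and set up OAuth2 credentials in your .boto config; the bucket and file names are placeholders):

import boto
import gcs_oauth2_boto_plugin  # registers the OAuth2 handler with boto

GOOGLE_STORAGE = 'gs'

# Upload a local file to gs://my_bucket/myfile.txt
dst_uri = boto.storage_uri('my_bucket/myfile.txt', GOOGLE_STORAGE)
dst_uri.new_key().set_contents_from_filename('myfile.txt')

Note how close this is to the S3 snippet in the question: the same key/bucket abstractions, just addressed through a gs:// storage URI.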
CLI usage is even simpler; once you've installed and authenticated gsutil, a typical "upload a bunch of files" command is
$ gsutil cp *.txt gs://my_bucket
(using $ to mean "the shell prompt" :-).