
How to upload a folder to AWS S3 recursively using Ansible

I'm using Ansible to deploy my application. I've come to the point where I want to upload my grunted assets to a newly created bucket. Here is what I have done ({{ hostvars.localhost.public_bucket }} is the bucket name, and {{ client }}/{{ version_id }}/assets/admin is the path to a folder containing multi-level subfolders and the assets to upload):

- s3:
    aws_access_key: "{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
    aws_secret_key: "{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"
    bucket: "{{ hostvars.localhost.public_bucket }}"
    object: "{{ client }}/{{ version_id }}/assets/admin"
    src: "{{ trunk }}/public/assets/admin"
    mode: put

Here is the error message:

fatal: [x.y.z.t]: FAILED! => {"changed": false, "failed": true, "invocation": {"module_name": "s3"}, "msg": "MODULE FAILURE", "parsed": false}

Traceback (most recent call last):
  File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 2868, in <module>
    main()
  File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 561, in main
    upload_s3file(module, s3, bucket, obj, src, expiry, metadata, encrypt, headers)
  File "/home/ubuntu/.ansible/tmp/ansible-tmp-1468581761.67-193149771659393/s3", line 307, in upload_s3file
    key.set_contents_from_filename(src, encrypt_key=encrypt, headers=headers)
  File "/usr/local/lib/python2.7/dist-packages/boto/s3/key.py", line 1358, in set_contents_from_filename
    with open(filename, 'rb') as fp:
IOError: [Errno 21] Is a directory: '/home/abcd/efgh/public/assets/admin'

I went through the documentation and didn't find a recursion option for the Ansible s3 module. Is this a bug, or am I missing something?

asked Jul 15 '16 by another geek


People also ask

Can we upload multiple files to S3?

You have two options for uploading files and folders to a bucket:

  1. AWS Management Console: use drag-and-drop to upload files and folders to a bucket.
  2. AWS CLI: with the tool installed on your local machine, use the command line to upload files and folders to the bucket (see the example below).
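
For example (bucket and folder names here are made-up placeholders), a single AWS CLI command uploads a whole directory tree:

aws s3 cp ./assets s3://my-bucket/assets --recursive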

Can you have folders in S3?

In Amazon S3, folders are used to group objects and organize files. Unlike a traditional file system, Amazon S3 doesn't use hierarchy to organize its objects and files. Amazon S3 console supports the folder concept only as a means of grouping (and displaying) objects.
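
In other words, an S3 "folder" is just a shared key prefix. A small illustration with the AWS CLI (bucket and key names are hypothetical):

aws s3 cp logo.png s3://my-bucket/assets/admin/img/logo.png   # creates one flat object whose key contains slashes
aws s3 ls s3://my-bucket/assets/admin/ --recursive            # lists every key under that prefix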


2 Answers

As of Ansible 2.3, you can use the s3_sync module:

- name: basic upload
  s3_sync:
    bucket: tedder
    file_root: roles/s3/files/

Note: If you're using a non-default region, you should set region explicitly; otherwise you get a somewhat obscure error along the lines of: An error occurred (400) when calling the HeadObject operation: Bad Request

Here's a complete playbook matching what you were trying to do above:

- hosts: localhost
  vars:
    aws_access_key: "{{ lookup('env','AWS_ACCESS_KEY_ID') }}"
    aws_secret_key: "{{ lookup('env','AWS_SECRET_ACCESS_KEY') }}"
    bucket: "{{ hostvars.localhost.public_bucket }}"
  tasks:
  - name: Upload files
    s3_sync:
      aws_access_key: "{{ aws_access_key }}"
      aws_secret_key: "{{ aws_secret_key }}"
      bucket: "{{ bucket }}"
      file_root: "{{ trunk }}/public/assets/admin"
      key_prefix: "{{ client }}/{{ version_id }}/assets/admin"
      permission: public-read
      region: eu-central-1

Notes:

  1. You could probably remove region; I only added it to illustrate the note above.
  2. I added the keys just to be explicit. You can (and probably should) use environment variables instead (see the sketch after the quoted docs):

From the docs:

If parameters are not set within the module, the following environment variables can be used in decreasing order of precedence: AWS_URL or EC2_URL; AWS_ACCESS_KEY_ID, AWS_ACCESS_KEY or EC2_ACCESS_KEY; AWS_SECRET_ACCESS_KEY, AWS_SECRET_KEY or EC2_SECRET_KEY; AWS_SECURITY_TOKEN or EC2_SECURITY_TOKEN; AWS_REGION or EC2_REGION
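
A minimal sketch of that approach, assuming the variables are exported in the shell that runs ansible-playbook (the values shown are placeholders):

export AWS_ACCESS_KEY_ID=AKIAEXAMPLE
export AWS_SECRET_ACCESS_KEY=secretexample
export AWS_REGION=eu-central-1

With those set, the aws_access_key and aws_secret_key parameters can simply be dropped from the s3_sync task above.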

answered Sep 30 '22 by toast38coza


The Ansible s3 module does not support directory uploads or any recursion. For this task, I'd recommend shelling out to the AWS CLI; check the syntax below:

command: "aws s3 cp {{client}}/{{version_id}}/assets/admin s3://{{hostvars.localhost.public_bucket}}/ --recursive"
answered Sep 30 '22 by error2007s