
Mount s3fs as docker volume

So I just want to add my S3 bucket from Amazon to my Docker swarm. I've seen many "possible" solutions on the internet, but I can't get any of them to mount the content of my bucket as a volume.

The last thing I tried was the command stated here (Is s3fs not able to mount inside docker container?):

docker run --rm -t -i --privileged -e AWS_ACCESS_KEY_ID=XXXX -e AWS_SECRET_ACCESS_KEY=XXXX -e AWS_STORAGE_BUCKET_NAME=XXXX docker.io/panubo/s3fs bash

It works pretty well, but as soon as I exit bash the container stops and I can't do anything with it. Is it possible to make the container stay running and add it as a volume?

Or would it be the better solution to mount the bucket on my Docker host and then add it as a local volume?
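For reference, this is roughly what that host-side approach would look like (a minimal sketch only; it assumes s3fs is installed on the host, and the bucket name, credentials, and paths are placeholders):

# Install s3fs on the Docker host (Debian/Ubuntu example)
sudo apt-get install -y s3fs

# Store the credentials for s3fs (format: ACCESS_KEY:SECRET_KEY)
echo "XXXX:XXXX" > ${HOME}/.passwd-s3fs
chmod 600 ${HOME}/.passwd-s3fs

# Mount the bucket on the host
# (allow_other lets containers running as other users read the mount;
#  as a non-root user this also needs user_allow_other in /etc/fuse.conf)
mkdir -p /mnt/s3bucket
s3fs my-bucket /mnt/s3bucket -o passwd_file=${HOME}/.passwd-s3fs -o allow_other

# Bind-mount the host directory into a container
docker run -d -v /mnt/s3bucket:/data my-image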

asked Mar 20 '18 by thmspl

1 Answer

I got it working!

The configuration looks like this:

docker-compose.yml

volumes:
  s3data:
    driver: local

services:
  s3vol:
    image: elementar/s3-volume
    command: /data s3://{BUCKET NAME}
    environment:
      - BACKUP_INTERVAL={INTERVAL IN MINUTES, e.g. 2m}
      - AWS_ACCESS_KEY_ID={KEY}
      - AWS_SECRET_ACCESS_KEY={SECRET}
    volumes:
      - s3data:/data

After adding this to the docker-compose file, you can use the S3-backed volume in your other services, like this:

docker-compose.yml

linux:
  image: {IMAGE}
  volumes:
    - s3data:/data
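Putting the two fragments together, the complete docker-compose.yml would look roughly like this (the version key is my assumption, and {IMAGE} and the other placeholders still need to be filled in; the linux service only needs to reference the same named volume):

version: "3"

volumes:
  s3data:
    driver: local

services:
  s3vol:
    image: elementar/s3-volume
    command: /data s3://{BUCKET NAME}
    environment:
      - BACKUP_INTERVAL={INTERVAL IN MINUTES, e.g. 2m}
      - AWS_ACCESS_KEY_ID={KEY}
      - AWS_SECRET_ACCESS_KEY={SECRET}
    volumes:
      - s3data:/data

  linux:
    image: {IMAGE}
    volumes:
      - s3data:/data

If I understand elementar/s3-volume correctly, it restores the bucket contents into the volume when it starts and then syncs /data back to S3 on each BACKUP_INTERVAL, so it behaves like a periodically synced copy of the bucket rather than a live mount.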

Hope this helps some of you in the future!

Cheers.

answered Oct 11 '22 by thmspl