Pointing to multiple S3 buckets in s3boto

In settings.py I have:

STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'xxxxxxxxxxxxx'
AWS_SECRET_ACCESS_KEY = 'xxxxxxxxxxxxx'
AWS_STORAGE_BUCKET_NAME = 'static.mysite.com'

This points to my S3 bucket static.mysite.com and works fine: when I run manage.py collectstatic, it uploads all the static files to that bucket. However, I have another bucket that I use for different purposes and would like to use in certain areas of the website, for example with a model like this:

class Image(models.Model):
    myobject = models.ImageField(upload_to='my/folder')

Now when Image.save() is invoked, it will still upload the file to the S3 bucket named in AWS_STORAGE_BUCKET_NAME; however, I want this Image.save() to point to another S3 bucket. Is there any clean way of doing this? I don't want to change settings.py at run time or implement any practices that violate the key principles of Django, i.e. having a pluggable, easy-to-change storage backend.

Aziz Alfoudari asked Feb 07 '12

1 Answer

The cleanest way is to create a subclass of S3BotoStorage and override the default bucket name in its __init__ method.

from django.conf import settings
from storages.backends.s3boto import S3BotoStorage

class MyS3Storage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        # MY_AWS_STORAGE_BUCKET_NAME is a custom setting holding the name of
        # the second bucket; S3BotoStorage accepts it via the 'bucket' kwarg.
        kwargs['bucket'] = getattr(settings, 'MY_AWS_STORAGE_BUCKET_NAME')
        super(MyS3Storage, self).__init__(*args, **kwargs)

Then specify this class as your DEFAULT_FILE_STORAGE and leave STATICFILES_STORAGE as it is, or vice versa.
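
As a rough sketch (the dotted path myapp.storage.MyS3Storage, the MY_AWS_STORAGE_BUCKET_NAME setting, and the media.mysite.com bucket name are placeholders for whatever you actually name them), settings.py could then look like this, with static files still going to the original bucket and uploaded media going to the second one:

# settings.py
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
DEFAULT_FILE_STORAGE = 'myapp.storage.MyS3Storage'  # dotted path to the subclass above

AWS_ACCESS_KEY_ID = 'xxxxxxxxxxxxx'
AWS_SECRET_ACCESS_KEY = 'xxxxxxxxxxxxx'
AWS_STORAGE_BUCKET_NAME = 'static.mysite.com'    # default bucket, used for collectstatic
MY_AWS_STORAGE_BUCKET_NAME = 'media.mysite.com'  # custom setting read by MyS3Storage

With this in place, Image.save() uploads to the second bucket while manage.py collectstatic keeps using the first.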

Andrew Kurinnyi answered Sep 29 '22