This is the first time I am trying to use AWS S3 for media storage. The application is hosted on Heroku; static files have not been a problem, so I do not want to change how they are served, but I want the application's users to upload files and images, which I wish to store in S3. I have already spent 2-3 days on this without finding a proper solution, as I get a 400 exception without a proper reason. Here is the documentation I referred to: http://tech.marksblogg.com/file-uploads-amazon-s3-django.html
So, my settings now:
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_S3_ACCESS_KEY_ID = 'dummyid'
AWS_S3_SECRET_ACCESS_KEY = 'dummykey'
AWS_STORAGE_BUCKET_NAME = 'dummyname'
AWS_QUERYSTRING_AUTH = False
AWS_HEADERS = {'Cache-Control': 'max-age=86400', }
MEDIAFILES_LOCATION = 'media'
MEDIA_URL = 'http://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
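Note that MEDIAFILES_LOCATION is not read by django-storages on its own; in the linked article it is consumed by a small custom storage class. A minimal sketch of that wiring, where the custom_storages module name is an assumption following the article:

```python
# custom_storages.py -- module name is an assumption, following the linked article
from django.conf import settings
from storages.backends.s3boto import S3BotoStorage


class MediaStorage(S3BotoStorage):
    # prefix every uploaded key with MEDIAFILES_LOCATION ('media')
    location = settings.MEDIAFILES_LOCATION
```

and DEFAULT_FILE_STORAGE would then point at 'custom_storages.MediaStorage' instead of the backend directly.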
My model:
class DummyDocuments(models.Model):
    document = models.FileField(upload_to='documents')
My form:
class DummyUploadForm(forms.Form):
    documents = forms.FileField(widget=forms.ClearableFileInput(attrs={'multiple': True}))
And here is the view, where I am using it:
def upload(request):
    if request.method == 'POST':
        form = DummyUploadForm(request.POST, request.FILES)
        if form.is_valid():
            files = request.FILES.getlist('documents')
            for file in files:
                instance = DummyDocuments(document=file)
                instance.save()
            return redirect('activation_upload')
    else:
        form = DummyUploadForm()
    documents = DummyDocuments.objects.all()
    return render(request, 'activation/dummyupload.html', {'form': form, 'documents': documents})
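When S3 rejects the upload, the 400 response body actually carries the real reason as an XML `<Error>` document (boto surfaces it on `S3ResponseError.body`). A small hypothetical helper, not part of boto, to decode such a body:

```python
# hypothetical helper, not part of boto: decode the XML error body S3
# returns with a 400 so the real failure reason becomes visible
import xml.etree.ElementTree as ET


def parse_s3_error(body):
    """Return (Code, Message) from an S3 <Error> XML document."""
    root = ET.fromstring(body)
    return root.findtext('Code'), root.findtext('Message')
```

For buckets in newer regions, the decoded message is typically a request to use AWS4-HMAC-SHA256 signing, which is the signature-version problem the answers address.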
Here is my CORS config on AWS:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <AllowedMethod>POST</AllowedMethod>
        <AllowedMethod>PUT</AllowedMethod>
        <AllowedHeader>*</AllowedHeader>
    </CORSRule>
</CORSConfiguration>
Here is the exception which I am getting:
May I know what exactly is wrong?
All these Django storage libraries are quite poorly documented. I went through a lot of different articles to figure this out. You need this setting:
AWS_S3_HOST = 's3.ca-central-1.amazonaws.com'
and a ~/.boto config file with the contents:
[Credentials]
aws_access_key_id=yourid
aws_secret_access_key=yourkey
[s3]
host=s3.ca-central-1.amazonaws.com
I believe this is caused by this issue (assuming that all of the credentials and other setup are correct, since it's throwing a 400 error rather than a 403).
As a result, the two options seem to be:
Add a file ~/.boto and put in the following, per Martin's comment on that thread:
[s3]
host=s3.eu-central-1.amazonaws.com
substituting your appropriate region. You can create the file with touch ~/.boto and then edit it, or simply open it directly with nano ~/.boto and save it.
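The ~/.boto file is plain INI, so it is easy to sanity-check what boto will actually pick up. A small sketch, where the helper name is mine and not a boto API:

```python
# hypothetical sanity check, not a boto API: read the [s3] host that a
# .boto-style INI file would hand to boto
import os
from configparser import ConfigParser


def boto_s3_host(path=os.path.expanduser('~/.boto')):
    """Return the host configured under [s3], or None if absent."""
    cfg = ConfigParser()
    cfg.read(path)  # silently ignores a missing file
    return cfg.get('s3', 'host', fallback=None)
```

If this returns None, boto is falling back to the default us-east-1 endpoint, which is exactly when buckets in sigv4-only regions start returning 400s.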
Actually, what worked for me was creating a new bucket in a different region. It seems the boto package has issues connecting to S3 buckets in certain regions. My settings.py file:
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'your-access-key'
AWS_SECRET_ACCESS_KEY = 'your-secret-access-key'
AWS_STORAGE_BUCKET_NAME = 'mybucketname'
No .boto file was needed when the host region was s3-website-us-west-2.amazonaws.com.