EDIT:
I have found that removing import django_heroku from my settings.py file allows me to push my static files to my AWS bucket. When I uncomment import django_heroku, collectstatic instead places my files in the local staticfiles folder.
manage.py collectstatic with #import django_heroku:
You have requested to collect static files at the destination location as specified in your settings.
manage.py collectstatic with import django_heroku:
You have requested to collect static files at the destination location as specified in your settings: /path/to/project/staticfiles
I don't know why this is or how to fix it. Now the question is: how can I run collectstatic for my Django app on Heroku? Or do I need to run my Heroku instance with nostatic (e.g. runserver --nostatic)?
Problem:
Whenever I run python manage.py collectstatic, all my static files are placed into a local 'staticfiles' folder. I have set STATICFILES_STORAGE = 'myapp.aws.utils.StaticRootS3Boto3Storage', but the files always go to 'myapp/staticfiles'.
settings.py:
AWS_ACCESS_KEY_ID = config('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = config('AWS_SECRET_ACCESS_KEY')
AWS_FILE_EXPIRE = 200
AWS_PRELOAD_METADATA = True
AWS_QUERYSTRING_AUTH = True
DEFAULT_FILE_STORAGE = 'myapp.aws.utils.MediaRootS3Boto3Storage'
STATICFILES_STORAGE = 'myapp.aws.utils.StaticRootS3Boto3Storage'
AWS_STORAGE_BUCKET_NAME = config('AWS_STORAGE_BUCKET_NAME')
S3DIRECT_REGION='us-east-1'
AWS_S3_URL = '//s3.amazonaws.com/%s' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'http://s3.amazonaws.com/%s/media/' % AWS_STORAGE_BUCKET_NAME
MEDIA_ROOT = MEDIA_URL
STATIC_URL = AWS_S3_URL + '/static/'
STATICFILES_LOCATION = STATIC_URL
MEDIAFILES_LOCATION = 'media'
ADMIN_MEDIA_PREFIX = STATIC_URL + 'admin/'
AWS_HEADERS = {
'Expires': 'Thu, 31 Dec 2099 20:00:00 GMT',
'CacheControl': 'max-age=94608000',
}
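The URL settings above are plain string composition. With a hypothetical bucket name substituted for the config() lookup, they expand like this:

```python
# Sketch of how the URL settings above compose, using a hypothetical
# bucket name in place of the config() lookup.
AWS_STORAGE_BUCKET_NAME = 'my-example-bucket'

AWS_S3_URL = '//s3.amazonaws.com/%s' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'http://s3.amazonaws.com/%s/media/' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = AWS_S3_URL + '/static/'

print(STATIC_URL)  # //s3.amazonaws.com/my-example-bucket/static/
print(MEDIA_URL)   # http://s3.amazonaws.com/my-example-bucket/media/
```

Note that these settings only control the URLs rendered into templates; where collectstatic writes files is decided by STATICFILES_STORAGE.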
myapp/aws/utils.py:
from storages.backends.s3boto3 import S3Boto3Storage
StaticRootS3Boto3Storage = lambda: S3Boto3Storage(location='static')
MediaRootS3Boto3Storage = lambda: S3Boto3Storage(location='media')
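The two lambdas above work because Django calls whatever the dotted path resolves to, but subclassing is the more common pattern with django-storages. A sketch, assuming django-storages is installed (a stub base class is included so the snippet runs anywhere):

```python
try:
    from storages.backends.s3boto3 import S3Boto3Storage
except Exception:
    # Stub so this sketch is importable without django-storages
    # or a configured Django settings module.
    class S3Boto3Storage:
        location = ''

class StaticRootS3Boto3Storage(S3Boto3Storage):
    # Everything collected goes under the 'static/' prefix in the bucket.
    location = 'static'

class MediaRootS3Boto3Storage(S3Boto3Storage):
    # Uploaded media goes under the 'media/' prefix.
    location = 'media'
```

The dotted paths in settings.py stay exactly the same with either the lambda or the class form.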
Project Structure:
myproject/
|-- myapp/
| |-- aws/
| | |-- __init__.py
| | |-- utils.py
| |-- __init__.py
| |-- settings.py
| |-- urls.py
| +-- wsgi.py
|-- manage.py
|-- static/
| |-- scss
| |-- css
| |-- js
| |-- images
+-- staticfiles/
| |-- **ALL STATIC FILES END UP HERE**
Notes:
STATICFILES_STORAGE seems to have no effect on the output. python manage.py collectstatic results in:
You have requested to collect static files at the destination
location as specified in your settings:
/Users/nickmancini/Development/myapp/staticfiles
I do not get any error messages. All files are successfully created in the local staticfiles folder.
Media files are successfully uploaded to my S3 bucket, but static files always go to /Users/me/myproject/myapp/staticfiles.
According to the docs, after setting STATICFILES_STORAGE, "all you have to do is run collectstatic and your static files would be pushed through your storage package up to S3".
Resources I've Consulted:
The situation in which it happens is just like reported above: if the folder being synced into contains a file with different contents but an identical file size, sync will skip copying the updated file from S3.
The S3 CLI isn't very sensitive to changes; it relies on the size of the file and the timestamp only. See aws/aws-cli#3273.
Build 2 has HTML files with a timestamp of 10:05; however, the HTML files uploaded to S3 by build 1 have a timestamp of 10:06, as that's when the objects were created. This results in them being ignored by s3 sync, as the remote files are "newer" than the local files. I'm now using s3 cp --recursive followed by s3 sync --delete as suggested earlier.
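The skip heuristic described in those reports can be sketched in plain Python: aws s3 sync compares only file size and modification time, so a changed file with the same size and an older local timestamp is treated as already up to date. This is a simplified model, not the CLI's actual code:

```python
from datetime import datetime, timezone

def sync_would_skip(local_size, local_mtime, remote_size, remote_mtime):
    # Simplified model of `aws s3 sync`'s default comparison: skip the
    # upload when sizes match and the remote object is not older.
    return local_size == remote_size and remote_mtime >= local_mtime

def t(hour, minute):
    return datetime(2020, 1, 1, hour, minute, tzinfo=timezone.utc)

# Build 2's HTML has mtime 10:05; build 1's upload created the objects
# at 10:06 with the same size, so sync skips them despite the contents
# being different.
print(sync_would_skip(1024, t(10, 5), 1024, t(10, 6)))  # True
```

This is why the cp --recursive then sync --delete workaround helps: cp copies unconditionally, and sync --delete only has to remove remote strays afterwards.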
However, Storage Gateway doesn't automatically update the cache when you upload a file directly to Amazon S3. When you do this, you must perform a RefreshCache operation to see the changes on the file share.
For anyone facing the same problem, the solution for me was: django_heroku.settings(locals(), staticfiles=False), which is also detailed in this GitHub issue.
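In other words, the tail of settings.py keeps the S3 configuration and tells django-heroku not to touch the static-file settings. A sketch of the relevant lines, assuming django-heroku is installed:

```python
# settings.py (end of file) -- sketch, assuming django-heroku is installed
import django_heroku

# ... the S3 storage settings shown in the question above ...

# Let django-heroku configure the database, logging, etc. for Heroku,
# but pass staticfiles=False so it does not overwrite STATICFILES_STORAGE
# (by default it switches static handling to WhiteNoise, which is why
# collectstatic was writing to the local staticfiles folder).
django_heroku.settings(locals(), staticfiles=False)
```

With this in place, collectstatic targets the S3 backend from STATICFILES_STORAGE again.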