I have a Django app that allows users to upload videos. It's hosted on Heroku, and the uploaded files are stored in an S3 bucket. I am using JavaScript to upload the files directly to S3 after obtaining a presigned request from the Django app. This is due to Heroku's 30s request timeout. Is there any way that I can upload large files through the Django backend without using JavaScript and without compromising the user experience?
Consider the points below when thinking through a solution to your problem.
The points in the other answer are valid. The short answer to the question "Is there any way that I can upload large files through the Django backend without using JavaScript?" is "not without switching away from Heroku".
Keep in mind that any data transmitted to your dynos goes through Heroku's routing mesh, which is what enforces the 30-second request limit to conserve its own finite resources. Long-running transactions of any kind use up bandwidth, compute, and other resources that could be serving other requests, so Heroku applies the limit to keep things moving across its thousands of dynos. When uploading a file, you are first constrained by the client's bandwidth to your server, then by the bandwidth between your dyno and S3, on top of any processing your dyno actually does.
The larger the file, the more likely it is that transmitting the data will exceed the 30-second timeout, particularly on the first leg (client to server) for clients on unreliable networks. Creating a direct path from client to S3 is a reasonable compromise.