I have to upload >= 20 GB of data with Django. Should I break the file into chunks and upload them with some kind of checksum to maintain integrity, or does Django do this implicitly?
Would it be better to use FTP instead of regular HTTP for such large files?
Possible solutions:
1) Configure the maximum upload file size and memory limits for your server.
2) Upload large files in chunks.
3) Use resumable file uploads.
Chunking is the most commonly used approach to avoid errors and improve speed.
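As a sketch of option 2, you can split the file client-side and send each chunk along with its checksum, so the server can verify every piece before accepting it. This is plain Python, not a Django API; the names `iter_chunks` and `verify_chunk` are illustrative:

```python
import hashlib

CHUNK_SIZE = 5 * 1024 * 1024  # 5 MB per chunk (tunable)

def iter_chunks(fileobj, chunk_size=CHUNK_SIZE):
    """Yield (index, data, sha256-hex) triples for each chunk of fileobj."""
    index = 0
    while True:
        data = fileobj.read(chunk_size)
        if not data:
            break
        yield index, data, hashlib.sha256(data).hexdigest()
        index += 1

def verify_chunk(data, expected_digest):
    """Server-side check: recompute the digest before accepting a chunk."""
    return hashlib.sha256(data).hexdigest() == expected_digest
```

Each `(index, data, digest)` triple would be POSTed separately; the server calls `verify_chunk` and re-requests any chunk whose digest does not match, which is also the basis for making the upload resumable (option 3).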
Django provides built-in forms and methods that help upload a file to the server. The forms.FileField() field is used to create a file input and submit the file to the server. When working with files, make sure the HTML form tag contains the enctype="multipart/form-data" attribute.
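A minimal sketch of that setup, assuming a Django project is already configured (the names `UploadForm` and `handle_upload` are illustrative, not Django built-ins):

```python
# forms.py
from django import forms

class UploadForm(forms.Form):
    # forms.FileField renders as <input type="file">
    file = forms.FileField()

# views.py
from django.http import HttpResponse

def handle_upload(request):
    if request.method == "POST":
        # Bound file data arrives in request.FILES, not request.POST
        form = UploadForm(request.POST, request.FILES)
        if form.is_valid():
            uploaded = form.cleaned_data["file"]  # an UploadedFile instance
            with open(f"/tmp/{uploaded.name}", "wb") as dest:
                for chunk in uploaded.chunks():   # stream; don't load it all into memory
                    dest.write(chunk)
            return HttpResponse("upload ok")
    return HttpResponse(status=400)
```

The corresponding template must declare enctype="multipart/form-data" on the form tag, otherwise request.FILES will be empty and validation will fail.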
Django uses so-called upload handlers to process file uploads, controlled by the FILE_UPLOAD_MAX_MEMORY_SIZE setting (default 2.5 MB). Files smaller than this threshold are handled in memory; larger files are streamed into a temporary file on disk. I haven't yet tried uploading files larger than about 1 GB, but I would expect you can just use Django without problems.
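That small-in-memory / large-on-disk split can be mimicked with the standard library's tempfile.SpooledTemporaryFile, which keeps data in memory until a size limit and then transparently rolls over to a real file on disk. This is a pure-Python analogue for intuition, not Django's actual upload-handler implementation:

```python
import tempfile

# 2,621,440 bytes = 2.5 MB, mirroring Django's FILE_UPLOAD_MAX_MEMORY_SIZE default
MAX_IN_MEMORY = 2_621_440

def buffer_upload(stream, max_size=MAX_IN_MEMORY, chunk_size=64 * 1024):
    """Copy an incoming stream into a spooled buffer.

    Data stays in memory until max_size bytes have been written, then is
    transparently moved to a temporary file on disk -- the same small/large
    split Django's upload handlers make around the memory threshold.
    """
    spooled = tempfile.SpooledTemporaryFile(max_size=max_size)
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        spooled.write(chunk)
    spooled.seek(0)  # rewind so the caller can read the buffered data back
    return spooled
```

Either way the caller reads the result identically, which is why view code using uploaded.chunks() doesn't need to care whether the upload landed in memory or on disk.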