I have a 10 MB+ binary file that my client needs in order for the site to work. Every time the client accesses the site, the file is a little bigger. Suppose the client accesses the page 20 times in a day: because the file has changed a little on every access, it can't be cached, so they will download at least 200 MB, even if the file only changed by 0.1 MB over the whole day.
Is there any way to avoid that gigantic waste of bandwidth?
On the server you could divide the file into chunks, have AJAX download the chunks and assemble them in the browser as a single file, then check with the server occasionally to see which chunks need updating and patch the in-browser copy with only the changed chunks. Essentially a simple implementation of rsync in the browser.
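A minimal sketch of that idea, assuming (hypothetically) that the server exposes a chunk manifest of per-chunk SHA-256 hashes at `/file.manifest`, serves the file itself at `/file.bin` with HTTP Range support, and uses a 1 MB chunk size; the URLs, chunk size, and caching strategy are all placeholders, not a definitive implementation:

```javascript
// Sketch only: assumes the server serves /file.manifest (JSON array of
// per-chunk SHA-256 hex hashes) and supports Range requests on /file.bin.
// Both URLs and the 1 MB chunk size are hypothetical.
const CHUNK_SIZE = 1024 * 1024;

async function sha256Hex(buffer) {
  const digest = await crypto.subtle.digest('SHA-256', buffer);
  return Array.from(new Uint8Array(digest))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
}

// cachedChunks: Array<ArrayBuffer> kept from the previous visit
// (e.g. persisted in IndexedDB); may be empty on the first load.
async function updateFile(cachedChunks) {
  const manifest = await (await fetch('/file.manifest')).json();
  const chunks = [];

  for (let i = 0; i < manifest.length; i++) {
    const cached = cachedChunks[i];
    if (cached && await sha256Hex(cached) === manifest[i]) {
      chunks.push(cached);               // unchanged: reuse the cached chunk
      continue;
    }
    // Changed or missing: fetch just this chunk's byte range.
    const start = i * CHUNK_SIZE;
    const end = start + CHUNK_SIZE - 1;
    const res = await fetch('/file.bin', {
      headers: { Range: `bytes=${start}-${end}` },
    });
    chunks.push(await res.arrayBuffer());
  }

  // Reassemble the whole file in the browser.
  return new Blob(chunks);
}
```

With this approach, a visit where only one chunk changed costs roughly one chunk of bandwidth plus the manifest, instead of the full file.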
Handling binary data is tricky in JavaScript, so you might find these libraries and code links handy (a small typed-array example follows the list):
https://github.com/jDataView/jDataView/
http://www.html5rocks.com/en/tutorials/webgl/typed_arrays/
https://gist.github.com/fbuchinger/674212
See "Ox.getChunked" method as that might allow you to just request individual ranges of the file directly from the server.