I've been working on a WebGL application that needs to draw a tremendous amount of point data to the screen. Currently, that point and surface data is stored on the web server I am using, and all 120 MB of JSON is downloaded by the browser on page load. This takes well over a minute on my network, which is not optimal. I was wondering if anyone has any experience or tips for loading data this large. I've tried eliminating as much whitespace as possible, but that barely made a dent in the file size.
Is there a way to compress this file significantly, or a better way to download such a large amount of data? Any help would be great!
JSON is incredibly redundant, so it compresses well: the server sends the compressed data, and the client decompresses it. If you need to handle the decompression in JavaScript yourself, see this JavaScript implementation of Gzip.
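As a minimal sketch of manual client-side decompression, assuming the file is stored pre-compressed as points.json.gz (a hypothetical name), served as raw binary without a Content-Encoding header, and that the browser supports the native DecompressionStream API:

// Fetch the pre-gzipped file and decompress it in the browser.
async function loadCompressedPoints() {
  const response = await fetch('points.json.gz'); // hypothetical URL
  // Pipe the raw bytes through a gzip decompressor.
  const decompressed = response.body.pipeThrough(new DecompressionStream('gzip'));
  // Parse the decompressed stream as JSON.
  return new Response(decompressed).json();
}

Note that if the server sets Content-Encoding: gzip, the browser decompresses transparently and none of this is needed; the manual route only matters when you serve the compressed file as a plain download.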
Alternatively, you could split the data into 1 MB chunks and send them over one at a time (see the sketch below). Also, the user probably can't interact with 120 MB of data at once, so maybe implement some sort of level-of-detail system?
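A minimal sketch of sequential chunk loading, assuming the data has been split server-side into numbered files (points-0.json through points-119.json, hypothetical names) of roughly 1 MB each:

// Load chunks one at a time and hand each to a callback as it arrives,
// so the scene can start rendering before the full data set is down.
async function loadChunks(baseUrl, chunkCount, onChunk) {
  for (let i = 0; i < chunkCount; i++) {
    const response = await fetch(`${baseUrl}/points-${i}.json`);
    const points = await response.json();
    onChunk(points, i); // e.g. upload to a GPU buffer and draw immediately
  }
}

loadChunks('/data', 120, (points, index) => {
  console.log(`chunk ${index}: ${points.length} points`);
});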
If you control the web server sending the data, you could try enabling compression of JSON data.
On IIS 7, this is done by adding the following to applicationHost.config:
<system.webServer>
  <urlCompression doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <add mimeType="application/json" enabled="true" />
      <add mimeType="application/json; charset=utf-8" enabled="true" />
    </dynamicTypes>
  </httpCompression>
</system.webServer>
You would then need to restart the App pool for your app.
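Once the app pool has restarted, a quick way to confirm compression is working is to inspect the response headers from the browser. A sketch (the URL is a placeholder, and whether Content-Encoding is exposed to script can vary by browser; the Network tab in the dev tools will show it either way):

// Request the JSON and log the Content-Encoding header; the browser still
// decompresses the body transparently before .json() parses it.
fetch('/data/points.json')
  .then((response) => {
    console.log('Content-Encoding:', response.headers.get('content-encoding'));
    return response.json();
  })
  .then((data) => console.log('loaded', data.length, 'records'));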
Source