How to handle loading of LARGE JSON files

I've been working on a WebGL application that requires a tremendous amount of point data to draw to the screen. Currently, that point and surface data is stored on the web server I am using, and ALL 120 MB of JSON is being downloaded by the browser on page load. This takes well over a minute on my network, which is not optimal. I was wondering if anyone has any experience/tips about loading data this large. I've tried eliminating as much whitespace as possible, but that barely made a dent in the file size.

Is there any way to compress this file dramatically, or is there a better way to download such a large amount of data? Any help would be great!

asked Aug 24 '12 by a10y

2 Answers

JSON is incredibly redundant, so it compresses well. Gzip it on the server and decompress it on the client.

JavaScript implementation of Gzip
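
As a concrete illustration, here is a minimal sketch of client-side decompression using pako, one widely used JavaScript gzip library (not necessarily the implementation the answer links to). It assumes the data has been pre-gzipped on the server as points.json.gz and that pako has been loaded via a script tag; the URL and file name are placeholders, not from the original answer.

// Fetch a pre-gzipped JSON file and decompress it in the browser with pako.
async function loadCompressedPoints(url) {
  const response = await fetch(url);
  const compressed = new Uint8Array(await response.arrayBuffer());
  // pako.ungzip with { to: 'string' } returns the decompressed text
  const text = pako.ungzip(compressed, { to: 'string' });
  return JSON.parse(text);
}

loadCompressedPoints('points.json.gz')
  .then((points) => console.log('loaded', points.length, 'points'));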

Alternatively, you could split the data into 1 MB chunks and send them over one at a time. Also, the user probably can't interact with 120 MB of data at once, so maybe implement some sort of level-of-detail system?
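
A rough sketch of the chunking idea, assuming the 120 MB file has been pre-split on the server into numbered files (points.part0.json, points.part1.json, ...); the naming scheme and the onChunk callback are illustrative assumptions, not part of the original answer:

// Load the data in pieces, handing each chunk to the renderer as it arrives
// so the scene starts drawing long before the full data set has downloaded.
async function loadInChunks(baseUrl, chunkCount, onChunk) {
  let points = [];
  for (let i = 0; i < chunkCount; i++) {
    const response = await fetch(`${baseUrl}.part${i}.json`);
    const chunk = await response.json();
    points = points.concat(chunk);
    onChunk(chunk, i); // e.g. upload this chunk's points to a GPU buffer
  }
  return points;
}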

answered Sep 23 '22 by John Simon

If you control the web server sending the data, you could try enabling compression of JSON data.

This is done by adding the following to applicationHost.config (IIS 7):

<system.webServer>
    <urlCompression doDynamicCompression="true" />
    <httpCompression>
      <dynamicTypes>
        <!-- compress JSON responses, with and without an explicit charset -->
        <add mimeType="application/json" enabled="true" />
        <add mimeType="application/json; charset=utf-8" enabled="true" />
      </dynamicTypes>
    </httpCompression>
</system.webServer>

You would then need to recycle the app pool for your application.
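
If you prefer the command line, IIS's appcmd tool can recycle the pool; the pool name below is a placeholder for your own:

%windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"YourAppPool"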

Source

answered Sep 21 '22 by Icarus