Using large JSON files with d3.js causes massive performance hits/crashes

So I currently have a massive JSON file that is about 90 MB in size and roughly 750,000 lines. I am trying to create a graph from it using the d3.json command. d3.json successfully loads the data, and I can render the graph, but there is one node in my tree that has probably in excess of 500 children. Expanding it causes Chrome to crash, and Firefox to grind to a halt (though not crash, which at least gives me the opportunity to close the node and regain performance).

According to this Stack Overflow answer (d3 — Progressively draw a large dataset), I could draw the dataset progressively in batches. Can this be done for JSON with more intelligent splicing? And even so, wouldn't the end result be the same as what I see in Firefox?
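For reference, the batching idea from that linked answer can be sketched in plain JavaScript. This is only a sketch under assumptions: `drawBatch` is a hypothetical callback that renders one slice of data (e.g. appends a batch of SVG nodes via d3), and the scheduler defaults to `setTimeout` so the browser can process events between batches:

```javascript
// Progressively draw `data` in small batches so the UI stays responsive.
// `drawBatch` receives one slice of the data per tick; `schedule` decides
// when the next batch runs (defaults to yielding to the event loop).
function drawProgressively(data, batchSize, drawBatch, schedule = fn => setTimeout(fn, 0)) {
  let i = 0;
  function step() {
    drawBatch(data.slice(i, i + batchSize)); // render just this chunk
    i += batchSize;
    if (i < data.length) schedule(step);     // yield before the next chunk
  }
  step();
}
```

In a d3 context, `drawBatch` would typically do a data join on just that slice and append its elements; as the question suspects, though, this only spreads the cost over time rather than reducing the total number of DOM nodes.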

Is there any way that I could create a paging system for the child nodes? Is there a viable solution here other than simply not displaying that many child nodes?

Thanks in advance.

Jake asked Nov 09 '22 at 20:11

1 Answer

I determined that the issue came from the animation and drawing done by d3, so I ended up creating pseudo-folders within the JSON to minimize the number of nodes displayed at once.

So, instead of trying to expand 26,154 nodes at once, I expand 104 folders, each containing roughly 250 nodes.
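A minimal sketch of that pseudo-folder idea, assuming a d3-style hierarchy where every node may have a `children` array (the function name and the folder labels are my own, not from the original data):

```javascript
// Recursively replace any oversized `children` array with pseudo-folder
// nodes of at most `folderSize` children each, so the tree layout only
// has to draw the folder nodes until one of them is expanded.
function groupIntoFolders(node, folderSize = 250) {
  if (!node.children) return node;
  node.children.forEach(child => groupIntoFolders(child, folderSize));
  if (node.children.length <= folderSize) return node;

  const folders = [];
  for (let i = 0; i < node.children.length; i += folderSize) {
    folders.push({
      // Label each folder with the range of children it holds.
      name: `${i + 1}–${Math.min(i + folderSize, node.children.length)}`,
      children: node.children.slice(i, i + folderSize)
    });
  }
  node.children = folders;
  return node;
}
```

Running this over the parsed JSON before handing it to the tree layout means the problematic node renders as ~100 folder nodes instead of tens of thousands of leaves.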

Hope this helps anyone encountering the same problem.

Jake answered Nov 14 '22 at 21:11