
D3JS scaling and transition performance

I have some code to scale and translate a map in D3, but the performance is quite poor: zooming and panning take nearly 3 seconds per refresh. I thought the map would look nicer with line boundaries for all the counties, but at 6 MB+ I suspect the boundary data is where the bottleneck is coming from. Is there another way I should be handling the transforms, or maybe a way to optimize the map data? Is D3 really not suited to this level of detail? I'm very new to D3.

I'm using shapefiles from here, converted from DBF to GeoJSON using QGIS: https://www.census.gov/cgi-bin/geo/shapefiles2010/main

<!doctype html>
<html>

<head>
   <title>d3 map</title>
   <script src="http://d3js.org/d3.v3.min.js">
   </script>
</head>

<body>
   <script>
            var width = 800;
            var height = 600;

            var projection = d3.geo.mercator();
            var path = d3.geo.path().projection(projection);

            var canvas = d3.select("body")
               .append("svg")
               .attr("width", width)
               .attr("height", height);

            var zoomVar = d3.behavior.zoom()
               .translate(projection.translate())
               .scale(projection.scale())
               .scaleExtent([height, 60 * height])
               .on("zoom", onPostZoom);

            var hotbox = canvas.append("g").call(zoomVar);

            hotbox.append("rect")
               .attr("class", "background")
               .attr("width", width)
               .attr("fill", "white")
               .attr("height", height);     

            d3.json("cali.geojson", function (data)
            {
               hotbox.append("g")
                  .attr("id", "geometry")
                  .selectAll("path")
                  .data(data.features)
                     .enter()
                        .append("path")
                        .attr("d", path)
                        .attr("fill", "steelblue")
                        .on("click", onClick);

            });
function onClick (d) 
{
   var centroid = path.centroid(d), translate = projection.translate();

   projection.translate(
   [translate[0] - centroid[0] + width / 2,
    translate[1] - centroid[1] + height / 2 ]);

   zoomVar.translate(projection.translate());

   hotbox.selectAll("path").transition()
      .duration(700)
      .attr("d", path);

}     


function onPostZoom() 
{
  projection.translate(d3.event.translate).scale(d3.event.scale);
  hotbox.selectAll("path").attr("d", path);
}

</script>
</body>
</html>
asked Jul 26 '13 by wufoo

2 Answers

As Lars said, you should definitely simplify your data to the appropriate resolution. Choose the maximum resolution based on how far you want to zoom in. I recommend topojson -s to simplify, as you’ll also get the benefits of the smaller TopoJSON format.
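As a concrete example, an invocation of the topojson v1 command-line tool along these lines would simplify and convert in one pass (the filenames and the `-s` area threshold below are illustrative placeholders, not from the answer):

```shell
# Simplify and convert GeoJSON to TopoJSON (topojson v1 CLI).
# -s sets the minimum retained area in steradians; tune it to the
# maximum zoom level you need.
topojson -o cali.topo.json -s 1e-7 -- cali.geojson
```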

The other big thing is to avoid reprojection if you’re just panning and zooming. Reprojection is a comparatively expensive trigonometric operation, and so is serializing very large path strings in SVG. You can avoid this for panning and zooming (translating and scaling) by simply setting the transform attribute on the path elements or a containing G element. See these examples:

  • Zoom to Bounding Box
  • Raster & Vector Zoom
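A minimal sketch of that approach, adapted to the code in the question (it assumes the county paths live in the `#geometry` group; `svgTransform` is a helper name introduced here, not part of D3):

```javascript
// Build the SVG transform string for the translate/scale that D3 v3's
// zoom behavior reports, instead of reprojecting every path.
function svgTransform(translate, scale) {
  return "translate(" + translate[0] + "," + translate[1] +
         ")scale(" + scale + ")";
}

// Hypothetical replacement for onPostZoom: move/scale the group that
// contains the paths, so the "d" attributes never need recomputing.
// function onPostZoom() {
//   d3.select("#geometry")
//     .attr("transform", svgTransform(d3.event.translate, d3.event.scale));
// }

console.log(svgTransform([10, 20], 2)); // "translate(10,20)scale(2)"
```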

You should also consider using projected TopoJSON (alternate example), which bakes the projection into the TopoJSON file. This makes the client even faster: it never has to project!
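If you go that route, the client-side change is small. A sketch, assuming the pre-projected file is named `cali.topo.json` and its object is named `counties` (both names are assumptions; topojson.js must be loaded alongside D3):

```javascript
// With pre-projected TopoJSON the coordinates are already in pixels,
// so the path generator takes no projection at all (d3 v3 API).
var path = d3.geo.path().projection(null);

// d3.json("cali.topo.json", function (error, topo) {
//   var counties = topojson.feature(topo, topo.objects.counties).features;
//   // bind `counties` to paths exactly as before
// });
```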

answered Oct 22 '22 by mbostock

The problem you're experiencing isn't really because of D3, but because of the browser. The main bottleneck is rendering all the visual elements, not computing their positions.

The only way to avoid this is to have less data. One way to start would be to simplify the boundaries in QGIS, using e.g. the dpsimplify plugin.

answered Oct 22 '22 by Lars Kotthoff