UPDATE: I have created a JSFiddle here. Please post an updated fiddle with your answer.
I have dynamic filters that the user can apply to the data, but they change the opacity of the nodes to indicate what is filtered in and out (the filtered-out elements remain partially visible, and the actual d3 filter() function is intentionally not used). I also set a property on each node that is filtered out, e.g. node = {"name": "test", "isFilteredOut": true};. So for the purposes of this question, even though I am using the word "filter", it is really just a conditional style change (and I will try to put the word "filter" in quotes in this post as a reminder of this).
This all works fine, but now I want to recursively "filter" out all children nodes and edges of "filtered-out" nodes, and also the edge connecting the initial "filtered-out" node to its non-filtered-out parent node.
All of the examples I can find begin with a click event and thus have the luxury of using this to get the data for the initially selected node. I do not have this luxury, because the filter is applied from a UI element that is not within the graph itself.
I currently "filter" the nodes like so:
node.style("opacity", function (n) {
    if (my_filter_conditions) {
        n.isFilteredOut = false; // reset, so re-applying the filter clears stale flags
        return 1;
    } else {
        n.isFilteredOut = true;
        return 0.1;
    }
});
What I essentially need to do is:

1. Recursively select all children nodes of currently "filtered-out" nodes and "filter" those out also (i.e. change their opacity to 0.1 and set n.isFilteredOut = true;).
2. Change the opacity of all edges to 0.1 where the source node or target node is "filtered out" (i.e. n.isFilteredOut = true; on either end of the edge).
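In data terms, step 1 above could be sketched like this (assuming each node datum keeps a children array — the field name is an assumption, adapt it to however your hierarchy is actually stored):

```javascript
// Recursively mark every descendant of a filtered-out node, so a later
// restyling pass can dim them. `isFilteredOut` follows the question's
// convention; `children` is a hypothetical field.
function propagateFilter(nodes) {
    function markDescendants(node) {
        (node.children || []).forEach(function (child) {
            child.isFilteredOut = true; // inherit the parent's filtered state
            markDescendants(child);     // and recurse into grandchildren
        });
    }
    nodes.forEach(function (n) {
        if (n.isFilteredOut) markDescendants(n);
    });
}
```

You would run this after the node.style() callback has set the initial flags, then apply opacity in a second pass so descendants get dimmed too.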
I don't know how to access the data of the source and target nodes given only the index of each from the edges (remember, I have no this node to start from, as I would in a click event). I tried passing the node index obtained from the edge to get the node data:

var node_data = d3.select(current_edge.source.index).datum();

However, this resulted in errors from the d3 library about this.node() being null (so passing the index here did not work).
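As far as I know, no DOM selection is needed here at all: once a d3 force layout starts, each link's source and target are the node datum objects themselves, not indices (before initialization they may still be raw indices into the nodes array, hence the check below):

```javascript
// Resolve a link endpoint to its node datum, whether the force layout
// has already replaced indices with object references or not.
function endpointDatum(endpoint, nodesArray) {
    return (typeof endpoint === "number") ? nodesArray[endpoint] : endpoint;
}

// Hypothetical usage with an edge object:
// var sourceData = endpointDatum(current_edge.source, nodes);
// var dimmed = sourceData.isFilteredOut === true;
```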
I also tried handling edges by nesting the link-handling function inside the function passed to node.style(), but then it tries to deal with all edges on every node, and I can't get it to produce the desired result.
link.style("opacity", function (e) {
    return (n.isFilteredOut
        && (n.index == e.source.index | n.index == e.target.index)) ? 0.1 : 1;
});
This was my attempt to "filter out" edges on both sides of "filtered-out" nodes, but none of the edges were ever filtered out when I used this for some reason (it appeared as if nothing happened at all).
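One way to fix that attempt is to let the edge callback check its own endpoints directly, so no node variable n from an enclosing callback is needed (which is what made the nested version misbehave), and to use logical || rather than bitwise |:

```javascript
// An edge is dimmed if either of its endpoints is filtered out.
// `isFilteredOut` is the flag the question already sets on each node.
function edgeOpacity(e) {
    return (e.source.isFilteredOut || e.target.isFilteredOut) ? 0.1 : 1;
}

// Applied as a plain style callback, independent of any node selection:
// link.style("opacity", edgeOpacity);
```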
Notes on the fiddle:
- Filtered-out nodes are marked with node.isFilteredOut = true;.
- Filtering the dataSet itself will not work, because much of my data is dynamically populated from various JSON sources. Feel free to work with nodes, edges, links, node, and/or link.
- I know the eval() statements aren't great. But this is not a question about how to best apply infinite conjunctive filters, but about recursively changing the opacity of nodes and edges based on the filters applied.

Here is a possible approach, which implements both the recursive filtering (if a device is filtered, its parts are filtered) and the filtering of links based on the filtering of their nodes: http://jsfiddle.net/Lsr9c8nL/4/
I have changed the way you implement the filter. Building the filter from strings and then calling eval() is considered pretty bad practice these days, because tools cannot do much with eval(), for instance to detect errors or to optimize the JS code in the browser.
I do the filtering directly on the dataSet, not on the nodes (where you had to query the node's type and compare strings, which is slow). Doing so directly on the dataSet also makes it easy to find the device for a given part.
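That device-to-part propagation could look roughly like this (a minimal sketch; the devices/parts field names and the part.device back-reference are assumptions about the dataSet's shape, not the fiddle's exact code):

```javascript
// Mark devices according to a visibility predicate, then let each part
// inherit the filtered state of the device it belongs to.
function applyFilter(dataSet, deviceIsVisible) {
    dataSet.devices.forEach(function (device) {
        device.isFilteredOut = !deviceIsVisible(device);
    });
    dataSet.parts.forEach(function (part) {
        // A part is filtered out exactly when its device is.
        part.isFilteredOut = part.device.isFilteredOut;
    });
}
```

After this pass, both the node and link opacity callbacks only need to read isFilteredOut, with no string comparisons.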
The trick is basically to redraw the whole diagram every time, and to carefully use d3's exit, enter and update selections. This will also allow you to add animations if you want.