What I mean is this: to know powers, I need to know multiplication, and to know multiplication, I need to know addition. So to know A, I need to know B; in other words, A depends on B. I can only think of a couple of rules: if A depends on B, then B cannot depend on A; and if A depends on B and B depends on C, then C cannot depend on A.
Does this kind of data structure have a name? I don't think it is a hierarchical tree. Also, am I missing any other rule? I would like to implement a map of human knowledge in such a way that if I ask my database what I need to know to learn quantum physics, it gives me an ordered list of subjects on which quantum physics depends. Of course, this list could have some sublists that run in parallel, in the sense that A could depend on B and C without B depending on C or C depending on B. In this case B would be parallel to C, so graphically they could be displayed below A, both at the same height.
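To make the "ordered list of prerequisites" concrete, here is a minimal sketch using Python's standard-library `graphlib` (3.9+). The subject names and the dependency map are made up for illustration; they just encode the powers → multiplication → addition chain from above.

```python
from graphlib import TopologicalSorter

# Hypothetical knowledge map: each subject maps to the set of
# subjects it directly depends on.
deps = {
    "powers": {"multiplication"},
    "multiplication": {"addition"},
    "addition": set(),
}

# static_order() yields every subject after all of its dependencies,
# i.e. the ordered list of things to learn first.
order = list(TopologicalSorter(deps).static_order())
print(order)  # ['addition', 'multiplication', 'powers']
```

Asking "what do I need to learn before powers?" then amounts to reading the list up to that subject.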
I'm pretty sure there are many other cases in which the same kind of structure is used.
Edit: How about a partially ordered set? Sorry, not trying to be picky, but it sounds to me like it formalizes the same thing without any unnecessary references to graphs.
Such dependency constraints are usually represented by a directed acyclic graph, or DAG for short.
A DAG is a graph which is
directed
...since each edge represents a dependency, and a dependency has a direction. If "A depends on B", you have an edge A → B.
acyclic
...since (as you point out in your post) it is undesirable to have cyclic dependencies.
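Both properties can be exercised with a short sketch (again Python's stdlib `graphlib`; the node names are illustrative). The first part shows the "parallel" subjects from the question sitting in the same batch, i.e. at the same height; the second shows that a cyclic dependency is rejected rather than silently ordered.

```python
from graphlib import TopologicalSorter, CycleError

# "A depends on B and C; B and C are independent" -- so B and C can be
# drawn below A at the same height.
deps = {"A": {"B", "C"}, "B": set(), "C": set()}

ts = TopologicalSorter(deps)
ts.prepare()
while ts.is_active():
    batch = sorted(ts.get_ready())  # everything with no unmet dependencies
    print(batch)                    # ['B', 'C'] on the first pass, then ['A']
    ts.done(*batch)

# A cycle ("X needs Y, Y needs X") violates acyclicity and is detected:
try:
    list(TopologicalSorter({"X": {"Y"}, "Y": {"X"}}).static_order())
except CycleError:
    print("cyclic dependencies rejected")
```

The batch loop is exactly the "sublists that run in parallel" from the question: each batch is a set of subjects that can be learned in any order relative to each other.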