The transitive closure of a graph is defined, e.g., here: http://mathworld.wolfram.com/TransitiveClosure.html
It is easily possible in O(n^3), where n is the number of vertices. I was wondering if it can be done in time O(n^2).
Explanation: The transitive closure of a graph can be computed with the Floyd-Warshall algorithm by substituting the logical operations OR and AND for the arithmetic operations min and + . The recurrence then becomes: t[i][j]^(k) = t[i][j]^(k-1) OR (t[i][k]^(k-1) AND t[k][j]^(k-1)).
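For concreteness, here is a minimal Python sketch of that recurrence. The function name and the assumption that the graph is given as an n × n 0/1 adjacency matrix `adj` are mine, not from the answer:

```python
def transitive_closure(adj):
    """Boolean Floyd-Warshall: adj is an n x n 0/1 adjacency matrix."""
    n = len(adj)
    t = [row[:] for row in adj]          # t^(0) is the adjacency matrix itself
    for k in range(n):                   # allow vertex k as an intermediate vertex
        for i in range(n):
            for j in range(n):
                # t[i][j]^(k) = t[i][j]^(k-1) OR (t[i][k]^(k-1) AND t[k][j]^(k-1))
                t[i][j] = t[i][j] or (t[i][k] and t[k][j])
    return t
```

Three nested loops over n vertices give the O(n^3) running time discussed below; depending on the convention used, you may also want to set t[i][i] = 1 for every vertex.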
The time complexity of computing the transitive closure of a binary relation on a set of n elements is O(n^3).
Warshall's algorithm determines the transitive closure of a directed graph, i.e. reachability between all pairs of vertices, using its adjacency matrix. To do this, it generates a sequence of n matrices, where n is the number of vertices. A path in a simple graph is defined by a sequence of vertices.
Warshall's algorithm finds the matrix CLOSURE of the transitive closure of a relation R from its n × n matrix representation MAT, and it is more efficient than the repeated matrix multiplication method. If R and S are equivalence relations on a set A, then the smallest equivalence relation containing both R and S is (R ∪ S)∞.
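As a small illustration of that last point (the example relations here are my own, not from the text): the union of two equivalence relations need not be transitive, which is exactly why the transitive closure (R ∪ S)∞ is needed.

```python
A = {1, 2, 3}
R = {(1, 1), (2, 2), (3, 3), (1, 2), (2, 1)}   # partitions A as {1,2} | {3}
S = {(1, 1), (2, 2), (3, 3), (2, 3), (3, 2)}   # partitions A as {1} | {2,3}

def transitive_closure_rel(rel):
    """Repeatedly compose the relation with itself until nothing new appears."""
    closure = set(rel)
    while True:
        new = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
        if new <= closure:
            return closure
        closure |= new

union = R | S
print((1, 3) in union)                          # False: R ∪ S is not transitive
print((1, 3) in transitive_closure_rel(union))  # True: (R ∪ S)∞ relates 1 and 3
```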
Nope. I don't think there is an O(n^2) algorithm for it. I would expect that if such an algorithm existed, you could also solve the all-pairs shortest paths problem in O(n^2), which is not the case. The asymptotically fastest approach I can think of is running Dijkstra's shortest path algorithm from every vertex with a Fibonacci heap (O(n^2 log n) on graphs that are not very dense).
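For what it's worth, here is a rough sketch of that all-sources Dijkstra approach. It uses Python's standard-library heapq (a binary heap) as a stand-in for a Fibonacci heap, so the per-source cost is O(m log n) rather than O(m + n log n); the graph representation and function name are assumptions of mine.

```python
import heapq

def all_pairs_dijkstra(graph):
    """graph maps each vertex to a list of (neighbor, weight) pairs.

    Returns dist[source][v] = shortest distance from source to v;
    a vertex v is reachable from source iff v appears in dist[source].
    """
    dist = {}
    for source in graph:
        d = {source: 0}
        heap = [(0, source)]
        while heap:
            du, u = heapq.heappop(heap)
            if du > d.get(u, float("inf")):
                continue                      # stale heap entry, skip it
            for v, w in graph.get(u, ()):
                if du + w < d.get(v, float("inf")):
                    d[v] = du + w
                    heapq.heappush(heap, (du + w, v))
        dist[source] = d
    return dist
```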