I realize that the runtime of BFS and DFS on a generic graph is O(n+m), where n is the number of nodes and m is the number of edges, because each node's adjacency list must be scanned. However, what is the runtime of BFS and DFS when they are executed on a binary tree? I believe it should be O(n), because the number of edges leaving any node is bounded by a constant (at most 2). Is this understanding correct? If not, what is the correct time complexity of BFS and DFS on a binary tree?
The time complexity of BFS is O(V+E), where V is the number of vertices and E is the number of edges. The time complexity of DFS is also O(V+E).
Both DFS and BFS run in O(V + E) time and use O(V) extra space. The two algorithms follow different strategies but are equally important for traversing graphs.
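To make the O(V + E) bound concrete, here is a minimal BFS sketch over an adjacency list (Python; the `adj` dictionary is a made-up example graph): the outer loop dequeues each reachable vertex once, and the inner loop touches each adjacency entry once, so the total work is proportional to V + E.

```python
from collections import deque

def bfs(adj, start):
    """BFS over an adjacency list: O(V + E) time, O(V) extra space."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        u = queue.popleft()        # each vertex is dequeued at most once
        order.append(u)
        for v in adj[u]:           # each adjacency entry is scanned once,
            if v not in visited:   # so edge work totals O(E)
                visited.add(v)
                queue.append(v)
    return order

# Hypothetical undirected graph with 5 vertices and 5 edges
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs(adj, 0))  # [0, 1, 2, 3, 4]
```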
For a connected graph such as a tree, the O(|V| + |E|) bound for BFS and DFS reduces to O(|E|), or in your notation O(m), because |E| >= |V| - 1. In a binary tree, m equals n-1, so the time complexity is equivalent to O(|V|), i.e. O(n). Note that m refers to the total number of edges, not the average number of edges adjacent to each vertex.
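As a sketch of why the binary-tree case is O(n): in both traversals below (Python, using a hypothetical `Node` class for illustration), each node is enqueued/visited exactly once, and a tree with n nodes has exactly n-1 edges.

```python
from collections import deque

class Node:
    """Minimal binary tree node (illustrative helper, not from the question)."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def bfs(root):
    """Level-order traversal: each node is enqueued and dequeued once -> O(n)."""
    order = []
    queue = deque([root] if root else [])
    while queue:
        node = queue.popleft()
        order.append(node.value)
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return order

def dfs(root):
    """Preorder traversal: each node is visited exactly once -> O(n)."""
    if root is None:
        return []
    return [root.value] + dfs(root.left) + dfs(root.right)

# A tree with n = 5 nodes and n - 1 = 4 edges
root = Node(1, Node(2, Node(4), Node(5)), Node(3))
print(bfs(root))  # [1, 2, 3, 4, 5]
print(dfs(root))  # [1, 2, 4, 5, 3]
```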
Breadth-first search has a running time of O(V + E), since every vertex and every edge will be checked once.