I have a specific instance of this problem, but it would be nice to know whether the question can be answered for any built-in function.
I want to find the position of a substring in a string. In Python there is a find method which does exactly what is needed:
string.find(s, sub[, start[, end]])
Return the lowest index in s where the substring sub is found such that sub is wholly contained in s[start:end]. Return -1 on failure. Defaults for start and end and interpretation of negative values are the same as for slices.
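The same behavior is available as the str.find method; a quick demonstration of the documented semantics:

```python
# str.find returns the lowest matching index, honors an optional start
# index, and returns -1 (rather than raising) when there is no match.
text = "the quick brown fox"
print(text.find("quick"))     # 4: lowest index of the match
print(text.find("quick", 5))  # -1: search restricted to s[5:]
print(text.find("wolf"))      # -1: substring absent
```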
Amazing, but the problem is that searching for a long substring in a long string can run anywhere from O(n*m) to O(n) time (which is a huge difference) depending on the algorithm. The documentation gives no information about the time complexity, nor about the underlying algorithm.
I see two approaches to resolving this:

- run timing experiments on inputs of various sizes, or
- go to source code and try to understand it.

Neither of these sounds really easy (I hope that there is an easier way). So how can I find the complexity of a built-in function?
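For the first option, one rough empirical sketch (the inputs and sizes here are illustrative assumptions, and timings only suggest a bound rather than prove one):

```python
import timeit

# Time str.find on inputs of growing size and observe how the runtime
# scales.  The needle "ab" never occurs in a string of all "a"s, so the
# whole haystack must be scanned on every call.
for n in (100_000, 1_000_000, 10_000_000):
    haystack = "a" * n
    needle = "ab"  # never matches: forces a full scan
    t = timeit.timeit(lambda: haystack.find(needle), number=10)
    print(f"n={n:>10,}: {t:.4f}s")
```

If the printed times grow roughly linearly with n, that is consistent with (but does not prove) O(n) behavior for this family of inputs.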
Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, assuming that each elementary operation takes a fixed amount of time. When analyzing the time complexity of an algorithm we distinguish three cases: best case, average case and worst case.
The worst-case complexity is O(M*N), where M and N are the lengths of the substring and the string.
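A naive scan makes that O(M*N) bound concrete. The helper below is purely illustrative and is not how CPython implements str.find:

```python
def naive_find(s, sub):
    """Naive substring search: O(len(s) * len(sub)) in the worst case."""
    n, m = len(s), len(sub)
    for i in range(n - m + 1):
        # Up to m character comparisons at every one of up to n positions.
        if s[i:i + m] == sub:
            return i
    return -1

# Worst case: many near-matches that each fail on the final character.
print(naive_find("a" * 20 + "b", "a" * 5 + "b"))  # 15
```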
Creating a set: in Python, sets are created with the set() function. Note that an empty set cannot be created with {}; that creates a dictionary. Checking whether an item is in a set: the time complexity of this operation is O(1) on average.
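A short check of both points:

```python
# {} creates a dict; set() creates an empty set.  Membership tests on a
# set are O(1) on average thanks to hashing, versus O(n) for a list.
empty = {}
items = set()
items.add("x")

print(type(empty).__name__)  # dict
print("x" in items)          # True
```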
The runtime complexity of the len() function on a Python list is O(1): it takes constant time no matter how many elements the list contains.
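This is because len() reads a stored size field (ob_size in CPython) rather than walking the container, so the answer arrives in constant time regardless of size:

```python
# len() returns the stored length for lists, strings, dicts, and sets
# alike; nothing is counted element by element.
small = list(range(10))
large = list(range(1_000_000))
print(len(small))  # 10
print(len(large))  # 1000000
```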
You say, "go to source code and try to understand it," but it might be easier than you think. Once you get to the actual implementation code, in Objects/stringlib/fastsearch.h, you find:
/* fast search/count implementation, based on a mix between boyer-
moore and horspool, with a few more bells and whistles on the top.
for some more background, see: http://effbot.org/zone/stringlib.htm */
The URL referenced there has a good discussion of the algorithm and its complexity.
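For intuition, here is a minimal Boyer-Moore-Horspool sketch in Python. This is a simplification for illustration only; CPython's fastsearch layers additional heuristics on top, as the comment above says:

```python
def horspool_find(s, sub):
    """Boyer-Moore-Horspool search: sublinear on typical inputs,
    O(n*m) in the worst case."""
    n, m = len(s), len(sub)
    if m == 0:
        return 0
    # Shift table over the first m-1 pattern characters: how far we may
    # slide the pattern when the text character aligned with its last
    # position is c.  Characters absent from the pattern allow a full
    # shift of m.
    shift = {c: m - i - 1 for i, c in enumerate(sub[:-1])}
    i = 0
    while i <= n - m:
        if s[i:i + m] == sub:
            return i
        i += shift.get(s[i + m - 1], m)
    return -1
```

The large skips driven by the shift table are what make the average case fast; the linked article discusses how CPython tunes this further.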