Estimating time complexity empirically in Python is straightforward: measure how long an algorithm takes to run as the input size grows. We can do something like:
```python
import time

start = time.time()
run_algorithm(input_n)  # placeholder: run the algorithm on an input of size n
end = time.time()
time_n = end - start
```
By graphing `time_n` vs `input_n`, we can observe whether the time complexity is constant, linear, exponential, etc.
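For instance, here is a minimal sketch of that measurement loop. The algorithm (`my_algorithm`) and the reversed-list inputs are hypothetical stand-ins, and the plotting assumes matplotlib is installed:

```python
import time
import matplotlib.pyplot as plt

def my_algorithm(data):
    # Hypothetical stand-in: any algorithm whose running time we want to measure.
    return sorted(data)

sizes = [1_000, 10_000, 100_000, 1_000_000]
times = []
for n in sizes:
    input_n = list(range(n, 0, -1))  # an input of size n
    start = time.perf_counter()      # perf_counter is better suited to timing than time.time
    my_algorithm(input_n)
    times.append(time.perf_counter() - start)

plt.plot(sizes, times, marker="o")
plt.xlabel("input size n")
plt.ylabel("running time (s)")
plt.show()
```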
Is there a similarly empirical, programmatic way of calculating the space complexity of an algorithm in Python, where we can measure the amount of space used as the input size grows?
You can use memory_profiler. With a decorator, it reports line-by-line memory usage of a function:

```python
from memory_profiler import profile

@profile(precision=4)
def func():
    # your function body here
    ...
```
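To address the "space used as the input size grows" part of the question directly, memory_profiler also provides a `memory_usage` function that samples a call's memory footprint while it runs. A minimal sketch, where `my_algorithm` is a hypothetical stand-in that allocates O(n) memory:

```python
from memory_profiler import memory_usage

def my_algorithm(n):
    # Hypothetical stand-in: builds an O(n) list so memory grows with n.
    return [i * i for i in range(n)]

for n in [10_000, 100_000, 1_000_000]:
    # memory_usage runs the call and returns a list of memory samples in MiB;
    # the max of those samples estimates the peak memory used during the call.
    samples = memory_usage((my_algorithm, (n,)), interval=0.01)
    print(f"n={n:>9}: peak {max(samples):.2f} MiB")
```

Note that the samples include the interpreter's baseline memory, so the growth between sizes is more informative than the absolute values.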
memory_profiler also ships with a command-line tool called `mprof`, which is useful if you want to see whether your memory is getting cleaned up and released periodically. Just run `mprof run script script_args` in your shell of choice. `mprof` records your script's memory usage over time, and you can then view a graph of it by running `mprof plot` (this requires matplotlib, though).
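For example, a typical session could look like this (the script name is illustrative):

```
$ mprof run my_script.py   # records samples to an mprofile_<timestamp>.dat file
$ mprof plot               # plots the most recent .dat file (needs matplotlib)
```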
Update: thanks to @hunzter, you can find the documentation here.