Can someone please explain to me in a simple way why constants don't matter when it comes to big O notation? Why does the complexity stay the same when you add a constant? This is not a homework question; I just want to understand this better. Let me get this straight: big O is for describing the behavior of a function as the input approaches infinity, right?
I got it. Thanks so much, everyone.
The constant does not matter for complexity theory, which is interested only in how the function scales as the input size grows. A constant does not affect at all how the function behaves as the input size approaches infinity.
However, if you are interested in actually running a certain piece of code, you may very well be interested in a large constant overhead and how the function performs for smaller input sizes.
That is the difference between complexity theory and practice.
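To make this concrete, here is a minimal sketch in Python (the function names and the specific constants are hypothetical, not from the original post). It compares a plain linear function against the same function with a large constant factor and a large constant overhead; asymptotically both are O(n), and the ratio between them settles near the constant factor instead of growing with n.

```python
def linear(n):
    # n "steps" of work
    return n

def linear_with_constants(n):
    # 5x the work per element plus a fixed overhead of 1000 steps
    return 5 * n + 1000

for n in [10, 100, 1_000, 10_000, 1_000_000]:
    ratio = linear_with_constants(n) / linear(n)
    print(f"n={n:>9}  steps ratio = {ratio:.2f}")

# The printed ratio approaches 5 as n grows: the fixed overhead of 1000
# becomes irrelevant, and big O only records that both grow linearly, O(n).
```

For small n (say n = 10), the version with constants does over 100 times more work, which is exactly the practical concern mentioned above; but that gap vanishes as n grows, which is all big O cares about.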