As a self-taught computer programmer, I'm often at a loss to estimate the O() value for a particular operation. Yeah, I know off the top of my head most of the important ones, like for the major sorts and searches, but I don't know how to calculate one when something new comes along, unless it's blindingly obvious. Is there a good web site or text that explains how to do that? Heck, I don't even know what computer scientists call it, so I can't google it.
It's called Big O Notation, and it's used in Computational Complexity Theory.
The Wikipedia articles are a pretty good starting point, as is the bibliography at the bottom of the page.
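One practical way to get a feel for the growth rate before you can do the formal analysis is to count the basic operations an algorithm performs for increasing input sizes and watch how the count scales. A minimal Python sketch (the function names are my own, purely for illustration):

```python
def count_ops_nested(n):
    """Count iterations of a doubly nested loop over n items -- grows as n^2."""
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1
    return ops

def count_ops_halving(n):
    """Count how many times n can be halved down to 1 -- grows as log2(n),
    the same shape as binary search."""
    ops = 0
    while n > 1:
        n //= 2
        ops += 1
    return ops

if __name__ == "__main__":
    for n in (10, 100, 1000):
        print(n, count_ops_nested(n), count_ops_halving(n))
```

If you run this, the nested-loop count multiplies by 100 each time n multiplies by 10, while the halving count only creeps up by a few, which is exactly the O(n²) vs. O(log n) distinction the formal notation captures.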
Introduction to Algorithms is the standard text used at most universities. I've used it and can recommend its chapters on order analysis. I'd start with the articles in Tim Howland's answer, though.