To check for odd and even integer, is the lowest bit checking more efficient than using the modulo?
>>> def isodd(num): return num & 1 and True or False
>>> isodd(10)
False
>>> isodd(9)
True
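The `and True or False` idiom above predates `bool()` and is harder to read than it needs to be. A minimal sketch of both approaches (the function names `isodd_bitwise` and `isodd_modulo` are mine, for illustration) — note that in Python the two agree even for negative integers:

```python
def isodd_bitwise(num):
    # The lowest bit of an integer is 1 exactly when the number is odd.
    return bool(num & 1)

def isodd_modulo(num):
    # Comparing against 0 (not 1) keeps the test correct for negatives.
    return num % 2 != 0

# Both agree on positive and negative integers.
for n in range(-5, 6):
    assert isodd_bitwise(n) == isodd_modulo(n)

print(isodd_bitwise(10))  # False
print(isodd_bitwise(9))   # True
```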
Yep. The `timeit` module in the standard library is how you check on those things. E.g.:
$ python -m timeit -s 'def isodd(x): x & 1' 'isodd(9)'
1000000 loops, best of 3: 0.446 usec per loop
$ python -m timeit -s 'def isodd(x): x & 1' 'isodd(10)'
1000000 loops, best of 3: 0.443 usec per loop
$ python -m timeit -s 'def isodd(x): x % 2' 'isodd(9)'
1000000 loops, best of 3: 0.461 usec per loop
$ python -m timeit -s 'def isodd(x): x % 2' 'isodd(10)'
1000000 loops, best of 3: 0.453 usec per loop
As you see, on my (first-day==old==slow;-) Macbook Air, the `&` solution is repeatably between 7 and 18 nanoseconds faster than the `%` solution.
`timeit` not only tells you what's faster, but by how much (just run the tests a few times), which usually shows how supremely UNimportant it is (do you really care about 10 nanoseconds' difference, when the overhead of calling the function is around 400?!-)...
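The same comparison can be run from inside a script with `timeit.timeit`, which returns the total seconds for the requested number of calls. Absolute numbers will vary by machine; the point, as above, is that the call overhead dwarfs the difference between the two operators:

```python
import timeit

# Each setup string defines the function under test in the timing namespace.
setup_and = 'def isodd(x): return x & 1'
setup_mod = 'def isodd(x): return x % 2'

N = 100_000  # number of calls to time

t_and = timeit.timeit('isodd(9)', setup=setup_and, number=N)
t_mod = timeit.timeit('isodd(9)', setup=setup_mod, number=N)

print(f"& 1 : {t_and:.4f} s for {N} calls")
print(f"% 2 : {t_mod:.4f} s for {N} calls")
```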
Convincing programmers that micro-optimizations are essentially irrelevant has proven to be an impossible task -- even though it's been 35 years (over which computers have gotten orders of magnitude faster!) since Knuth wrote
We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.
which as he explained is a quote from an even older statement from Hoare. I guess everybody's totally convinced that THEIR case falls in the remaining 3%!
So instead of endlessly repeating "it doesn't matter", we (Tim Peters in particular deserves the honors there) put the `timeit` module in the standard Python library, which makes it trivially easy to measure such micro-benchmarks and thereby lets at least some programmers convince themselves that, hmmm, this case DOES fall in the 97% group!-)
To be totally honest, I don't think it matters.
The first issue is readability. What makes more sense to other developers? I, personally, would expect a modulo when checking the evenness/oddness of a number. I would expect that most other developers would expect the same thing. By introducing a different, and unexpected, method, you might make code reading, and therefore maintenance, more difficult.
The second is just a fact that you probably won't ever have a bottleneck when doing either operation. I'm for optimization, but early optimization is the worst thing you can do in any language or environment. If, for some reason, determining if a number is even or odd is a bottleneck, then find the fastest way of solving the problem. However, this brings me back to my first point - the first time you write a routine, it should be written in the most readable way possible.
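Following that advice, the "most readable way possible" here is arguably just the modulo spelled out with explicit names and comparisons — a sketch (the names `is_even`/`is_odd` are mine):

```python
def is_even(n):
    """Return True if n is an even integer."""
    return n % 2 == 0

def is_odd(n):
    """Return True if n is an odd integer."""
    return n % 2 != 0

print(is_odd(9))   # True
print(is_even(10)) # True
```

Comparing against 0 rather than 1 is a deliberate choice: in Python `-3 % 2` happens to be 1, but in languages where `%` can return a negative remainder, `n % 2 == 1` silently fails for negative odd numbers, while `n % 2 != 0` stays correct.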