I am trying to compute the sum of the series 1 − 1/2 + 1/3 − 1/4 + ··· + 1/99 − 1/100**2 with Python.
My code is:
psum = 0
nsum = 0
for k in range(1, 100):
    if k % 2 == 0:
        nsum += 1.0 / k
    else:
        psum += 1.0 / k
print(psum - nsum - 1.0 / 100 ** 2)
The output is 0.6980721793101952
I don't have the answer and just want to verify if I am doing it right.
This is not a homework question but just random Python practice.
That works fine, but why not just use one "summing" variable (call it total, as a matter of good practice, since there is a built-in called sum which you don't really want to hide), and actually add or subtract from it at each step?
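A minimal sketch of that single-variable version, keeping the question's 1.0/100**2 final term:

```python
# One running total; add or subtract at each step based on parity.
total = 0.0
for k in range(1, 100):
    if k % 2 == 0:
        total -= 1.0 / k   # even k: subtract the term
    else:
        total += 1.0 / k   # odd k: add the term
total -= 1.0 / 100 ** 2    # the series' final term, as in the question
print(total)
```

This keeps the same arithmetic as the psum/nsum version but reads as a single accumulation.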
Alternately (pun intended!), actually use that sum function. The range function can be used to skip every other number, too.
>>> sum(1.0/k for k in range(1, 100, 2)) - sum(1.0/k for k in range(2, 100, 2)) - (1.0/100**2)
0.6980721793101952
Or, as steveha shows, you can use logic to sort out whether to add or subtract the number based on whether it's divisible by 2, and handle it with a "weighted" sum (adding 1.0/k or -1.0/k as appropriate). This is why you should learn more math as a programmer :)
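One way to sketch that weighted-sum idea (the sign is chosen from the parity of k, and the 1.0/100**2 tail is kept to match the question):

```python
# Weight each term by +1 or -1 depending on whether k is odd or even.
total = sum((1.0 / k if k % 2 else -1.0 / k) for k in range(1, 100))
total -= 1.0 / 100 ** 2  # final term from the question
print(total)
```

A conditional expression inside a single generator replaces the two separate sums.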
Well, I believe
import math
print(math.log(2))
would do the trick: the infinite series 1 − 1/2 + 1/3 − ··· converges to ln 2, which is the limit your partial sum approximates.
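To see the convergence toward that limit, a quick check (the term count n is arbitrary here, chosen just for illustration):

```python
import math

# Partial sum of the first n terms of 1 - 1/2 + 1/3 - 1/4 + ...
n = 10 ** 6
partial = sum((-1.0) ** (k + 1) / k for k in range(1, n + 1))
print(partial)       # close to the limit
print(math.log(2))   # ln 2, the limit of the infinite series
```

For an alternating series like this, the error after n terms is smaller than the first omitted term, 1/(n + 1).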