I've been trying to get this working correctly all day; it's nearly complete, but I'm hitting one strange issue. Every result found by the search query gets logged, as expected, but the first result gets logged once, the second twice, the third three times, and so on.
Any ideas how to get rid of the duplicates? Here's my script:
#!/usr/bin/python
import urllib
import simplejson
import logging
from logging.handlers import SysLogHandler
query = urllib.urlencode({'q' : 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' \
      % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']
for i in results:
    logger = logging.getLogger()
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
    logger.addHandler(logging.FileHandler("hits.log"))
    logging.warn(i['url'])
    print i['url']
That's because you're adding new handlers on every iteration of the for loop, so by the Nth result the root logger has N copies of each handler and emits each message N times. Create the logger and attach the handlers once, outside the loop, and keep only the logging.warn call inside it.
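For example, a minimal restructuring of your script (Python 2, everything else kept as-is):

#!/usr/bin/python
import urllib
import simplejson
import logging
from logging.handlers import SysLogHandler

# Set up the root logger once; each handler is attached exactly one time.
logger = logging.getLogger()
logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
logger.addHandler(logging.FileHandler("hits.log"))

query = urllib.urlencode({'q': 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' \
      % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']

for i in results:
    # Only the logging call runs per result now.
    logging.warn(i['url'])
    print i['url']

If this code can ever run more than once in the same process (e.g. in an interactive session or a re-imported module), a guard like if not logger.handlers: around the addHandler calls keeps the duplicates from creeping back in.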