Trouble with duplicate lines of logs in log file

Tags: python

I've been trying to get this working correctly all day, and it's nearly complete, but I'm getting a strange issue. Every result found in the search query gets logged as expected, but the first result gets logged once, the second twice, the third three times, and so on.

Any ideas how to get rid of the duplicates? Here's the code:

#!/usr/bin/python
import urllib
import simplejson 
import logging
from logging.handlers import SysLogHandler

query = urllib.urlencode({'q' : 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' \
      % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']
for i in results:
    logger = logging.getLogger()
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
    logger.addHandler(logging.FileHandler("hits.log"))
    logging.warn(i['url'])
    print i['url']
asked Dec 22 '22 by H20

1 Answer

Because you're adding a new pair of handlers on every pass through the for loop: by the Nth iteration the root logger has N copies of each handler, so the Nth message is written N times. Create the logger and add its handlers once, outside the loop, and only make the actual logging call inside the loop.
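A minimal sketch of that restructuring, reusing the handlers, syslog address, and file name from the question (logger.warning is used in place of the deprecated logging.warn; results is assumed to hold the parsed search results as in the original script):

import logging
from logging.handlers import SysLogHandler

# Configure the root logger once, before the loop, so each
# handler is attached exactly one time.
logger = logging.getLogger()
logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
logger.addHandler(logging.FileHandler("hits.log"))

for i in results:
    # Only the logging call itself belongs inside the loop.
    logger.warning(i['url'])
    print i['url']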

answered Jan 06 '23 by Daniel Roseman