 

feedparser with timeout


This worked before. The url is even not possible to open in the browser. How would you cure this case? Is there a timeout possibility? I'd like to continue as if nothing would happen (only with printing some message or log this issue)

asked Mar 19 '12 by xralf


2 Answers

Use the Python requests library for the network I/O and feedparser for the parsing only:

import logging
from io import BytesIO

import feedparser
import requests

logger = logging.getLogger(__name__)

# Do the request with the requests library and a timeout;
# requests.Timeout covers both connect and read timeouts
try:
    resp = requests.get(rss_feed, timeout=20.0)
except requests.Timeout:
    logger.warning("Timeout when reading RSS %s", rss_feed)
    return

# Wrap the response bytes in an in-memory stream for Universal Feed Parser
content = BytesIO(resp.content)

# Parse the content
feed = feedparser.parse(content)
answered Sep 18 '22 by Mikko Ohtamaa


You can specify a timeout globally using socket.setdefaulttimeout().

The timeout limits how long an individual socket operation may last. feedparser.parse() may perform many socket operations, so the total time spent on the DNS lookup, establishing the TCP connection, and sending/receiving data may be much longer. See Read timeout using either urllib2 or any other http library.

answered Sep 18 '22 by jfs