 

Caching options in Python or speeding up urlopen

Hey all, I have a site that looks up info for the end user, is written in Python, and requires several urlopen commands. As a result it takes a bit for a page to load. I was wondering if there was a way to make it faster? Is there an easy Python way to cache, or a way to make the urlopen calls run faster?

The urlopen calls access the Amazon API to get prices, so the site needs to be somewhat up to date. The only option I can think of is to make a script that populates a MySQL db and runs every now and then, but that would be a nuisance.

Thanks!

Jill S asked Apr 08 '26 23:04

1 Answer

httplib2 understands HTTP request caching, abstracts away some of urllib/urllib2's messiness, and has other goodies, like gzip support.

http://code.google.com/p/httplib2/

But besides using that to fetch the data, if the dataset is not very big, I would also implement some kind of function caching / memoization. Example: http://wiki.python.org/moin/PythonDecoratorLibrary#Memoize

It wouldn't be too hard to modify that decorator to allow for time based expiry, e.g. only cache the result for 15 mins.
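A minimal sketch of such a time-based variant (the decorator name and the price-lookup stub are my own, not from the linked page):

```python
import time
import functools

def memoize_with_expiry(ttl_seconds):
    """Cache results per argument tuple; re-compute once an entry is stale."""
    def decorator(func):
        cache = {}  # maps args -> (result, timestamp)

        @functools.wraps(func)
        def wrapper(*args):
            now = time.time()
            if args in cache:
                result, stored_at = cache[args]
                if now - stored_at < ttl_seconds:
                    return result  # still fresh, skip the slow call
            result = func(*args)
            cache[args] = (result, now)
            return result
        return wrapper
    return decorator

# Hypothetical usage: cache price lookups for 15 minutes
@memoize_with_expiry(ttl_seconds=15 * 60)
def get_price(asin):
    # the real urlopen call to the Amazon API would go here
    return {"asin": asin, "fetched_at": time.time()}
```

Within the TTL, repeated calls with the same arguments return instantly from the dict; after it, the next call re-fetches and refreshes the timestamp.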

If the results are bigger, you need to start looking into memcached/redis.

adamJLev answered Apr 10 '26 13:04

