 

urllib.request.urlretrieve with proxy?

Somehow I can't download files through a proxy server, and I don't know what I have done wrong. I just get a timeout. Any advice?

import urllib.request

urllib.request.ProxyHandler({"http" : "myproxy:123"})
urllib.request.urlretrieve("http://myfile", "file.file")
capitalg asked Apr 09 '14 15:04

1 Answer

You need to actually use your proxy object, not just instantiate it (you created a ProxyHandler, but never assigned it to a variable or installed it, so it is never used). Try this pattern:

# create the proxy handler and assign it to a variable
proxy = urllib.request.ProxyHandler({'http': '127.0.0.1'})
# construct a new opener using your proxy settings
opener = urllib.request.build_opener(proxy)
# install the opener at the module level so urlretrieve uses it
urllib.request.install_opener(opener)
# make a request
urllib.request.urlretrieve('http://www.google.com')
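
Applied to the original question, the same pattern would look roughly like this (a sketch that assumes the proxy address "myproxy:123" and the URL "http://myfile" from the question are reachable):

import urllib.request

# build a ProxyHandler with the proxy from the question and keep a reference to it
proxy = urllib.request.ProxyHandler({"http": "myproxy:123"})
# create an opener that routes requests through the proxy
opener = urllib.request.build_opener(proxy)
# install it so module-level functions like urlretrieve use it
urllib.request.install_opener(opener)
# download the file through the proxy
urllib.request.urlretrieve("http://myfile", "file.file")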

Or, if you don't need to stick to the standard library, use requests (this code is from the official documentation):

import requests

proxies = {"http": "http://10.10.1.10:3128",
           "https": "http://10.10.1.10:1080"}

requests.get("http://example.org", proxies=proxies)
dorvak answered Sep 21 '22 16:09