 

Making HTTP requests via Python Requests module not working via proxy where curl does? Why?

Using this curl command from Bash, I am able to get the response I am looking for:

curl -v -u z:secret_key --proxy http://proxy.net:80  \
-H "Content-Type: application/json" https://service.com/data.json

I have already seen this other post on proxies with the Requests module, and it helped me formulate my code in Python. However, even while supplying the proper proxies it isn't working. Perhaps I'm just not seeing something?


>>> requests.request('GET', 'https://service.com/data.json',
...     headers={'Content-Type': 'application/json'},
...     proxies={'http': 'http://proxy.net:80', 'https': 'http://proxy.net:80'},
...     auth=('z', 'secret_key'))

Furthermore, at the same Python console I can use urllib to make a request and have it succeed.

>>> import urllib
>>> urllib.urlopen("http://www.httpbin.org").read()
---results---

Even trying Requests on a plain non-HTTPS address fails:

>>> requests.get('http://www.httpbin.org')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.6/site-packages/requests/api.py", line 79, in get
    return request('get', url, **kwargs)
  File "/Library/Python/2.6/site-packages/requests/api.py", line 66, in request
    prefetch=prefetch
  File "/Library/Python/2.6/site-packages/requests/sessions.py", line 191, in request
    r.send(prefetch=prefetch)
  File "/Library/Python/2.6/site-packages/requests/models.py", line 454, in send
    raise ConnectionError(e)
requests.exceptions.ConnectionError: Max retries exceeded for url:

Requests is so elegant and awesome but how could it be failing in this instance?

asked Dec 13 '11 by dalanmiller

People also ask

How do I send HTTP request through proxy in Python?

To use a proxy in Python, first import the requests package. Next, create a proxies dictionary that defines the HTTP and HTTPS connections. This variable should be a dictionary that maps each protocol to the proxy URL. Finally, set a url variable to the webpage you're scraping.
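A minimal sketch of those steps, assuming requests is installed; proxy.net:80 is the placeholder proxy from the question above, so the actual request line is left commented out:

```python
import requests

# Proxies dictionary mapping each protocol to a proxy URL.
# proxy.net:80 is a placeholder taken from the question, not a real proxy.
proxies = {
    "http": "http://proxy.net:80",
    "https": "http://proxy.net:80",
}

# The page to fetch through the proxy.
url = "http://www.httpbin.org"

# With a reachable proxy, the request itself would be:
# response = requests.get(url, proxies=proxies)
```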

Which Python package is used for making HTTP requests?

requests: easily the most popular package for making requests using Python.
urllib3: not to be confused with urllib, which is part of the Python standard library.
httplib2: fills some of the gaps left by other libraries.
httpx: a newer package that offers HTTP/2 and asynchronous requests.

How do you change request to curl in Python?

Curl Converter automatically generates valid Python code using the Python request library for all provided Curl HTTP headers and Curl data. Enter the Curl command, click Run to execute the command online and check the results. Click Generate Code and select Python to convert the Curl command to Python code.
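Translated by hand rather than by Curl Converter, the curl command from the question maps onto Requests roughly like this; the host, secret key, and proxy are the placeholders from the original post, and prepare() is used here only to show that the auth tuple becomes a Basic Authorization header:

```python
import requests

# Hand translation of:
#   curl -v -u z:secret_key --proxy http://proxy.net:80 \
#        -H "Content-Type: application/json" https://service.com/data.json
req = requests.Request(
    "GET",
    "https://service.com/data.json",
    headers={"Content-Type": "application/json"},
    auth=("z", "secret_key"),  # curl's -u user:password
)
prepared = req.prepare()  # resolves the auth tuple into an Authorization header

# curl's --proxy flag has no field on a Request object; it is passed as
# proxies={...} to requests.get() or Session.send() instead.
```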


1 Answer

The problem actually lies with Python's standard URL access libraries - urllib/urllib2/httplib. I can't remember which library is the exact culprit, but for simplicity's sake, let's just call it urllib. Unfortunately, urllib doesn't implement the HTTP CONNECT method, which is required for accessing an HTTPS site through an HTTP(S) proxy. My efforts to add the functionality using urllib have not been successful (it has been a while since I tried). So unfortunately the only option I know to work in this case is to use pycurl.
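For reference, today's Python standard library does expose CONNECT tunnelling through http.client's set_tunnel(). A minimal sketch of the handshake the answer is describing, using the placeholder hosts from the question (no connection is actually opened here):

```python
import http.client

# To reach an HTTPS site through an HTTP proxy, the client first sends
# "CONNECT service.com:443" to the proxy, then speaks TLS through the tunnel.
# proxy.net and service.com are the placeholder hosts from the question.
conn = http.client.HTTPSConnection("proxy.net", 80)
conn.set_tunnel("service.com", 443)  # registers the CONNECT target

# With a real proxy, the request itself would then be:
# conn.request("GET", "/data.json", headers={"Content-Type": "application/json"})
# resp = conn.getresponse()
```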

However, there is a relatively clean solution with almost exactly the same API as python-requests, but using a pycurl backend instead of the Python standard libraries.

The library is called human_curl. I've used it myself and have had great results.

answered Oct 20 '22 by ravenac95