HTTPS proxies not working with Python's requests module

I'm pretty new to Python and I've been using the requests module as a substitute for PHP's cURL library. My code is as follows:

import requests
import json
import os
import urllib
import math
import sys

def main():
   url = 'https://api.com'

   headers = {'Content-Type': 'application/json; charset=utf-8',
              'User-Agent': '(iPhone; iOS 7.0.4; Scale/2.00)'}

   d = {'token': "12345"}

   proxies = {
      "https": "https://27.254.52.99:8080",
   }

   post = json.dumps(d)
   r = requests.post(url, data=post, headers=headers, proxies=proxies)
   print r.json()

if __name__ == "__main__":
    main()

However, I'm greeted with the following error:

File "test.py", line 42, in test
r = requests.post(url, data=post, headers=headers, proxies=proxies)
File "/Library/Python/2.7/site-packages/requests-2.2.1-py2.7.egg/requests/api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "/Library/Python/2.7/site-packages/requests-2.2.1-py2.7.egg/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/Library/Python/2.7/site-packages/requests-2.2.1-py2.7.egg/requests/sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "/Library/Python/2.7/site-packages/requests-2.2.1-py2.7.egg/requests/sessions.py", line 486, in send
r = adapter.send(request, **kwargs)
File "/Library/Python/2.7/site-packages/requests-2.2.1-py2.7.egg/requests/adapters.py", line 381, in send
raise ProxyError(e)
ProxyError: Cannot connect to proxy. Socket error: [Errno 54] Connection reset by peer.
asked Jun 05 '14 by Lance


1 Answer

Edit June 2019: This answer is no longer relevant; the issue(s) have since been fixed.

Edit 2: "Note that even for an HTTPS proxy, the proxy address's scheme is http, because the client and proxy server initiate the tunnelling (the CONNECT method) in plain HTTP. However, that may not have been true 3 years ago." - From the comments
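If that comment applies to newer versions of requests, the proxies dict for HTTPS traffic would point at an http:// proxy URL. A minimal sketch, reusing the proxy address from the question and assuming httpbin.org purely as an example target:

import requests

# Sketch of the comment's suggestion: route HTTPS requests through the proxy,
# but give the proxy URL an http:// scheme, since the CONNECT tunnel is set up
# over plain HTTP. The proxy address is the one from the question and
# https://httpbin.org/ip is only an example target, not part of the original post.
proxies = {
    "https": "http://27.254.52.99:8080",
}

r = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(r.text)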

HTTPS proxying is 'bugged' in requests. I don't know the specifics, but you can find a few other topics on this site about the issue, and a GitHub issue is still active here. I suspect you're running into the problems described there; if I'm totally wrong, someone correct me.

To verify:

$~ curl --proxy https://27.254.52.99:8080 icanhazip.com
27.254.52.99

Works, but then in Python:

>>> proxies = {'https': 'https://27.254.52.99:8080'}
>>> r = requests.get('http://icanhazip.com', headers={'User-Agent': 'Bla'}, proxies=proxies)
>>> print r.content
<my IPv6 address comes up>

As you can see, my address comes up which means the proxy did nothing.

I don't understand why you are receiving a stacktrace. Maybe because your API is on HTTPS as well (?). Or maybe your API is just... down.

Anyway, the proxy does work in requests if it's over HTTP:

>>> proxies = {'http': 'http://27.254.52.99:8080'}
>>> r = requests.get('http://icanhazip.com', headers={'User-Agent': 'Bla'}, proxies=proxies)
>>> print r.content
27.254.52.99
answered by cpb2