 

How do I get the IP address from a http request using the requests library?

I am making HTTP requests using the requests library in python, but I need the IP address from the server that responded to the HTTP request and I'm trying to avoid making two calls (and possibly having a different IP address from the one that responded to the request).

Is that possible? Does any python HTTP library allow me to do that?

PS: I also need to make HTTPS requests and use an authenticated proxy.

Update 1:

Example:

import requests

proxies = {
  "http": "http://user:[email protected]:3128",
  "https": "http://user:[email protected]:1080",
}

response = requests.get("http://example.org", proxies=proxies)
response.ip  # This doesn't exist; this is just what I would like to do

Then, I would like to know to which IP address requests are connected from a method or property in the response. In other libraries, I was able to do that by finding the sock object and using the getpeername() function.
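At the plain-socket level, the pattern being asked about looks roughly like this (a self-contained sketch that stands up a throwaway local listener so it needs no network access):

```python
import socket
import threading

# Stand up a throwaway local server so the example needs no real network.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
host, port = server.getsockname()

def accept_once():
    conn, _ = server.accept()
    conn.close()

t = threading.Thread(target=accept_once)
t.start()

client = socket.create_connection((host, port))
peer = client.getpeername()  # (ip, port) of whoever actually answered
print(peer)

client.close()
t.join()
server.close()
```

The goal is to read that same getpeername() tuple off the socket that requests used internally, without opening a second connection.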

asked by gawry on Mar 18 '14



2 Answers

It turns out that it's rather involved.

Here's a monkey-patch for requests version 1.2.3:

Wrap the _make_request method on HTTPConnectionPool to store the result of socket.getpeername() on the HTTPResponse instance.

For me on Python 2.7.3, this instance was available at response.raw._original_response.

from requests.packages.urllib3.connectionpool import HTTPConnectionPool

def _make_request(self, conn, method, url, **kwargs):
    response = self._old_make_request(conn, method, url, **kwargs)
    sock = getattr(conn, 'sock', False)
    if sock:
        setattr(response, 'peer', sock.getpeername())
    else:
        setattr(response, 'peer', None)
    return response

HTTPConnectionPool._old_make_request = HTTPConnectionPool._make_request
HTTPConnectionPool._make_request = _make_request

import requests

r = requests.get('http://www.google.com')
print r.raw._original_response.peer

Yields:

('2a00:1450:4009:809::1017', 80, 0, 0) 

However, if a proxy is involved or the response is chunked, HTTPConnectionPool._make_request isn't called.

So here's a new version patching httplib.getresponse instead:

import httplib

def getresponse(self, *args, **kwargs):
    response = self._old_getresponse(*args, **kwargs)
    if self.sock:
        response.peer = self.sock.getpeername()
    else:
        response.peer = None
    return response

httplib.HTTPConnection._old_getresponse = httplib.HTTPConnection.getresponse
httplib.HTTPConnection.getresponse = getresponse

import requests

def check_peer(resp):
    orig_resp = resp.raw._original_response
    if hasattr(orig_resp, 'peer'):
        return getattr(orig_resp, 'peer')

Running:

>>> r1 = requests.get('http://www.google.com')
>>> check_peer(r1)
('2a00:1450:4009:808::101f', 80, 0, 0)
>>> r2 = requests.get('https://www.google.com')
>>> check_peer(r2)
('2a00:1450:4009:808::101f', 443, 0, 0)
>>> r3 = requests.get('http://wheezyweb.readthedocs.org/en/latest/tutorial.html#what-you-ll-build')
>>> check_peer(r3)
('162.209.99.68', 80)

Also checked running with proxies set; the proxy's address is returned.


Update 2016/01/19

est offers an alternative that doesn't need the monkey-patch:

rsp = requests.get('http://google.com', stream=True)
# grab the IP while you can, before you consume the body!
print rsp.raw._fp.fp._sock.getpeername()
# consume the body, which calls read(); after that the fileno is no longer available
print rsp.content

Update 2016/05/19

From the comments, copying here for visibility, Richard Kenneth Niescior offers the following that is confirmed working with requests 2.10.0 and Python 3.

rsp = requests.get(..., stream=True)
rsp.raw._connection.sock.getpeername()

Update 2019/02/22

Python3 with requests version 2.19.1.

resp = requests.get(..., stream=True)
resp.raw._connection.sock.socket.getsockname()

Update 2020/01/31

Python3.8 with requests 2.22.0

resp = requests.get('https://www.google.com', stream=True)
resp.raw._connection.sock.getsockname()
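Pulling the stream=True trick into a small helper makes the caveats explicit. This is a sketch that relies on private urllib3 internals (resp.raw._connection is not a public API), so it's wrapped in a try/except and may return None on versions where the internals differ:

```python
import requests

def get_with_peer(url, **kwargs):
    """Fetch `url` and best-effort report the peer address that served it.

    Relies on private urllib3 internals (resp.raw._connection), so treat
    the result as a debugging aid: it may be None on other versions.
    """
    kwargs["stream"] = True  # keep the socket around until we peek at it
    resp = requests.get(url, **kwargs)
    try:
        peer = resp.raw._connection.sock.getpeername()
    except AttributeError:
        peer = None
    resp.content  # safe to consume the body now
    return resp, peer
```

With a proxy configured, the address reported will be the proxy's, matching the behaviour noted earlier in this answer.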
answered by MattH on Sep 22 '22


Try the following. Note that this reports your client's public IP as seen by jsonip.com, not the address of the server that answered the request:

import requests

proxies = {
  "http": "http://user:[email protected]:3128",
  "https": "http://user:[email protected]:1080",
}

response = requests.get('http://jsonip.com', proxies=proxies)
ip = response.json()['ip']
print('Your public IP is:', ip)
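For contrast, the separate-DNS-lookup route the question is trying to avoid looks like this. The resolver can return several candidate addresses, and none is guaranteed to be the one that actually serves a later request; resolve_all is a hypothetical helper name used here for illustration:

```python
import socket

def resolve_all(host, port=80):
    # A plain DNS lookup: the extra call the question wants to avoid.
    # getaddrinfo returns 5-tuples; the sockaddr is the last element.
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    return sorted({sockaddr[0] for *_, sockaddr in infos})

print(resolve_all("localhost"))  # e.g. ['127.0.0.1', '::1']
```

This is why reading getpeername() off the live connection, as in the accepted answer, is the only way to know which address really handled the request.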
answered by Sergey Bushmanov on Sep 22 '22