How to get cookie from an urllib.request?
import urllib.request
import urllib.parse

data = urllib.parse.urlencode({
    'user': 'user',
    'pass': 'pass'
})
data = data.encode('utf-8')
request = urllib.request.urlopen('http://example.com', data)
print(request.info())
request.info() returns the cookies, but not in a very usable way.
The urllib.parse.urlencode() function takes a mapping or a sequence of 2-tuples and returns a percent-encoded ASCII string. That string must be encoded to bytes before it is used as the data parameter.
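For example, both forms below produce the same query string, which is then encoded to bytes (a minimal sketch using the same user/pass fields as the question):

from urllib.parse import urlencode

# a mapping ...
data = urlencode({'user': 'user', 'pass': 'pass'})
# ... or a sequence of 2-tuples gives the same result
data = urlencode([('user', 'user'), ('pass', 'pass')])
print(data)                     # user=user&pass=pass
body = data.encode('utf-8')     # bytes, ready to be passed as the data argument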
The problem here is that urlopen returns a file-like object from which you read the HTML response body. Note that the urllib.urlopen function has been marked as deprecated since Python 2.6, where urllib2 is recommended instead; in Python 3 the same functionality lives in urllib.request.
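For example, in Python 3 the object returned by urllib.request.urlopen can be read like a file (a minimal sketch reusing the question's placeholder URL):

import urllib.request

with urllib.request.urlopen('http://example.com') as response:
    html = response.read().decode('utf-8')    # the response body as text
    print(response.info()['Content-Type'])    # headers are available via info()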
response.info() returns an http.client.HTTPMessage object that behaves like a dictionary of response headers, so you can pull out whatever information you need. Here is a demo written in Python 3:
from urllib import request
from urllib.error import HTTPError

def fetch_cookies(url, header_params):
    # url and header_params are supplied by the caller
    req = request.Request(url, data=None, headers=header_params, method='GET')
    try:
        response = request.urlopen(req)
        cookie = response.info().get_all('Set-Cookie')    # list of raw Set-Cookie header values
        content_type = response.info()['Content-Type']
    except HTTPError as err:
        print("err status: {0}".format(err))
        return None
    return cookie
You can now parse the cookie variable according to your application's requirements.
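The standard library's http.cookies.SimpleCookie can handle that parsing. A minimal sketch, assuming cookie is the list returned by get_all('Set-Cookie') in the demo above (the sample header values below are made up):

from http.cookies import SimpleCookie

cookie = ['sessionid=abc123; Path=/; HttpOnly', 'theme=dark; Path=/']   # placeholder values

jar = SimpleCookie()
for header in cookie:
    jar.load(header)                # parse one Set-Cookie header string

for name, morsel in jar.items():
    print(name, morsel.value)       # e.g. sessionid abc123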