Is there a standard function that will convert http headers into a python dictionary, and one to convert back?
They would need to support header folding, of course.
Rather than building your own using sockets etc., I would use httplib. It gets the data from the HTTP server and parses the response headers into a list of (name, value) pairs for you, e.g.:
import httplib  # renamed to http.client in Python 3
conn = httplib.HTTPConnection("www.python.org")
conn.request("GET", "/index.html")
r1 = conn.getresponse()
headers = r1.getheaders()  # a list of (name, value) tuples, not a dict
print(headers)
gives
[('content-length', '16788'), ('accept-ranges', 'bytes'), ('server', 'Apache/2.2.9 (Debian) DAV/2 SVN/1.5.1 mod_ssl/2.2.9 OpenSSL/0.9.8g mod_wsgi/2.5 Python/2.5.2'), ('last-modified', 'Mon, 15 Feb 2010 07:30:46 GMT'), ('etag', '"105800d-4194-47f9e9871d580"'), ('date', 'Mon, 15 Feb 2010 21:34:18 GMT'), ('content-type', 'text/html')]
There are also methods for the other direction: request() accepts a dictionary of headers to send as part of a request, and putheader() adds them one at a time.
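For instance, a minimal sketch of both directions (assuming Python 2's httplib; note that wrapping getheaders() in dict() collapses repeated header names such as Set-Cookie):

import httplib  # renamed to http.client in Python 3

conn = httplib.HTTPConnection("www.python.org")
# request() takes an optional dictionary of headers to send
conn.request("GET", "/index.html", headers={"Accept": "text/html"})
r1 = conn.getresponse()

# getheaders() returns a list of (name, value) tuples;
# dict() turns it into a plain dictionary
header_dict = dict(r1.getheaders())
print(header_dict.get("content-type"))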
In case you don't find any library solving the problem, here's a naive, untested solution:
def fold(header):
    line = "%s: %s" % (header[0], header[1])
    if len(line) < 998:
        return line
    else:  # fold
        lines = [line]
        while len(lines[-1]) > 998:
            split_this = lines[-1]
            # find last space in the longest admissible chunk
            split_here = split_this[:998].rfind(" ")
            del lines[-1]
            # the space at split_here starts the continuation line,
            # as folding requires; the second piece may still be too
            # long, hence the while on lines[-1]
            lines = lines + [split_this[:split_here],
                             split_this[split_here:]]
        return "\n".join(lines)
def dict2header(data):
    return "\n".join(fold(header) for header in data.items())
def header2dict(data):
    data = data.replace("\n ", " ").splitlines()  # unfold continuation lines
    headers = {}
    for line in data:
        split_here = line.find(":")
        # skip the colon itself and strip the surrounding whitespace
        headers[line[:split_here]] = line[split_here + 1:].strip()
    return headers
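For example, a round trip with these functions might look like this (a quick sketch, as untested as the original):

headers = {"Content-Type": "text/html",
           "X-Long-Header": "word " * 400}  # long enough to need folding

wire = dict2header(headers)    # serialize, folding the long line
parsed = header2dict(wire)     # unfold and parse back
print(parsed["Content-Type"])  # -> text/html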
You can copy a request as cURL (e.g. from your browser's developer tools) and then this converter will translate it into a Python request:
https://curl.trillworks.com/
By the way, if you don't trust the link, you can search Google for "curl converter" and it will probably show up in the first results. The source is at https://github.com/NickCarneiro/curlconverter
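The generated code typically uses the requests library, along these lines (an illustrative sketch only; the real output depends on the command you copy):

import requests

# headers copied from the browser, expressed as a plain dictionary
headers = {
    "User-Agent": "Mozilla/5.0",
    "Accept": "text/html",
}

response = requests.get("https://www.python.org/", headers=headers)
print(response.headers.get("Content-Type"))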