aiohttp-form-based authentication

I cannot find working code for aiohttp in combination with a login page. The goal is simple: form-based authentication with a username and password, whose cookie I would like to reuse in subsequent async aiohttp fetch calls.

It seems that the whole Session concept changed in aiohttp between versions, so I'm curious how to implement it in the most recent version. I'm not sure how to obtain the cookie once and then use it in an asynchronous manner.

I'd really like to see a fully working example, since unfortunately I wasn't able to get it working from the snippets I found.

I guess this might be the start, but I'm not sure, and I certainly do not see how to connect everything to it (do I also still need an aiohttp.TCPConnector?): http://aiohttp.readthedocs.org/en/latest/client_reference.html#aiohttp.client.ClientSession
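For what it's worth, the docs suggest that ClientSession creates its own TCPConnector by default, so passing one explicitly only seems necessary for tuning things like connection limits. A minimal sketch of what I mean (assuming a recent aiohttp and Python; the URL is a placeholder):

import asyncio
import aiohttp

async def main():
    # ClientSession builds a default TCPConnector if none is given;
    # passing one explicitly is only needed for tuning (e.g. connection limits).
    connector = aiohttp.TCPConnector(limit=10)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get('http://example.com') as resp:
            print(resp.status)

asyncio.run(main())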

Example of my non-async version in Python 2 using mechanize (though I naturally use Python 3 for asyncio etc.):

import mechanize
import urllib


class MyClass(object):
    def __init__(self):
        self.data = {'username': 'me', 'password': 'pw'}
        self.login_url = 'http://example.com/login'
        self.login()

    def call(self, url):
        # Attach the stored login cookie to every subsequent request.
        request2 = mechanize.Request(url)
        self.cookie_jar.add_cookie_header(request2)
        response2 = mechanize.urlopen(request2).read()
        return response2

    def login(self):
        request = mechanize.Request(self.login_url)
        # 'username' and 'password' keys are actually the names of the <input> fields
        logInfo = urllib.urlencode({'username': self.data['username'],
                                    'password': self.data['password']})
        response = mechanize.urlopen(request, data=logInfo)
        # Extract the session cookie from the login response and keep it for later calls.
        cookie_jar = mechanize.CookieJar()
        cookie_jar.extract_cookies(response, request)
        self.cookie_jar = cookie_jar


mc = MyClass()
mc.call('http://example.com/other_url')
asked Jul 29 '15 by PascalVKooten

1 Answer

I've just added an example of basic auth on the client side: client_auth.py
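Client-side basic auth boils down to passing an aiohttp.BasicAuth object to the session, roughly like this (a sketch assuming a recent aiohttp, not the linked client_auth.py itself; URL and credentials are placeholders):

import asyncio
import aiohttp

async def main():
    # Credentials attached to the session are sent with every request made through it.
    auth = aiohttp.BasicAuth('me', 'pw')
    async with aiohttp.ClientSession(auth=auth) as session:
        async with session.get('http://example.com/protected') as resp:
            print(resp.status)
            print(await resp.text())

asyncio.run(main())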

Is it enough for you?

P.S. Actually, ClientSession is a replacement for the old-style request+connector concept. A session is a more natural way to keep session-related info, but the old way still works.
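To connect this to the form-login goal from the question: with a recent aiohttp, posting the form through a ClientSession is enough, because the session's cookie jar keeps the login cookie and sends it on later requests. A sketch (URL, field names, and credentials are placeholders):

import asyncio
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        # Cookies set by the login response are stored in the session's cookie jar.
        login_data = {'username': 'me', 'password': 'pw'}
        async with session.post('http://example.com/login', data=login_data) as resp:
            resp.raise_for_status()

        # Subsequent fetches through the same session reuse the login cookie automatically.
        async with session.get('http://example.com/other_url') as resp:
            print(await resp.text())

asyncio.run(main())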

answered Sep 27 '22 by Andrew Svetlov