 

python asynchronous httprequest

I am trying to use twitter search web service in python. I want to call a web service like:

http://search.twitter.com/search.json?q=blue%20angels&rpp=5&include_entities=true&result_type=mixed

from my python program.

Can anybody tell me

  1. how to use an XMLHttpRequest-style object in Python,

  2. how to pass parameters to it, and

  3. how to get the data as a dictionary.

Here is my try:

import urllib
import sys
url = "http://search.twitter.com/search.json?q=blue%20angels&rpp=5&include_entities=true&result_type=mixed"
urlobj = urllib.urlopen(url)
data = urlobj.read()
print data

Thanks.

asked Mar 17 '12 by hrishikeshp19



1 Answer

You don't need an "asynchronous HTTP request" to use the Twitter search API:

import json
import urllib
import urllib2

# make query
query = urllib.urlencode(dict(q="blue angel", rpp=5, include_entities=1,
                              result_type="mixed"))  
# make request
resp = urllib2.urlopen("http://search.twitter.com/search.json?" + query)

# make dictionary (parse json response)
d = json.load(resp)

There are probably several libraries that provide a nice OO interface around these HTTP requests.
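Note that `urllib`/`urllib2` are Python 2 modules; in Python 3 the same pieces live in `urllib.parse` and `urllib.request`. A minimal Python 3 sketch of the same steps (the old `search.twitter.com` v1 endpoint has since been retired, so the JSON-parsing step is shown on a sample payload rather than a live response):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen  # urlopen(url) would perform the request

# make query (urllib.urlencode became urllib.parse.urlencode in Python 3)
query = urlencode(dict(q="blue angel", rpp=5, include_entities=1,
                       result_type="mixed"))
url = "http://search.twitter.com/search.json?" + query  # historical endpoint

# make dictionary (parse a JSON response); with a live endpoint this would be
# d = json.load(urlopen(url)). Shown here on a sample payload:
sample = '{"results": [{"text": "first"}, {"text": "second"}]}'
d = json.loads(sample)
```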

To make multiple requests concurrently you could use gevent:

import gevent
import gevent.monkey; gevent.monkey.patch_all() # patch stdlib

import json
import urllib
import urllib2

def f(querystr):
    query = urllib.urlencode(dict(q=querystr, rpp=5, include_entities=1,
                                  result_type="mixed"))
    resp = urllib2.urlopen("http://search.twitter.com/search.json?" + query)
    d = json.load(resp)
    print('number of results %d' % (len(d['results']),))

jobs = [gevent.spawn(f, q) for q in ['blue angel', 'another query']]
gevent.joinall(jobs) # wait for completion
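On modern Python you could express the same fan-out with the standard-library `asyncio` instead of gevent. A sketch under the assumption that the actual HTTP call is done with an async client (e.g. aiohttp) or wrapped in a thread; the network I/O is simulated here with `asyncio.sleep`, since the old endpoint no longer exists:

```python
import asyncio
from urllib.parse import urlencode

async def fetch(querystr):
    # Build the query string exactly as in the gevent version above.
    query = urlencode(dict(q=querystr, rpp=5, include_entities=1,
                           result_type="mixed"))
    # In real code this would await an HTTP call; sleep(0) stands in
    # for the network I/O and yields control to the event loop.
    await asyncio.sleep(0)
    return {"query": query, "results": []}

async def main():
    # gather() runs the coroutines concurrently, like gevent.joinall
    return await asyncio.gather(fetch("blue angel"), fetch("another query"))

results = asyncio.run(main())
```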
answered Sep 22 '22 by jfs