How can I send around 1000 requests as fast as possible?
I know that you can send multiple requests with grequests:
urls = [
'sample.url/1',
'sample.url/2',
...
]
request = (grequests.get(u) for u in urls)
print(grequests.map(request))
But the return value is not the content. What I need is the JSON data, so, for example, something like this:
request = (grequests.get(u) for u in urls)
content = grequests.json(request)
There are two basic ways to generate concurrent HTTP requests: multiple threads or asynchronous programming. In the multi-threaded approach, each request is handled by its own thread. In asynchronous programming, there is (usually) a single thread and an event loop that periodically checks whether tasks have completed.
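For comparison, here is a minimal sketch of the multi-threaded route using only requests and concurrent.futures from the standard library; the fetch helper and the worker count are illustrative choices, not part of the original answer:
import requests
from concurrent.futures import ThreadPoolExecutor

urls = [
    'http://httpbin.org/user-agent',
    'http://httpbin.org/headers',
    'http://httpbin.org/ip',
]

def fetch(url):
    # each call runs in its own worker thread
    return requests.get(url).json()

# max_workers is an arbitrary cap on how many threads run at once
with ThreadPoolExecutor(max_workers=10) as pool:
    json_results = list(pool.map(fetch, urls))

print(json_results)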
HTTP requests and responses can also be pipelined on a connection: pipelining lets a client make multiple requests without waiting for each response, so a single TCP connection is used much more efficiently, with much lower elapsed time.
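Note that requests-based clients (grequests included) do not actually pipeline; what you can do is reuse one connection pool via keep-alive, which captures much of the same saving. A minimal sketch, assuming your grequests version forwards a session keyword argument to requests:
import grequests
import requests

session = requests.Session()  # shared connection pool, so sockets are reused via keep-alive
urls = ['http://httpbin.org/ip', 'http://httpbin.org/headers']
reqs = (grequests.get(u, session=session) for u in urls)
responses = grequests.map(reqs)
print([r.json() for r in responses])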
The items returned are not the content, but they do include the content. You can fetch all of the content like so:
result = grequests.map(request)
content = b'\n'.join(r.content for r in result)  # raw bytes content
text = '\n'.join(r.text for r in result)         # decoded content
You can parse the json like this:
result = grequests.map(request)
json = [r.json() for r in result]
Sample program:
import grequests
import pprint
urls = [
'http://httpbin.org/user-agent',
'http://httpbin.org/headers',
'http://httpbin.org/ip',
]
# build the unsent requests, then send them all concurrently
requests = (grequests.get(u) for u in urls)
responses = grequests.map(requests)
# parse each response body as JSON
json = [response.json() for response in responses]
pprint.pprint(json)
# or join the decoded bodies into one text blob
text = '\n'.join(response.text for response in responses)
print(text)
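With the roughly 1000 URLs from the question, you will usually want to cap how many requests are in flight at once; grequests.map takes a size argument for that (the value 50 below is an arbitrary choice):
responses = grequests.map((grequests.get(u) for u in urls), size=50)  # at most 50 concurrent requests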