This is the first time I've tried to use a library with less-than-ideal documentation and example code, so bear with me. I have a tiny bit of experience with the Requests library, but here I need to send a separate request to a specific address every second and read each response's content as it arrives.
I can't figure out how to satisfy both of these conditions at once. grequests.map() will give me the responses' content that I want, but only in a batch after they've all completed. grequests.send() seems to only return a response object that doesn't contain the HTML text of the web page. (I may be wrong about grequests.send(), but I haven't yet found an example that pulls content from that object.)
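For contrast, here is a minimal grequests.map() sketch (the URL and request count are just placeholders) showing the batch behaviour I mean: map() blocks until every request has finished and only then hands back all the responses together.

import grequests

# Build a few requests up front; map() runs them concurrently but only
# returns once the whole batch has completed.
reqs = [grequests.get('http://stackoverflow.com') for _ in range(3)]
responses = grequests.map(reqs)   # blocks until all requests are done
for res in responses:
    print res.status_code, len(res.content)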
Here's the code that I have so far:
import grequests
from time import sleep

def print_res(res, **kwargs):
    # response hook: called with the completed Response object
    print res
    print kwargs

headers = {'User-Agent': 'Python'}
req = grequests.get('http://stackoverflow.com', headers=headers,
                    hooks=dict(response=print_res), verify=False)

for i in range(3):
    # fire off one request per second without waiting for the responses
    job = grequests.send(req, grequests.Pool(10))
    sleep(1)
The output I get:
1
<Response [200]>
{'verify': False, 'cert': None, 'proxies': {'http': 'http://127.0.0.1:8888', 'https': 'https://127.0.0.1:8888'}, 'stream': False, 'timeout': None}
2
<Response [200]>
{'verify': False, 'cert': None, 'proxies': {'http': 'http://127.0.0.1:8888', 'https': 'https://127.0.0.1:8888'}, 'stream': False, 'timeout': None}
3
<Response [200]>
{'verify': False, 'cert': None, 'proxies': {'http': 'http://127.0.0.1:8888', 'https': 'https://127.0.0.1:8888'}, 'stream': False, 'timeout': None}
I've tried accessing the HTML of the response with req.content and with job.content, but neither works.
Of course, while writing up this question I realized that I hadn't tried to access res.content, which turns out to be exactly what I needed.
Lesson learned: the object passed to the response hook registered in the grequests.get() call is a normal Response, whose content attribute holds the body sent back by the server (and whose text attribute holds the decoded string).
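A minimal sketch of that pattern, assuming the same grequests version as above (the URL and the once-per-second loop are just placeholders): the hook receives the completed Response, so res.content (raw bytes) or res.text (decoded string) can be read right inside it.

import grequests
from time import sleep

def print_res(res, **kwargs):
    # res is a normal requests.Response, so the body is available here
    print res.status_code
    print res.content[:100]   # first 100 bytes; res.text gives the decoded string

req = grequests.get('http://stackoverflow.com', hooks=dict(response=print_res))

for i in range(3):
    grequests.send(req, grequests.Pool(1))  # send without blocking
    sleep(1)                                # roughly one request per second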