Python - requests fail silently

import requests

is working properly for all my requests, like so:

url = 'http://www.stackoverflow.com'
response = requests.get(url)

but the following url does not return any results:

url = 'http://www.billboard.com'
response = requests.get(url)

it stalls and fails silently, returning nothing.

How do I force requests to throw an exception, so I can tell whether I'm being blacklisted or something else is going wrong?

asked by 8-Bit Borges

1 Answer

Requests won't raise an exception for a bad HTTP response, but you can use raise_for_status to raise an HTTPError exception manually. For example:

response = requests.get(url)
response.raise_for_status()  # raises an HTTPError if the status code is 4xx/5xx
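
If you want to act on the error instead of letting the script stop, you could catch the exception it raises (a minimal sketch, assuming url is the billboard.com address from the question):

import requests

url = 'http://www.billboard.com'

try:
    response = requests.get(url)
    response.raise_for_status()  # HTTPError for 4xx/5xx responses
except requests.exceptions.HTTPError as err:
    print('HTTP error:', err)  # e.g. 403 Forbidden if the site is blocking you
else:
    print(response.text)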

Another option is to check status_code, which holds the HTTP status code of the response.

response = requests.get(url)

if response.status_code != 200:
    print('HTTP', response.status_code)
else: 
    print(response.text)

If a site returns HTTP 200 even for bad requests but includes an error message in the response body (or returns an empty body), you'll have to check the response content.

error_message = 'Nothing found'
response = requests.get(url)

if error_message in response.text or not response.text:
    print('Bad response')
else: 
    print(response.text)

If a site takes too long to respond, you can set a maximum timeout for the request. If the site doesn't respond within that time, a ReadTimeout exception is raised.

try:
    response = requests.get(url, timeout=5)
except requests.exceptions.ReadTimeout:
    print('Request timed out')
else:
    print(response.text)
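
Putting these together, here's a minimal sketch (again assuming the billboard.com URL from the question) that surfaces timeouts, connection problems and bad status codes instead of failing silently:

import requests

url = 'http://www.billboard.com'

try:
    response = requests.get(url, timeout=5)  # fail fast instead of stalling
    response.raise_for_status()              # raise HTTPError for 4xx/5xx
except requests.exceptions.RequestException as err:
    # RequestException is the base class of HTTPError, Timeout, ConnectionError, etc.
    print('Request failed:', err)
else:
    print(response.text)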
answered by t.m.adam