I am trying to write a script that will pull the current status from our monitoring tools and update it in an MS SQL database. When I call the API I get a huge response in JSON format:
{
    "hoststatuslist": {
        "recordcount": "1084",
        "hoststatus": [
            {
                "@attributes": {
                    "id": "XXXX"
                },
                "host_id": "XXX",
                "name": "XXXXX",
                "display_name": "XXXXXXX",
                "address": "XXXXXX",
                "alias": "XXXXXX",
                "status_text": "XXXXXXXXXXXXXXXXXXXXXXX",
                etc.
            },
            {
                "@attributes": {
                    "id": "XXXX"
                },
                "host_id": "XXX",
                "name": "XXXXX",
                "display_name": "XXXXXXX",
                "address": "XXXXXX",
                "alias": "XXXXXX",
                "status_text": "XXXXXXXXXXXXXXXXXXXXXXX",
                etc.
            },
            etc.
        ]
    }
}
As you can see, I get over 1,000 host objects with attributes. I want to parse the response so that I can add/update rows in the MS SQL database. I'm trying to parse out the host_id, name, and status_text for each host.
I tried to do something like this: Python - Parsing JSON Data Set, but I keep getting errors saying the response object has no attribute read or decode.
Here is my current code:
import requests
import json
response = requests.get('url with API Key')
decoded_response = response.read().decode("UTF-8")
data = json.loads(decoded_response)
jsonData = data["hoststatus"]
for host in jsonData:
    Name = host.get("name")
    StatusText = host.get("status_text")
If anyone has a suggestion for doing this with another language or tool, I'm open to it. I need to call about 20 APIs and put all the status and other information into a DB so that it's all in one location.
Any help is appreciated.
Like @danil-kondratiev said, you can use response.json() and you don't need to encode/decode. Will this work for you?
import requests

response = requests.get('url with keys')
json_data = response.json() if response and response.status_code == 200 else None

if json_data and 'hoststatuslist' in json_data:
    if 'hoststatus' in json_data['hoststatuslist']:
        for hoststatus in json_data['hoststatuslist']['hoststatus']:
            host_name = hoststatus.get('name')
            status_text = hoststatus.get('status_text')
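For getting the parsed values into SQL Server, something like the sketch below should work as a starting point. It uses pyodbc with a MERGE statement so each host is inserted if new or updated if it already exists. The table name HostStatus, its columns, the ODBC driver, and the connection string are placeholders, so swap in whatever matches your schema and environment.

import pyodbc
import requests

# Connection string values are placeholders; adjust driver, server, and credentials
conn = pyodbc.connect(
    'DRIVER={ODBC Driver 17 for SQL Server};'
    'SERVER=your_server;DATABASE=your_db;UID=your_user;PWD=your_password'
)
cursor = conn.cursor()

response = requests.get('url with keys')
json_data = response.json() if response.status_code == 200 else None

if json_data and 'hoststatuslist' in json_data:
    for hoststatus in json_data['hoststatuslist'].get('hoststatus', []):
        host_id = hoststatus.get('host_id')
        host_name = hoststatus.get('name')
        status_text = hoststatus.get('status_text')

        # MERGE does an insert-or-update keyed on host_id
        # (assumes a table like HostStatus(host_id, name, status_text))
        cursor.execute(
            """
            MERGE HostStatus AS target
            USING (SELECT ? AS host_id, ? AS name, ? AS status_text) AS source
            ON target.host_id = source.host_id
            WHEN MATCHED THEN
                UPDATE SET name = source.name, status_text = source.status_text
            WHEN NOT MATCHED THEN
                INSERT (host_id, name, status_text)
                VALUES (source.host_id, source.name, source.status_text);
            """,
            host_id, host_name, status_text
        )

conn.commit()
conn.close()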
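And since you mention pulling from about 20 APIs, you can wrap the fetch-and-parse step in a loop over a list of endpoint URLs and collect everything before writing to the DB. The URLs below are placeholders, and this assumes the other endpoints return the same hoststatuslist structure; if they don't, you'd need a per-API parsing step.

import requests

# Placeholder endpoints; substitute your real URLs and API keys
api_urls = [
    'https://monitor1.example.com/api/hoststatus?apikey=XXXX',
    'https://monitor2.example.com/api/hoststatus?apikey=XXXX',
    # ... up to ~20 endpoints
]

all_hosts = []
for url in api_urls:
    resp = requests.get(url)
    if resp.status_code != 200:
        continue  # skip (or log) endpoints that fail
    payload = resp.json()
    hosts = payload.get('hoststatuslist', {}).get('hoststatus', [])
    all_hosts.extend(hosts)

# all_hosts now holds every host record; feed it to the MERGE loop above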