I am very new to Python and I am trying to create a DataFrame from my JSON output.
My JSON output looks like this:
{
    "tags": [
        {
            "stats": {
                "rawCount": 9
            },
            "name": "Temperature1",
            "results": [
                {
                    "attributes": {
                        "Location": ["3rd Floor"],
                        "Sensor-Serial-Number": ["PT100"]
                    },
                    "values": [
                        [1460958592800, 24.2, 3],
                        [1460958602800, 24.1, 1],
                        [1460958612800, 23.9, 1],
                        [1460958622800, 24.2, 1],
                        [1460958632800, 24.5, 1],
                        [1460958642800, 24.9, 1],
                        [1460958652800, 24.6, 1],
                        [1460958662800, 24.7, 1],
                        [1460958672800, 24.7, 1]
                    ],
                    "groups": [
                        {
                            "type": "number",
                            "name": "type"
                        }
                    ]
                }
            ]
        }
    ]
}
I only require the values; I need to convert them into a data frame as shown in the picture below (click on the Timeseries data link).
Timeseries data
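Since the output above is standard JSON (double-quoted keys), it can also be parsed directly with the built-in json module and the nested "values" list handed straight to pandas. A minimal sketch, assuming the output is available as a plain string (abbreviated here and bound to a hypothetical name raw):

import json
import pandas as pd

# Hypothetical: `raw` holds the JSON text shown above (abbreviated here).
raw = ('{"tags": [{"stats": {"rawCount": 9}, "name": "Temperature1", '
       '"results": [{"values": [[1460958592800, 24.2, 3], '
       '[1460958602800, 24.1, 1]]}]}]}')

data = json.loads(raw)                               # parse the JSON text
values = data["tags"][0]["results"][0]["values"]     # keep only "values"
df = pd.DataFrame(values, columns=["time", "temperature", "quality"])
print(df)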
Yes, ImportJSON is a really easy tool for taking information from JSON and putting it into a table or spreadsheet, including if you want to parse your JSON directly in Google Sheets.
Try this to pull out only the list of values from your JSON:
import json
import ast
import pandas as pd
mystr = """
{'tags': [{'name': 'Temperature1',
'results': [{'attributes': {'Location': ['3rd Floor'],
'Sensor-Serial-Number': ['PT100']},
'groups': [{'name': 'type', 'type': 'number'}],
'values': [[1460958592800, 24.2, 3],
[1460958602800, 24.1, 1],
[1460958612800, 23.9, 1],
[1460958622800, 24.2, 1],
[1460958632800, 24.5, 1],
[1460958642800, 24.9, 1],
[1460958652800, 24.6, 1],
[1460958662800, 24.7, 1],
[1460958672800, 24.7, 1]]}],
'stats': {'rawCount': 9}}]}
"""
# The string uses single quotes, so it is a Python dict literal rather than
# valid JSON; ast.literal_eval parses it safely.
val = ast.literal_eval(mystr)
# Round-trip through json to normalise the structure (optional here, since
# literal_eval already returns plain dicts and lists).
val1 = json.loads(json.dumps(val))
# Drill down to the nested "values" list.
val2 = val1['tags'][0]['results'][0]['values']
print(pd.DataFrame(val2, columns=["time", "temperature", "quality"]))
The result turns out to be:
time temperature quality
0 1460958592800 24.2 3
1 1460958602800 24.1 1
2 1460958612800 23.9 1
3 1460958622800 24.2 1
4 1460958632800 24.5 1
5 1460958642800 24.9 1
6 1460958652800 24.6 1
7 1460958662800 24.7 1
8 1460958672800 24.7 1
which is the table you wanted for your dataset.
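As a possible follow-up (not part of the original answer): if the frame should look like the time series in the linked picture, the first column holds epoch milliseconds and can be converted to a proper DatetimeIndex. A hedged sketch, reusing the val2 list extracted above (abbreviated here):

import pandas as pd

# Hypothetical stand-in for the `val2` list extracted above (abbreviated).
val2 = [[1460958592800, 24.2, 3], [1460958602800, 24.1, 1]]

df = pd.DataFrame(val2, columns=["time", "temperature", "quality"])
df["time"] = pd.to_datetime(df["time"], unit="ms")  # epoch ms -> timestamps
df = df.set_index("time")                           # index the frame by time
print(df)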