I am using Python 3.6 and trying to load a JSON file (350 MB) into a pandas DataFrame using the code below. However, I get the following error:
data_json_str = "[" + ",".join(data) + "]"
TypeError: sequence item 0: expected str instance, bytes found
How can I fix the error?
import pandas as pd

# read the entire file into a python array
with open('C:/Users/Alberto/nutrients.json', 'rb') as f:
    data = f.readlines()

# remove the trailing "\n" from each line
data = map(lambda x: x.rstrip(), data)

# each element of 'data' is an individual JSON object.
# i want to convert it into an *array* of JSON objects
# which, in and of itself, is one large JSON object
# basically... add square brackets to the beginning
# and end, and have all the individual business JSON objects
# separated by a comma
data_json_str = "[" + ",".join(data) + "]"

# now, load it into pandas
data_df = pd.read_json(data_json_str)
Reading JSON files with pandas: to read a JSON file, use the read_json() function and pass it the path to the file. It returns a DataFrame (a table of rows and columns) holding the data. pandas can also build a DataFrame directly from already-parsed objects, such as a list of dicts or a list of lists with column names supplied separately.
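For example, a minimal sketch (the file name here is a hypothetical placeholder for any JSON file whose top level is an array of objects):

import pandas as pd

# hypothetical file containing a JSON array such as: [{"name": "rice", "kcal": 130}, ...]
df = pd.read_json('nutrients_sample.json')

# the result is an ordinary DataFrame, so the usual inspection tools apply
print(df.shape)    # (rows, columns)
print(df.head())   # first few records
print(df.dtypes)   # inferred column types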
From your code, it looks like you're loading a JSON file which has JSON data on each separate line. read_json supports a lines argument for data like this:
data_df = pd.read_json('C:/Users/Alberto/nutrients.json', lines=True)
Note: remove lines=True if you have a single JSON object instead of individual JSON objects on each line.
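As a rough illustration of the difference (using tiny inline JSON via StringIO instead of a real file, purely as a made-up example):

import pandas as pd
from io import StringIO

# one JSON object per line (JSON Lines) -> needs lines=True
jsonl = StringIO('{"name": "rice", "kcal": 130}\n{"name": "egg", "kcal": 155}')
df_lines = pd.read_json(jsonl, lines=True)

# a single JSON array -> plain read_json, no lines=True
plain = StringIO('[{"name": "rice", "kcal": 130}, {"name": "egg", "kcal": 155}]')
df_plain = pd.read_json(plain)

Both calls produce the same two-row DataFrame; the only difference is how the input text is laid out.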
Alternatively, using the json module you can parse the JSON into a Python object, then create a DataFrame from that:
import json
import pandas as pd

# parse the whole file as a single JSON document (an object or an array)
with open('C:/Users/Alberto/nutrients.json', 'r') as f:
    data = json.load(f)

df = pd.DataFrame(data)
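If the file actually holds one JSON object per line (which the original error suggests), a variation of this approach, sketched here under that assumption, parses each line separately. Note the file is opened in text mode ('r'), not binary ('rb'), so the lines are str rather than bytes:

import json
import pandas as pd

records = []
with open('C:/Users/Alberto/nutrients.json', 'r') as f:
    for line in f:
        line = line.strip()
        if line:  # skip blank lines
            records.append(json.loads(line))

df = pd.DataFrame(records)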