I want to load a JSONL file as JSON objects in Python. Is there an easy way to do so?
JSONL uses UTF-8 encoding. That differs from JSON, which also allows Unicode strings to be encoded with ASCII escape sequences. Each line is a valid JSON value, and lines are separated by the newline character, '\n'.
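For illustration, here is a minimal sketch (not from the original answer; the data is made up) showing that each line parses independently with json.loads:

import json

# Hypothetical two-record JSONL content; one JSON value per line.
jsonl_text = '{"id": 1, "name": "alpha"}\n{"id": 2, "name": "beta"}\n'

records = [json.loads(line) for line in jsonl_text.splitlines() if line.strip()]
print(records)  # [{'id': 1, 'name': 'alpha'}, {'id': 2, 'name': 'beta'}]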
Full steps, including file operations, for beginners like me.

Assuming you have a .jsonl file like:
{"reviewerID": "A2IBPI20UZIR0U", "asin": "1384719342", "reviewerName": "cassandra tu \"Yeah, well, that's just like, u...", "helpful": [0, 0], "reviewText": "Not much to write about here, but it does exactly what it's supposed to. filters out the pop sounds. now my recordings are much more crisp. it is one of the lowest prices pop filters on amazon so might as well buy it, they honestly work the same despite their pricing,", "overall": 5.0, "summary": "good", "unixReviewTime": 1393545600, "reviewTime": "02 28, 2014"} {"reviewerID": "A14VAT5EAX3D9S", "asin": "1384719342", "reviewerName": "Jake", "helpful": [13, 14], "reviewText": "The product does exactly as it should and is quite affordable.I did not realized it was double screened until it arrived, so it was even better than I had expected.As an added bonus, one of the screens carries a small hint of the smell of an old grape candy I used to buy, so for reminiscent's sake, I cannot stop putting the pop filter next to my nose and smelling it after recording. :DIf you needed a pop filter, this will work just as well as the expensive ones, and it may even come with a pleasing aroma like mine did!Buy this product! :]", "overall": 5.0, "summary": "Jake", "unixReviewTime": 1363392000, "reviewTime": "03 16, 2013"}
This code should work:
import json

with open('./data/my_filename.jsonl', 'r') as json_file:
    json_list = list(json_file)  # read the file into a list of raw lines

for json_str in json_list:
    result = json.loads(json_str)  # parse one JSON object per line
    print(f"result: {result}")
    print(isinstance(result, dict))
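If the file is large, you can avoid holding every raw line in memory at once by parsing as you iterate. This is only a sketch using the same hypothetical path; it skips blank lines defensively:

import json

with open('./data/my_filename.jsonl', 'r', encoding='utf-8') as json_file:
    for line in json_file:
        line = line.strip()
        if not line:              # skip blank lines, if any
            continue
        record = json.loads(line)  # one dict per non-empty line
        print(record)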
About .jsonl files: http://jsonlines.org/