I'm trying to read JSON files and get their values. I have a folder full of JSON files, and I need to open every file and pull the values out of each one.
This is the code:
# -*- encoding: utf-8 -*-
from pprint import pprint
import json
import os

def start():
    for dirname, dirnames, filenames in os.walk('test'):
        for filename in filenames:
            # Read the whole file into a string, then parse it.
            json_file = open(os.path.join(dirname, filename)).read()
            # json_file = unicode(json_file, 'utf-8')
            json_data = json.loads(json_file)
            pprint(json_data)
            for key, value in json_data.items():
                print "KEY : ", key
                print "VALUE: ", value

start()
This is one of the JSON files:
{ "test" : "Search User 1",
"url" : "http://127.0.0.1:8000/api/v1/user/1/?format=json",
"status_code" : 200,
"method" : "get"
}
But when I run it, I get this:
ValueError: No JSON object could be decoded
What the hell is wrong? Yesterday it was working exactly as it is now. Or am I going crazy?
I tried this way:
from pprint import pprint
import json
import os

for dirname, dirnames, filenames in os.walk('test'):
    for filename in filenames:
        json_file_contents = open(os.path.join(dirname, filename)).read()
        try:
            json_data = json.loads(json_file_contents)
        except ValueError, e:
            print e
            print "ERROR"
I can't see any error. Then I tried:
for filename in filenames:
    with open(os.path.join(dirname, filename)) as fd:
        json_data = fd.read()
        print json_data
This way I can see what the JSON files contain, but I can't access the values by key, for example json_data['url'].
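That happens because fd.read() returns the raw text as a plain string, not a parsed object, so it has to go through json.loads before key access works. A minimal sketch (the test/example.json path is hypothetical):

import json

with open('test/example.json') as fd:  # hypothetical path
    raw = fd.read()          # raw is a str; raw['url'] raises TypeError
json_data = json.loads(raw)  # parse the text into a dict
print json_data['url']       # key access now works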
It's possible the .read() method is moving the cursor to the end of the file, so a later read returns an empty string and json.loads has nothing to parse. Try handing the file object straight to json.load:
for filename in filenames:
    with open(os.path.join(dirname, filename)) as fd:
        json_data = json.load(fd)
and see where that gets you.
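To see the pitfall this describes, a minimal sketch (the test/example.json path is hypothetical):

import json

with open('test/example.json') as fd:  # hypothetical path
    first = fd.read()    # cursor is now at the end of the file
    second = fd.read()   # returns '' because the cursor was never reset
json.loads(first)        # parses fine, assuming the file holds valid JSON
json.loads(second)       # ValueError: No JSON object could be decoded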
This, of course, assumes you have valid JSON, as your example demonstrates. (Look out for trailing commas)
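For example, a single trailing comma is enough to trigger the same error:

import json

json.loads('{"method": "get"}')   # parses fine
json.loads('{"method": "get",}')  # ValueError: a trailing comma is not valid JSON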
For me it was an encoding problem. You can try using Notepad++ to edit your .json file and change the encoding to UTF-8 without BOM. Another thing to check is whether your JSON is actually valid.
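If editing every file by hand is not an option, one way to tolerate a BOM from Python 2 is the 'utf-8-sig' codec, which strips the BOM when present and decodes plain UTF-8 otherwise. A sketch (the test/example.json path is hypothetical):

import codecs
import json

with codecs.open('test/example.json', encoding='utf-8-sig') as fd:  # hypothetical path
    json_data = json.loads(fd.read())  # fd.read() returns unicode with the BOM removed
print json_data['url']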