Currently I have this dictionary, printed using pprint:  
{'AlarmExTempHum': '\x00\x00\x00\x00\x00\x00\x00\x00',
 'AlarmIn': 0,
 'AlarmOut': '\x00\x00',
 'AlarmRain': 0,
 'AlarmSoilLeaf': '\x00\x00\x00\x00',
 'BarTrend': 60,
 'BatteryStatus': 0,
 'BatteryVolts': 4.751953125,
 'CRC': 55003,
 'EOL': '\n\r',
 'ETDay': 0,
 'ETMonth': 0,
 'ETYear': 0,
 'ExtraHum1': None,
 'ExtraHum2': None,
 'ExtraHum3': None,
 'ExtraHum4': None,
 'ExtraHum5': None,
 'ExtraHum6': None,
 'ExtraHum7': None,
 'ExtraTemp1': None,
 'ExtraTemp2': None,
 'ExtraTemp3': None,
 'ExtraTemp4': None,
 'ExtraTemp5': None,
 'ExtraTemp6': None,
 'ExtraTemp7': None,
 'ForecastIcon': 2,
 'ForecastRuleNo': 122,
 'HumIn': 31,
 'HumOut': 94,
 'LOO': 'LOO',
 'LeafTemps': '\xff\xff\xff\xff',
 'LeafWetness': '\xff\xff\xff\x00',
 'NextRec': 37,
 'PacketType': 0,
 'Pressure': 995.9363359295631,
 'RainDay': 0.0,
 'RainMonth': 0.0,
 'RainRate': 0.0,
 'RainStorm': 0.0,
 'RainYear': 2.8,
 'SoilMoist': '\xff\xff\xff\xff',
 'SoilTemps': '\xff\xff\xff\xff',
 'SolarRad': None,
 'StormStartDate': '2127-15-31',
 'SunRise': 849,
 'SunSet': 1611,
 'TempIn': 21.38888888888889,
 'TempOut': 0.8888888888888897,
 'UV': None,
 'WindDir': 219,
 'WindSpeed': 3.6,
 'WindSpeed10Min': 3.6}

When I do this:
import json
d = (my dictionary above)
jsonarray = json.dumps(d)

I get this error:

UnicodeDecodeError: 'utf8' codec can't decode byte 0xff in position 0: invalid start byte
Python's standard library includes a json module whose built-in dumps() function converts a dictionary into a JSON string.
You can convert a dictionary to a JSON string using the json.dumps() method. The process of encoding data as JSON is usually called serialization: transforming data into a series of bytes (hence "serial") that can be stored or transmitted across a network.
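For a dictionary containing only JSON-serializable types, this round-trips cleanly. A minimal sketch (the key names here are made up for illustration, not taken from the weather-station dictionary above):

```python
import json

# A small dictionary with only JSON-friendly value types (numbers, strings).
record = {"TempIn": 21.4, "HumIn": 31, "WindDir": 219}

# json.dumps() serializes the dict to a JSON-formatted string.
encoded = json.dumps(record, sort_keys=True)
print(encoded)  # {"HumIn": 31, "TempIn": 21.4, "WindDir": 219}

# json.loads() reverses the process (deserialization).
decoded = json.loads(encoded)
assert decoded == record
```

The error in the question arises precisely because some of the values are raw byte strings, which are not JSON-friendly.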
Introducing JSON

JSON is a way of representing arrays and dictionaries of values (string, int, float, double) as text. In a JSON document, arrays are denoted by [ ] and dictionaries (objects) are denoted by { }.
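Both constructs can nest: a Python dict containing a list maps directly onto a JSON object containing an array. A quick sketch (values invented for illustration):

```python
import json

# A nested structure: a dict containing a list, mirroring JSON's { } and [ ].
data = {"readings": [0.9, 1.2, 3.6], "station": "vantage"}

print(json.dumps(data, sort_keys=True))
# {"readings": [0.9, 1.2, 3.6], "station": "vantage"}
```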
If you are fine with non-printable symbols in your JSON, add ensure_ascii=False to the dumps() call:
>>> json.dumps(your_data, ensure_ascii=False)

If ensure_ascii is false, then the return value will be a unicode instance subject to normal Python str to unicode coercion rules instead of being escaped to an ASCII str.
ensure_ascii=False really only defers the issue to the decoding stage:
>>> dict2 = {'LeafTemps': '\xff\xff\xff\xff'}
>>> json1 = json.dumps(dict2, ensure_ascii=False)
>>> print(json1)
{"LeafTemps": "����"}
>>> json.loads(json1)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/json/__init__.py", line 328, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 365, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 381, in raw_decode
    obj, end = self.scan_once(s, idx)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xff in position 0: invalid start byte

Ultimately you can't store raw bytes in a JSON document, so you'll want to use some means of unambiguously encoding a sequence of arbitrary bytes as an ASCII string - such as base64.
>>> import json
>>> from base64 import b64encode, b64decode
>>> my_dict = {'LeafTemps': '\xff\xff\xff\xff'}
>>> my_dict['LeafTemps'] = b64encode(my_dict['LeafTemps'])
>>> json.dumps(my_dict)
'{"LeafTemps": "/////w=="}'
>>> json.loads(json.dumps(my_dict))
{u'LeafTemps': u'/////w=='}
>>> new_dict = json.loads(json.dumps(my_dict))
>>> new_dict['LeafTemps'] = b64decode(new_dict['LeafTemps'])
>>> print new_dict
{u'LeafTemps': '\xff\xff\xff\xff'}
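The session above is Python 2, where raw bytes live in ordinary str values. In Python 3 the same data would be a bytes object, which json.dumps() rejects outright with a TypeError, and b64encode() also returns bytes, so an extra .decode('ascii') step is needed. A Python 3 sketch of the same base64 round-trip:

```python
import json
from base64 import b64encode, b64decode

# In Python 3, raw sensor bytes are a bytes object, not a str.
my_dict = {'LeafTemps': b'\xff\xff\xff\xff'}

# base64-encode the bytes, then decode to ASCII text so json.dumps() accepts it.
my_dict['LeafTemps'] = b64encode(my_dict['LeafTemps']).decode('ascii')

encoded = json.dumps(my_dict)
print(encoded)  # {"LeafTemps": "/////w=="}

# Reverse: parse the JSON, then base64-decode back to the original bytes.
new_dict = json.loads(encoded)
new_dict['LeafTemps'] = b64decode(new_dict['LeafTemps'])
assert new_dict['LeafTemps'] == b'\xff\xff\xff\xff'
```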