I'm using pandas 0.12.0 and am seeing behaviour that contradicts the docs when converting a Series or DataFrame to JSON.
If I create a series with a few dates that includes a null value, I get something like this:
>>> s = pandas.Series(data=[datetime.datetime.now(), datetime.datetime.now(), None])
>>> s
0 2013-11-07 16:10:47.530771
1 2013-11-07 16:10:47.530782
2 None
dtype: object
According to http://pandas.pydata.org/pandas-docs/dev/io.html#writing-json, when converting to json, None, NaT and NaN values should be output as null.
If I then call to_json, I get null for the third entry, as expected.
>>> s.to_json()
'{"0":1383840647530771000,"1":1383840647530782000,"2":null}'
However, I need to make sure the datatype is datetime64[ns] for some other calculations, so I convert the fields to datetime in Pandas like so:
>>> t = pandas.to_datetime(s)
>>> t
0 2013-11-07 16:10:47.530771
1 2013-11-07 16:10:47.530782
2 NaT
dtype: datetime64[ns]
The None is now NaT, which is consistent and expected. But when I output JSON again, I get a large negative number for the NaT entry instead of the null I was expecting.
>>> t.to_json()
'{"0":1383840647530771000,"1":1383840647530782000,"2":-9223372036854775808}'
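That negative number isn't random: it's the minimum int64 value, which pandas uses internally as the sentinel for NaT, so to_json appears to be leaking the raw storage value instead of translating it. A quick check (plain NumPy, nothing pandas-specific):

```python
import numpy as np

# pandas stores datetime64[ns] as int64 nanoseconds and reserves the
# smallest int64 as the NaT sentinel; to_json emits that raw value.
print(np.iinfo(np.int64).min)  # -9223372036854775808
```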
It gets even worse with date_format='iso', since it tries to format the date, but most parsers can't make sense of the resulting string, and that wreaks all kinds of havoc down the line.
>>> t.to_json(date_format='iso')
'{"0":"2013-11-07T16:10:47.530771","1":"2013-11-07T16:10:47.530782","2":"0001-255-255T00:00:00"}'
Any thoughts on how I should proceed here? Thanks!
EDIT:
Looks like this is a problem with the string representation of pandas.NaT?
>>> str(pandas.NaT)
'0001-255-255 00:00:00'
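If you need to keep the datetime64[ns] dtype for your other calculations, one workaround (a sketch, assuming you only need the conversion at serialization time) is to cast back to object dtype and swap NaT for None right before calling to_json, which restores the documented null output:

```python
import pandas as pd

t = pd.Series(pd.to_datetime(['2013-11-07 16:10:47', None]))

# Cast to object dtype and replace NaT with None; to_json then emits
# null for the missing entry, as the docs describe.
clean = t.astype(object).where(t.notnull(), None)
print(clean.to_json())
```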
A bit hacky, but you could do this:
In [13]: s = Series(pd.to_datetime(['20130101',None]))
In [14]: s
0 2013-01-01 00:00:00
1 NaT
dtype: datetime64[ns]
In [15]: def f(x):
   ....:     if isnull(x):
   ....:         return 'null'
   ....:     return x.isoformat()
   ....:
In [16]: s.apply(f).to_json()
Out[16]:
'{"0":"2013-01-01T00:00:00","1":"null"}'
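For reference, the same approach as a self-contained script (only explicit imports added):

```python
import pandas as pd
from pandas import isnull

s = pd.Series(pd.to_datetime(['20130101', None]))

def f(x):
    # Emit the literal string 'null' for missing values,
    # and an ISO 8601 string otherwise.
    if isnull(x):
        return 'null'
    return x.isoformat()

print(s.apply(f).to_json())
# '{"0":"2013-01-01T00:00:00","1":"null"}'
```

Note that this produces the JSON string "null" rather than a real JSON null, matching the output above; return None from f instead if you want an actual null.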