`aeson` seems to take a somewhat simple-minded approach to parsing JSON: it parses a top-level JSON value (an object or array) into its own fixed representation and then offers facilities to help users convert that representation to their own types. This approach works pretty well when JSON objects and arrays are small. When they're very large, things start to fall apart, because user code can't do anything until the JSON value has been completely read and parsed. This seems particularly unfortunate since JSON appears to be designed for recursive descent parsers: it seems like it should be fairly simple to allow user code to step in and say how each piece should be parsed. Is there a deep reason `aeson` and the earlier `json` library work this way, or should I try to make a new library for more flexible JSON parsing?
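For concreteness, here is a minimal sketch of the two-phase model the question describes, using `aeson`'s standard `decode`/`FromJSON` machinery (the `Person` type is invented for illustration):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- aeson decodes the *entire* input into its internal Value
-- representation before the FromJSON conversion runs, so nothing
-- is delivered to the caller until the whole document is parsed.
import Data.Aeson

newtype Person = Person { name :: String }
  deriving (Show, Eq)

instance FromJSON Person where
  parseJSON = withObject "Person" $ \o -> Person <$> o .: "name"

main :: IO ()
main =
  -- No Person is available until the whole array has been read,
  -- however long it is.
  print (decode "[{\"name\":\"Ann\"},{\"name\":\"Bo\"}]" :: Maybe [Person])
```

With a multi-gigabyte array, that intermediate `Value` for the whole document is exactly the cost the question is worried about.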
`json-stream` is a stream-based parser. This comparison is a bit out of date (2015), but its authors took the benchmarks from `aeson` and ran the two libraries against each other: aeson and json-stream performance comparison. There is one case where `json-stream` is significantly worse than `aeson`.

If you just want a faster `aeson` (not streaming), haskell-sajson looks interesting. It wraps a performant C++ library in Haskell and returns `aeson`'s `Value` type.
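`json-stream`'s own combinator API is richer than this, but the payoff of stream-based parsing can be seen even in a toy hand-rolled recursive descent parser (base only, covering just the arrays-of-integers subset of JSON; every name here is invented for illustration): because results are produced lazily, a consumer can act on early elements before the rest of the input has been read.

```haskell
import Data.Char (isDigit, isSpace)

-- Parse "[1, 2, 3]" into a lazy list of Ints. Each element is
-- handed to the consumer as soon as it is parsed, rather than
-- after the whole document has been read.
parseIntArray :: String -> [Int]
parseIntArray s = case dropWhile isSpace s of
  '[' : rest -> go (dropWhile isSpace rest)
  _          -> error "expected '['"
  where
    go (']' : _) = []
    go s' =
      let (digits, rest) = span isDigit s'
          n = read digits :: Int
      in n : case dropWhile isSpace rest of
               ',' : more -> go (dropWhile isSpace more)
               ']' : _    -> []
               _          -> error "expected ',' or ']'"

main :: IO ()
main =
  -- The "input" below is conceptually unbounded, yet take 3 only
  -- forces the prefix it needs.
  print (take 3 (parseIntArray ("[" ++ concatMap (\i -> show i ++ ",") [1..])))
  -- prints [1,2,3]
```

A real streaming library must also cope with chunked input, escapes, and error recovery, which is where incremental-parser machinery like `json-stream`'s comes in; this sketch only shows why JSON's grammar makes the per-piece hand-off natural.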