When my application tries to decode a large JSON string (~15K rows, returned from cURL), it fails with:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 91 bytes)
I know I can raise or remove the memory limit, but I'd rather avoid that. I've been wondering whether there is a different approach to this kind of issue, such as splitting the JSON string into smaller chunks (like array_chunk).
UPDATE
Just to make sure the issue isn't caused by some other function or loop in the app, I extracted the JSON string into a file and tried to decode it directly from the file (file size = 11.8 MB). It still fails:
$y = json_decode( file_get_contents('/var/tmp/test.txt') );
UPDATE 2
The script runs on a Mac OS X environment. I've also tested it on an Ubuntu environment (also with a 128M memory limit), and there it works perfectly. Should I be concerned?
To permanently avoid this, use an event-based JSON parser like https://github.com/salsify/jsonstreamingparser.
That way, the whole thing doesn't have to be in memory at once. Instead, you process the events, which give you one piece of the object/array at a time.
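A minimal sketch of that approach, assuming salsify/jsonstreamingparser is installed via Composer; the class and method names below follow the project's README for recent releases, so verify them against the version you actually install. The listener here only counts completed objects, but the same pattern lets you process each piece as it arrives without keeping the whole document in memory:

<?php
// Sketch only: assumes salsify/jsonstreamingparser is installed via Composer.
// Listener interface names/signatures may differ between library versions.
require 'vendor/autoload.php';

use JsonStreamingParser\Listener\ListenerInterface;
use JsonStreamingParser\Parser;

// A deliberately tiny listener: it never stores the whole document,
// it just reacts to parse events (here, counting completed objects).
class RowCounter implements ListenerInterface
{
    public int $rows = 0;

    public function startDocument(): void {}
    public function endDocument(): void {}
    public function startObject(): void {}
    public function endObject(): void { ++$this->rows; } // one object finished
    public function startArray(): void {}
    public function endArray(): void {}
    public function key(string $key): void {}
    public function value($value): void {}
    public function whitespace(string $whitespace): void {}
}

$stream = fopen('/var/tmp/test.txt', 'r'); // the file from the question
$listener = new RowCounter();

try {
    (new Parser($stream, $listener))->parse();
} finally {
    fclose($stream);
}

echo "Objects parsed: {$listener->rows}\n";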
There are no other built-in PHP functions that let you decode a JSON string. You can try to split the JSON into parts yourself, or find a library that does it.
However, you should first make sure that this is really the only problem. For example, before decoding the JSON you may have already created big arrays or many objects.
If I were you, I would save this JSON string to a file and write a separate script that does nothing but read it from the file and decode it, to confirm that json_decode alone is what exhausts the memory.
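A small standalone test along those lines might look like this (the file path is just an example; point it at wherever you dumped the JSON). It decodes and nothing else, then reports the decode status and peak memory:

<?php
// Isolation test (sketch): decode only, nothing else in the script.
ini_set('memory_limit', '128M');

$json = file_get_contents('/var/tmp/test.txt');
$data = json_decode($json);

echo 'json_last_error: ', json_last_error_msg(), PHP_EOL;
echo 'peak memory: ', round(memory_get_peak_usage(true) / 1048576, 1), ' MB', PHP_EOL;

If this script alone blows the limit, the JSON really is too big to decode in one go; if it doesn't, the rest of your application is what is using up the memory.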
One of the simplest ways to iterate over a big JSON file in PHP is to use halaxa/json-machine. You only write one foreach. It will never hit the memory limit, because it parses one item at a time behind the scenes, so memory consumption stays constant no matter the file size.
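A short sketch, assuming the file's top level is a JSON array and a recent (1.x) release of halaxa/json-machine, where the entry point is Items::fromFile (older releases used JsonMachine\JsonMachine::fromFile instead):

<?php
// Sketch using halaxa/json-machine, installed via Composer.
require 'vendor/autoload.php';

use JsonMachine\Items;

$rows = Items::fromFile('/var/tmp/test.txt'); // the file from the question

foreach ($rows as $key => $row) {
    // Each $row is decoded on its own, so memory use stays flat
    // regardless of how large the file is.
    // ... process one row ...
}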