PHP memory exhausted (json_decode)

Tags: json, php, memory

When my application tries to decode a large JSON string (~15K rows, returned by cURL), it fails with:

Allowed memory size of 134217728 bytes exhausted (tried to allocate 91 bytes)

I know I can raise the memory limit or remove it entirely, but I'd rather avoid that. I have been wondering whether there is a different approach to this kind of issue, such as splitting the JSON string into small chunks (like array_chunk).

UPDATE

Just to make sure the issue is not caused by any other function or loop in the app, I've extracted the JSON string into a file and tried to decode it directly from that file (file size = 11.8 MB). It still fails:

$y = json_decode( file_get_contents('/var/tmp/test.txt') );

UPDATE 2

The script runs in a Mac OS X environment. I've also tested it in an Ubuntu environment (also with a 128M memory limit), and there it works perfectly. Should I be concerned?

Dan asked Jul 17 '14 13:07


3 Answers

To avoid this problem permanently, use an event-based (streaming) JSON parser such as https://github.com/salsify/jsonstreamingparser.

That way, the whole document doesn't have to be in memory at once. Instead, you handle parse events that give you one piece of the object/array at a time.
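
For illustration, here is a minimal sketch of such a custom listener. The Parser and IdleListener class names are taken from the library's README at the time of writing and may differ between versions; CountingListener and the file path are made up for the example, so treat this as an assumption rather than a verified API.

<?php
// Sketch only: assumes the Parser / IdleListener classes documented in the
// salsify/jsonstreamingparser README; adjust namespaces to your installed version.
require 'vendor/autoload.php';

use JsonStreamingParser\Listener\IdleListener;
use JsonStreamingParser\Parser;

// Hypothetical listener: reacts to parse events instead of holding the whole
// decoded structure in memory at once.
class CountingListener extends IdleListener
{
    public $values = 0;

    public function value($value): void
    {
        // Each scalar arrives here one at a time; process it and let it go.
        $this->values++;
    }
}

$stream   = fopen('/var/tmp/test.txt', 'r');
$listener = new CountingListener();

try {
    $parser = new Parser($stream, $listener);
    $parser->parse(); // streams the file, so memory use stays roughly flat
} finally {
    fclose($stream);
}

echo "Scalar values seen: {$listener->values}\n";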

humbads answered Oct 23 '22 22:10


There is no other built-in PHP function for decoding a JSON string. You can try to write your own parser or find a library that splits the JSON into parts.

However, you should first make sure json_decode really is the only problem. For example, before decoding the JSON data you may already have created big arrays or many objects that are eating into the memory limit.

If I were you, I would save this JSON string to a file and write a separate script that only reads the file and decodes it, to confirm that json_decode alone causes the problem.
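
As a rough sketch of that isolation test (the path and the 128M limit come from the question; everything else uses only core PHP functions):

<?php
// Standalone test: decode the dumped JSON with nothing else loaded. If memory
// is exhausted here too, the problem is isolated to the decoding itself.
ini_set('memory_limit', '128M'); // match the limit from the error message

$raw  = file_get_contents('/var/tmp/test.txt'); // the 11.8 MB dump from the question
$data = json_decode($raw, true);                // true = decode to associative arrays

if ($data === null && json_last_error() !== JSON_ERROR_NONE) {
    echo 'json_decode failed: ' . json_last_error_msg() . PHP_EOL;
} else {
    echo 'Decoded ' . count($data) . ' top-level items' . PHP_EOL;
}

echo 'Peak memory: ' . round(memory_get_peak_usage(true) / 1048576, 1) . ' MB' . PHP_EOL;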

Marcin Nabiałek answered Oct 23 '22 22:10


One of the simplest ways to iterate over a big JSON file in PHP is to use halaxa/json-machine. You only write one foreach. It never hits the memory limit, because it parses one item at a time behind the scenes, so memory consumption stays constant no matter the file size.
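
A hedged sketch of what that looks like, assuming the Items::fromFile() API from the json-machine README (older releases exposed a different entry point, and the file path is just the one from the question):

<?php
// Sketch only: assumes json-machine's Items::fromFile() as documented in its
// README; adjust to the API of your installed version.
require 'vendor/autoload.php';

use JsonMachine\Items;

// Items are decoded lazily, one at a time, while iterating.
$items = Items::fromFile('/var/tmp/test.txt');

foreach ($items as $key => $row) {
    // Only the current $row is held in memory; handle it and move on,
    // e.g. insert into a database, write to another file, update counters.
}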

Filip Halaxa answered Oct 24 '22 00:10