 

Memory exhausted error for json_decode with PHP

I have the following code:

<?php
$FILE="giant-data-barf.txt";

$fp = fopen($FILE,'r');

//read everything into data
$data = fread($fp, filesize($FILE));
fclose($fp);

$data_arr = json_decode($data);
var_dump($data_arr);
?>

The file giant-data-barf.txt is, as its name suggests, a huge file (it's 5.4 MB right now, but it could grow to several GB).

When I execute this script, I get the following error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 71 bytes) in ........./data.php on line 12

I looked at possible solutions and saw this:

ini_set('memory_limit','16M');

My question is: is there a limit to how high I should set the memory limit? Or is there a better way of solving this problem?

asked Feb 08 '11 by Tony Stark

2 Answers

THIS IS A VERY BAD IDEA. That said, you'll need to set

ini_set('memory_limit',filesize($FILE) + SOME_OVERHEAD_AMOUNT);

because you're reading the entire thing into memory. You may very well have to set the memory limit to twice the size of the file, since you also want to json_decode() it.
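For example, a minimal sketch (the 2x multiplier and the 16 MB of headroom are guesses at decode overhead, not measured figures):

<?php
$FILE = "giant-data-barf.txt";

// Rough estimate: the raw string plus the decoded structures. Actual
// overhead depends on the JSON's shape (many small objects cost far
// more than one big string), so measure before trusting this.
$needed = filesize($FILE) * 2 + 16 * 1024 * 1024;

// memory_limit accepts a plain byte count passed as a string.
ini_set('memory_limit', (string) $needed);
?>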

NOTE THAT ON A WEB SERVER THIS WILL CONSUME MASSIVE AMOUNTS OF MEMORY AND YOU SHOULD NOT DO THIS IF THE FILE WILL BE MANY GIGABYTES AS YOU SAID!!!!

Is it really a giant JSON blob? You should look at converting this to a database, or to another format that gives you random or row-level access, before parsing it with PHP.
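If you can convert the blob to newline-delimited JSON (one complete object per line), you can process it with flat memory use no matter how big the file gets. A minimal sketch, assuming that conversion has already been done:

<?php
// Assumes giant-data-barf.txt has been converted to NDJSON:
// one complete JSON object per line.
$fp = fopen('giant-data-barf.txt', 'r');
if ($fp === false) {
    die('could not open file');
}

while (($line = fgets($fp)) !== false) {
    $row = json_decode($line, true);
    if ($row === null) {
        continue; // skip blank or malformed lines
    }
    // Process one row at a time here (insert into a database,
    // update an aggregate, etc.); memory use stays flat.
}

fclose($fp);
?>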

answered Oct 06 '22 by Josh

I've given all my servers a memory_limit of 100M... and haven't run into trouble yet.

I would consider splitting up that file somehow, or getting rid of it and using a database.
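For instance, a minimal sketch of loading line-delimited JSON into SQLite with PDO so it can be queried row by row afterwards (the records table and payload column are invented for this example):

<?php
// Hypothetical example: load NDJSON rows into SQLite so they can be
// queried with row-level access instead of decoding the whole blob.
$db = new PDO('sqlite:data.db');
$db->exec('CREATE TABLE IF NOT EXISTS records (id INTEGER PRIMARY KEY, payload TEXT)');

$stmt = $db->prepare('INSERT INTO records (payload) VALUES (?)');
$fp = fopen('giant-data-barf.txt', 'r');

$db->beginTransaction(); // batch the inserts for speed
while (($line = fgets($fp)) !== false) {
    $line = trim($line);
    if ($line === '') {
        continue;
    }
    $stmt->execute([$line]);
}
$db->commit();
fclose($fp);
?>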

answered Oct 06 '22 by david.wosnitza