 

Read and parse contents of very large file [duplicate]

I am trying to parse a tab delimited file that is ~1GB in size.

When I run the script I get:

Fatal error: Allowed memory size of 1895825408 bytes exhausted  (tried to allocate 1029206974 bytes) ...

My script at the moment is just:

$file = file_get_contents('allCountries.txt');

$file = str_replace(array("\r\n", "\t"), array("[NEW*LINE]", "[tAbul*Ator]"), $file);

I have set the memory limit in php.ini to -1, which then gives me:

Fatal error: Out of memory (allocated 1029963776) (tried to allocate 1029206974 bytes)

Is there any way to open the file in parts and move on to the next part, so that less memory is used at one time?

asked Feb 13 '13 by imperium2335

2 Answers

Yes, you can read it line by line:

$handle = @fopen("/tmp/inputfile.txt", "r");
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {
        echo $buffer;
    }
    fclose($handle);
}
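Applied to the original question, the same line-by-line approach keeps memory usage flat regardless of file size. A minimal sketch, assuming the replacements from the question should be applied per line (the function name and output filename are placeholders, not from the answer):

```php
<?php
// Stream-convert a tab-delimited file without loading it all into memory.
// Only one line is held in memory at a time.
function convertFile($inPath, $outPath)
{
    $in  = fopen($inPath, 'r');
    $out = fopen($outPath, 'w');
    if (!$in || !$out) {
        return false;
    }
    while (($line = fgets($in)) !== false) {
        // Same replacements as the original script, applied per line.
        // fgets keeps the trailing line ending, so "\r\n" is still matched.
        $line = str_replace(
            array("\r\n", "\t"),
            array("[NEW*LINE]", "[tAbul*Ator]"),
            $line
        );
        fwrite($out, $line);
    }
    fclose($in);
    fclose($out);
    return true;
}

// Usage (filenames are placeholders):
// convertFile('allCountries.txt', 'allCountries.out');
```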
answered Sep 21 '22 by Ranty

You have to read the file in blocks. See the answer to this question: https://stackoverflow.com/a/6564818/1572528

For files that are not as large, you can also try adjusting the memory limit:

ini_set('memory_limit', '32M'); //max size 32m
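The linked answer reads fixed-size blocks with fread(). A minimal sketch of that approach (the function name, callback signature, and chunk size are illustrative, not from the answer):

```php
<?php
// Read a file in fixed-size blocks, passing each block to a callback,
// so at most $chunkSize bytes are held in memory at a time.
function readInChunks($path, $chunkSize, callable $onChunk)
{
    $handle = fopen($path, 'r');
    if (!$handle) {
        return false;
    }
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        // Note: a tab or line ending may be split across two chunks,
        // so delimiter-aware parsing needs to carry state between calls.
        $onChunk($chunk);
    }
    fclose($handle);
    return true;
}
```

Line-by-line reading with fgets (as in the first answer) avoids the split-delimiter problem, so prefer it when the records are line-oriented; raw blocks are mainly useful when lines can be arbitrarily long.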
answered Sep 20 '22 by Jordi Kroon