I've successfully written a script that downloads a CSV file using cURL and then parses the CSV into an array as follows:
$rows = array_map(function ($a) use ($delimiter) {
    return str_getcsv($a, $delimiter);
}, explode("\n", $result));
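Here $result is the raw CSV body returned by the cURL call, roughly like this (the URL below is just a placeholder):

// Fetch the CSV into a string; the URL is a placeholder
$ch = curl_init('https://example.com/export.csv');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);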
Then I iterate over $rows with a foreach loop to save the data into the database.
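The save step has roughly this shape (a simplified sketch; the PDO connection and the csv_import table/column names are placeholders, not the real ones from my script):

// Placeholder connection and table/column names, not the real ones
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO csv_import (col_a, col_b) VALUES (?, ?)');

foreach ($rows as $row) {
    // explode("\n", ...) leaves an empty last element, which str_getcsv() turns into [null]
    if ($row === [null]) {
        continue;
    }
    $stmt->execute([$row[0], $row[1]]);
}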
The script works fine, but with a larger CSV file (>10.000 lines) it becomes rather slow and throws a lot more errors.
I would like to cut the CSV file into pieces so that the whole file isn't loaded into a single variable. I found the following solution, but it still processes the whole file at once.
Is there a way to cut the CSV into pieces and run the database function several times? Or is there a better way to process a large CSV file like this?
I'm relatively new to processing large files, so please be kind!
Save the file somewhere and then process it in chunks like this:
<?php
$filePath = 'big.csv';

// How many rows to process in each batch
$limit = 100;

$fileHandle = fopen($filePath, "r");
if ($fileHandle === FALSE)
{
    die('Error opening ' . $filePath);
}

// Set up a variable to hold our current position in the file
$offset = 0;

while (!feof($fileHandle))
{
    // Go back to where we were when we ended the last batch
    fseek($fileHandle, $offset);
    $i = 0;

    while (($currRow = fgetcsv($fileHandle)) !== FALSE)
    {
        $i++;

        // Do something with the current row
        print implode(', ', $currRow) . "\n";

        // If we hit our batch limit, remember our position and start the next batch
        if ($i >= $limit)
        {
            // Update our current position in the file
            $offset = ftell($fileHandle);

            // Break out of the row processing loop
            break;
        }
    }
}

// Close the file
fclose($fileHandle);
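Since each row ends up in the database, it usually also helps to wrap every batch in a transaction instead of inserting rows one by one with autocommit. A minimal sketch, assuming a PDO connection and a placeholder csv_import table with columns col_a and col_b (replace those with your own):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO csv_import (col_a, col_b) VALUES (?, ?)');

$limit = 100;
$fileHandle = fopen('big.csv', 'r');

$i = 0;
$pdo->beginTransaction();

while (($currRow = fgetcsv($fileHandle)) !== FALSE)
{
    // fgetcsv() returns [null] for blank lines; skip those
    if ($currRow === [null])
    {
        continue;
    }

    $stmt->execute([$currRow[0], $currRow[1]]);

    // Commit every $limit rows so one failed batch doesn't undo everything
    if (++$i % $limit === 0)
    {
        $pdo->commit();
        $pdo->beginTransaction();
    }
}

// Commit the final, possibly partial batch
$pdo->commit();
fclose($fileHandle);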