I have a large csv that I want to parse and insert into my database. I have this PHP:
$target = '../uploads/'.$f;
$handle = fopen($target, "r");
$data = fgetcsv($handle, 0, ",");
$rows = array();
while ($data !== FALSE) {
    $rows[] = $data;
}
fclose($handle);
if (count($rows)) {
    foreach ($rows as $key => $value) {
        echo $value;
    }
}
Every time I try to run my script I get this error:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 35 bytes)
Any ideas how to do this?
An "Out of Memory" error can occur when a Database Node Memory (KB) becomes less than 2 percent of the target size, and it cannot discard database pages on the node anymore to get free pages.
To read large text files in Python, we can use the file object as an iterator to iterate over the file and perform the required task. Since the iterator just iterates over the entire file and does not require any additional data structure for data storage, the memory consumed is less comparatively.
I think this part is wrong:
$data = fgetcsv($handle, 0, ",");
$rows = array();
while ($data !== FALSE) {
    $rows[] = $data;
}
One call to fgetcsv reads one line from $handle. Because $data is never reassigned inside the loop, the condition never becomes FALSE, so $rows keeps growing until memory is exhausted. You need to put fgetcsv in the loop condition and process each row as you read it:
$handle = fopen($target, "r");
while (($row = fgetcsv($handle, 0, ",")) !== FALSE) {
    // Example insert - obviously use prepared statements/escaping/another DAL
    $db->query("INSERT INTO tbl (columns..) VALUES ({$row[0]}, {$row[1]} ... )");
}
fclose($handle);
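If $db happens to be a PDO connection, a minimal sketch of the same loop with a prepared statement might look like this (the table and column names are placeholders, adjust them to your schema):

$handle = fopen($target, "r");
// Prepare once, execute per row; rows are never accumulated in memory
$stmt = $db->prepare("INSERT INTO my_table (col1, col2) VALUES (?, ?)");
while (($row = fgetcsv($handle, 0, ",")) !== FALSE) {
    $stmt->execute(array($row[0], $row[1]));
}
fclose($handle);

Wrapping the loop in a transaction (beginTransaction()/commit()) also speeds up large imports considerably.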
While you can certainly parse the file and build the queries in PHP, you'll get much better performance by letting MySQL handle the import directly. Your database will thank you.
<?php
exec("mysqlimport [options] db_name textfile1");
?>
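mysqlimport is essentially a command-line interface to the LOAD DATA statement, so the same import can be issued as a query from PHP. A sketch only: the path, table name, and delimiters below are placeholders, and LOCAL INFILE must be enabled on both the client and the server.

<?php
// Sketch: adjust the path, table name, and delimiters to your CSV;
// requires local_infile to be enabled (or the FILE privilege for non-LOCAL).
$db->query("LOAD DATA LOCAL INFILE '../uploads/data.csv'
            INTO TABLE tbl
            FIELDS TERMINATED BY ','
            LINES TERMINATED BY '\\n'");
?>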