
Is it more efficient to construct a massive insert statement or many insert statements?

Tags: php, mysql, insert

I'm importing a CSV file into a MySQL database. I haven't looked into bulk insert yet, but I was wondering: is it more efficient to construct one massive INSERT statement (using PHP) by looping through the values, or to insert the CSV rows individually?

asked Jan 27 '26 by samxli

1 Answer

Inserting in bulk is much faster. I'll typically do something like the following, which inserts the data 100 records at a time (the 100-record batch size is arbitrary).

$a_query_inserts = array();
$i_progress = 0;

foreach( $results as $a_row ) {

    $i_progress++;
    // Values are interpolated raw here; non-numeric values must be
    // quoted and escaped before being added to the batch.
    $a_query_inserts[] = "({$a_row['Column1']}, {$a_row['Column2']}, {$a_row['Column3']})";

    // Flush the batch every 100 rows, or once the last row is reached
    if( count($a_query_inserts) >= 100 || $i_progress >= $results->rowCount() ) {

        $s_query = sprintf("INSERT INTO Table
            (Column1,
            Column2,
            Column3)
            VALUES
            %s",
            implode(', ', $a_query_inserts)
        );
        db::getInstance()->query($s_query);

        // Reset batch
        $a_query_inserts = array();
    }
}
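
If the CSV contains string data, a safer variation is to quote each value before building the batch. Here is a minimal sketch assuming a PDO connection; $pdo and its connection details are hypothetical placeholders, not part of the original answer:

    // Hypothetical PDO-based variant: quote every value so strings and
    // special characters are escaped before interpolation.
    $pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');

    $a_query_inserts = array();
    foreach( $results as $a_row ) {
        $a_query_inserts[] = sprintf(
            "(%s, %s, %s)",
            $pdo->quote($a_row['Column1']),
            $pdo->quote($a_row['Column2']),
            $pdo->quote($a_row['Column3'])
        );
    }
    $pdo->exec("INSERT INTO Table (Column1, Column2, Column3) VALUES "
        . implode(', ', $a_query_inserts));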

There is also a way to load the file directly into the database with MySQL's LOAD DATA INFILE, which is usually the fastest option of all for CSV imports.
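
A minimal sketch of that approach; the file path, table, and column names are placeholders, and using LOCAL requires the local_infile option to be enabled on both client and server:

    LOAD DATA LOCAL INFILE '/path/to/file.csv'
    INTO TABLE Table
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (Column1, Column2, Column3);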

answered Jan 30 '26 by Donnie