
PHP PDO insert multiple (10000+) same rows using bindParam. Good practice?

Tags: database, mysql

I need to be able to insert 10,000+ similar rows at once from a form request. Currently I do it with a single-row prepared statement looped 10,000 times, re-binding each variable with bindParam() on every iteration.

for ($i=0; $i < intval($cloneCount); $i++) 
{
    // ... 9 other bindParam() calls
    $insertG->bindParam(':v1', $v1, PDO::PARAM_STR);
    $insertG->bindParam(':v2', $v2, PDO::PARAM_INT);
    $insertG->execute();
}

It takes nearly 30 seconds to complete and is certainly not good practice. It's 10,000 rows today but could be 100,000 tomorrow.

If I insert multiple rows in one query with (v1,v2),(v1,v2),... I need to bind each value to a new parameter, so I believe I would end up with nearly 100,000 bound parameters in a single query. If the data is UTF-8 and I count around 2 bytes per character (I know it can be up to 4), my query would be around 10 to 20 MB, and the MySQL server is on another machine. Given that, I'm surprised my poorly designed request took only 30 seconds to succeed.

Is there a way to send only one row and tell the MySQL server to replicate it 10,000 times?

EDIT: PARTIAL SOLUTION

Following Bill Karwin's and Zsolt Szilagyi's advice, I got a 10,000-row insert to a remote MySQL server down to 5-6 seconds with the following tweaks:

$dataBase->beginTransaction();

$insertG = $dataBase->prepare('INSERT INTO G ...'); // column list elided
// ... 10 bindParam() calls of all kinds

for ($i=0; $i < 10000; ++$i) 
{ 
    $hashKey = sha1(uniqid().$i); // $hashKey is a bound param, re-read on each execute()
    $insertG->execute();
}
$dataBase->commit();
Asked Jun 02 '13 by Nicolas Manzini


2 Answers

You don't need to call bindParam() during every iteration of the loop. bindParam() binds the variables $v1, $v2, etc. by reference, so all you need to do is change the values of those variables and then re-execute the query. That can cut down on the overhead.

Also, you can avoid calling intval() on every pass through the loop. Just make sure $cloneCount is coerced to an integer once, before the loop. That's a very minor improvement, but it's good practice.

$cloneCount = (int) $cloneCount;

// ... 9 other bindParam() calls
$insertG->bindParam(':v1', $v1, PDO::PARAM_STR);
$insertG->bindParam(':v2', $v2, PDO::PARAM_INT);

for ($i = 0; $i < $cloneCount; $i++)
{
  // assign the next row's values to $v1, $v2, ... here;
  // they are bound by reference, so execute() picks up the new values
  $insertG->execute();
}

You should also avoid autocommit. Reduce MySQL's per-statement transaction overhead by starting an explicit transaction, inserting several thousand rows, and then committing the transaction.
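A minimal sketch of that pattern, reusing $dataBase, $insertG and $cloneCount from above (the chunk size of 1,000 is an arbitrary choice for illustration):

$dataBase->beginTransaction();
for ($i = 0; $i < $cloneCount; ++$i) {
    // update the bound variables for this row, then execute
    $insertG->execute();

    if (($i + 1) % 1000 === 0) {        // commit every 1,000 rows
        $dataBase->commit();
        $dataBase->beginTransaction();  // start the next chunk
    }
}
$dataBase->commit();                    // commit the final, partial chunk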

But the best way to speed up bulk INSERT of thousands of similar rows to a single table is to use LOAD DATA LOCAL INFILE instead of INSERT. This runs 10-20x faster than INSERT row by row, even if you use parameters, transactions, multi-row insert, and any other trick you can think of.

Even if you have to use PHP to write your data to a .CSV file on disk and then run LOAD DATA LOCAL INFILE on that file, it's still much faster.
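A sketch of that approach, assuming the rows are already collected in a PHP array $rows and using the table name G from the question (the connection details and the column names v1, v2, v3 are placeholders; LOCAL INFILE must be enabled on both the client and the server):

// enable LOCAL INFILE on the client side when creating the connection
$dataBase = new PDO(
    'mysql:host=db.example.com;dbname=mydb;charset=utf8',
    $user,
    $password,
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true)
);

// write the rows to a temporary CSV file
$csvPath = tempnam(sys_get_temp_dir(), 'bulk_');
$fh = fopen($csvPath, 'w');
foreach ($rows as $row) {       // each $row is an array of column values
    fputcsv($fh, $row);
}
fclose($fh);

// bulk-load the whole file in a single statement
$dataBase->exec(
    "LOAD DATA LOCAL INFILE " . $dataBase->quote($csvPath) . "
     INTO TABLE G
     FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
     LINES TERMINATED BY '\\n'
     (v1, v2, v3)"              // replace with the real column list
);

unlink($csvPath);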

See also Speed of INSERT Statements in the MySQL manual for more tips.

Answered by Bill Karwin


Build a wrapper object for bulk insert.

You want to have something like $bulkinsert->add($street,$zip); in your loop. Internally it should build a query string with multiple rows per INSERT:

  insert into table1 (First,Last) values 
  ("Fred","Smith"),
  ("John","Smith"),
  ("Michael","Smith"),
  ("Robert","Smith")
  ...;

I would execute it once after every 100-1000 calls of add(); 500 is a good trade-off between query size and execution time. That way you save 99.8% of the queries you currently issue.
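A rough sketch of such a wrapper (the class name, the table/column handling and the default batch size of 500 are my own choices for illustration; it uses ? placeholders rather than quoting values directly into the SQL string):

class BulkInserter
{
    private $pdo;
    private $sqlPrefix;         // e.g. "INSERT INTO table1 (First,Last) VALUES "
    private $rowWidth;          // number of columns per row
    private $batchSize;
    private $values = array();  // flattened values waiting to be flushed

    public function __construct(PDO $pdo, $table, array $columns, $batchSize = 500)
    {
        $this->pdo = $pdo;
        $this->sqlPrefix = "INSERT INTO $table (" . implode(',', $columns) . ') VALUES ';
        $this->rowWidth = count($columns);
        $this->batchSize = $batchSize;
    }

    public function add(array $row)
    {
        foreach ($row as $value) {
            $this->values[] = $value;
        }
        if (count($this->values) >= $this->rowWidth * $this->batchSize) {
            $this->flush();
        }
    }

    public function flush()
    {
        if (!$this->values) {
            return;
        }
        $rowCount = count($this->values) / $this->rowWidth;
        $oneRow = '(' . rtrim(str_repeat('?,', $this->rowWidth), ',') . ')';
        $sql = $this->sqlPrefix . rtrim(str_repeat($oneRow . ',', $rowCount), ',');
        $stmt = $this->pdo->prepare($sql);
        $stmt->execute($this->values);
        $this->values = array();
    }
}

// usage:
$bulk = new BulkInserter($dataBase, 'table1', array('First', 'Last'));
foreach ($people as $person) {   // $people is hypothetical input data
    $bulk->add(array($person['first'], $person['last']));
}
$bulk->flush();                  // don't forget the final, partially filled batch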

EDIT: As suggested in the other answer, move the intval() out of your loop. Additionally, use ++$i instead of $i++. (Long story short, $i++ has a small overhead that can usually be ignored, but you are in a tight loop where micro-optimisations matter.)

Answered by Zsolt Szilagyi