Insert a lot of data into a database in very small inserts

So I have a database where a lot of data is being inserted from a Java application. Usually I insert into table1 and get the last id, then insert into table2 and get the last id from there, and finally insert into table3 and get that id as well to work with in the application. I insert around 1000-2000 rows of data every 10-15 minutes.

Using a lot of small inserts and selects on a production web server is not really good, because it sometimes bogs down the server.

My question is: is there a way to insert data into table1, table2, and table3 without such a huge number of selects and inserts? Is there some SQL-fu technique I'm missing?

asked Jun 28 '10 by Gabriel

1 Answer

Since you're probably relying on auto_increment primary keys, you have to do the inserts one at a time, at least for table1 and table2, because MySQL won't give you more than the very last key generated.

You should never have to select, though. You can get the last inserted id from the Statement using the getGeneratedKeys() method. See the example in the MySQL manual for Connector/J:

http://dev.mysql.com/doc/refman/5.1/en/connector-j-usagenotes-basic.html#connector-j-examples-autoincrement-getgeneratedkeys
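
For example, here is a minimal sketch of that pattern with Connector/J. The table and column names (table1, col_a) are placeholders for illustration, not from the question:

    import java.sql.*;

    public class KeyExample {
        public static long insertAndGetId(Connection conn, String value)
                throws SQLException {
            // Ask the driver to return the auto_increment key with the insert.
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO table1 (col_a) VALUES (?)",  // hypothetical column
                    Statement.RETURN_GENERATED_KEYS)) {
                ps.setString(1, value);
                ps.executeUpdate();
                try (ResultSet keys = ps.getGeneratedKeys()) {
                    keys.next();
                    return keys.getLong(1);  // the generated id, no SELECT needed
                }
            }
        }
    }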

Other recommendations:

  • Use multi-row INSERT syntax for table3.
  • Use ALTER TABLE ... DISABLE KEYS while you're importing, and re-enable the keys when you're finished.
  • Use explicit transactions: begin a transaction before your data-loading routine, and commit at the end. I'd probably also commit after every 1000 rows of table1.
  • Use prepared statements. (A sketch combining these points follows this list.)
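
Putting those together, here is a rough sketch, assuming hypothetical column names (col_a, col_b, col_c) and one table1/table2 parent row per batch of table3 rows. It uses one explicit transaction, prepared statements throughout, and a single multi-row INSERT for table3:

    import java.sql.*;
    import java.util.List;

    public class BulkLoader {
        // cVals: the table3 data for one table1/table2 parent (assumed non-empty).
        public static void load(Connection conn, String aVal, String bVal,
                                List<String> cVals) throws SQLException {
            conn.setAutoCommit(false);  // begin an explicit transaction
            try (PreparedStatement ps1 = conn.prepareStatement(
                     "INSERT INTO table1 (col_a) VALUES (?)",
                     Statement.RETURN_GENERATED_KEYS);
                 PreparedStatement ps2 = conn.prepareStatement(
                     "INSERT INTO table2 (table1_id, col_b) VALUES (?, ?)",
                     Statement.RETURN_GENERATED_KEYS)) {

                ps1.setString(1, aVal);
                ps1.executeUpdate();
                long id1 = fetchKey(ps1);  // table1 id, no SELECT

                ps2.setLong(1, id1);
                ps2.setString(2, bVal);
                ps2.executeUpdate();
                long id2 = fetchKey(ps2);  // table2 id, no SELECT

                // Multi-row INSERT for table3: one statement, many rows.
                StringBuilder sql = new StringBuilder(
                    "INSERT INTO table3 (table2_id, col_c) VALUES ");
                for (int i = 0; i < cVals.size(); i++) {
                    sql.append(i == 0 ? "(?, ?)" : ", (?, ?)");
                }
                try (PreparedStatement ps3 = conn.prepareStatement(sql.toString())) {
                    int p = 1;
                    for (String c : cVals) {
                        ps3.setLong(p++, id2);
                        ps3.setString(p++, c);
                    }
                    ps3.executeUpdate();
                }
                conn.commit();
            } catch (SQLException e) {
                conn.rollback();
                throw e;
            }
        }

        private static long fetchKey(PreparedStatement ps) throws SQLException {
            try (ResultSet rs = ps.getGeneratedKeys()) {
                rs.next();
                return rs.getLong(1);
            }
        }
    }

Committing once per batch avoids the overhead of an implicit commit after every statement; for very large loads you could commit every 1000 table1 rows, as suggested above.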

Unfortunately, you can't use the fastest method for bulk-loading data, LOAD DATA INFILE, because it doesn't let you get the generated id values per row.

answered Oct 11 '22 by Bill Karwin