How do I insert, for example, 100,000 rows into a MySQL table with a single query?
INSERT INTO $table VALUES (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');
That inserts 3 rows in a single statement; extend the VALUES list as needed to work your way up to 100,000. I do blocks of ~1,000 rows that way when doing ETL work.
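For illustration, a minimal sketch of the chunked multi-row INSERT pattern; the table and column names here are hypothetical, so adjust them to your schema:

-- Hypothetical table; adjust columns and types to your data.
CREATE TABLE IF NOT EXISTS my_table (id INT PRIMARY KEY, col1 VARCHAR(32), col2 VARCHAR(32));

-- One multi-row INSERT per chunk of ~1,000 rows; repeat until all 100,000 rows are sent.
INSERT INTO my_table (id, col1, col2) VALUES (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');
INSERT INTO my_table (id, col1, col2) VALUES (1001, 'g', 'h'), (1002, 'i', 'j');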
If your data is sitting statically in a file, transforming it and using LOAD DATA INFILE will be the best method, but I'm guessing you're asking because you're doing something similar to my ETL case.
Also note what the other answer says about max_allowed_packet limiting the length of your query.
You can do a batch insert with a multi-row INSERT statement, but the query can't be bigger than (in practice, slightly less than) max_allowed_packet.
For 100k rows, depending on the size of each row, you will probably exceed this limit.
One way would be to split it up into several chunks. This is probably a good idea anyway.
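If you want to see the current limit (or raise it, given sufficient privileges), a quick check from the MySQL client looks like this; the 64 MB value is only an example:

-- Show the current limit (in bytes).
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it server-wide to 64 MB (example value); requires SUPER or SYSTEM_VARIABLES_ADMIN
-- and only takes effect for connections opened afterwards.
SET GLOBAL max_allowed_packet = 67108864;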
Alternatively, you can use LOAD DATA INFILE (or LOAD DATA LOCAL INFILE) to load from a tab-delimited (or otherwise delimited) file. See the MySQL docs for details.
LOAD DATA isn't subject to the max_allowed_packet limit.
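A minimal LOAD DATA sketch, assuming a tab-delimited file; the path, table, and column names are hypothetical, and LOCAL requires local_infile to be enabled on both server and client:

-- Load tab-delimited rows from a file on the client machine into a hypothetical table.
LOAD DATA LOCAL INFILE '/tmp/rows.tsv'
INTO TABLE my_table
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(id, col1, col2);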