I am now using batch:
```java
String query = "INSERT INTO table (id, name, value) VALUES (?, ?, ?)";
PreparedStatement ps = connection.prepareStatement(query);
for (Record record : records) {
    ps.setInt(1, record.id);
    ps.setString(2, record.name);
    ps.setInt(3, record.value);
    ps.addBatch();
}
ps.executeBatch();
```
I am just wondering if the above code is equivalent to the following code. If not, which is faster?
```java
String query = "INSERT INTO table (id, name, value) VALUES ";
for (Record record : records) {
    query += "(" + record.id + ",'" + record.name + "'," + record.value + "),";
}
// Drop only the trailing comma; substring(1, ...) would also chop
// the leading 'I' of INSERT.
query = query.substring(0, query.length() - 1);
PreparedStatement ps = connection.prepareStatement(query);
ps.executeUpdate();
```
Answer. Yes, instead of inserting each row with a separate INSERT statement, you can insert multiple rows in a single statement: list the value tuple for each row after the VALUES clause, separated by commas.
However, executeBatch will perform better than executeUpdate on a concatenated statement, as long as auto-commit is set to false:
```java
connection.setAutoCommit(false);
PreparedStatement ps = connection.prepareStatement(query);
for (Record record : records) {
    // set the parameters for this record, etc.
    ps.addBatch();
}
ps.executeBatch();
connection.commit();
```
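For very large record sets it is common practice to flush the batch every N rows rather than accumulating everything in one batch, which keeps driver-side memory bounded. The sketch below (not from the original answer; the class name, BATCH_SIZE, and the flushCount helper are illustrative) computes the flush points without a live connection; in real code each flush would be a call to ps.executeBatch().

```java
// Sketch: chunked batching. Each "flush" stands in for ps.executeBatch();
// no database connection is needed to show the chunking logic itself.
public class BatchChunking {
    static final int BATCH_SIZE = 1000; // illustrative chunk size

    // Returns how many executeBatch() calls a loop over `rows` records
    // would issue: one per full chunk, plus one final partial flush.
    static int flushCount(int rows) {
        int flushes = 0;
        int pending = 0;
        for (int i = 0; i < rows; i++) {
            pending++;                  // ps.addBatch();
            if (pending == BATCH_SIZE) {
                flushes++;              // ps.executeBatch();
                pending = 0;
            }
        }
        if (pending > 0) {
            flushes++;                  // final ps.executeBatch();
        }
        return flushes;
    }

    public static void main(String[] args) {
        // 2500 rows -> two full chunks of 1000 plus one partial of 500
        System.out.println(BatchChunking.flushCount(2500)); // prints 3
    }
}
```

The same pattern works with the auto-commit code above: either commit once at the end, or commit after each flush if partial progress is acceptable on failure.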
First of all, with query string concatenation you not only lose the type conversion that the PreparedStatement setter methods perform, but you also become vulnerable to SQL injection, i.e. malicious input being executed in the database.
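To make the injection risk concrete, here is a small sketch (not from the original post; the malicious name value is a hypothetical attacker input). With a PreparedStatement the same input would be stored as a literal string, because the parameter value is never spliced into the SQL text.

```java
// Sketch: why concatenating user input into SQL is dangerous.
public class InjectionDemo {
    public static void main(String[] args) {
        // Hypothetical attacker-controlled record name.
        String maliciousName = "x','0'); DROP TABLE table; --";

        // Concatenation splices the input straight into the statement text:
        String concatenated =
            "INSERT INTO table (id, name, value) VALUES (1,'"
            + maliciousName + "',2)";
        System.out.println(concatenated);
        // The resulting text now smuggles in a second, destructive statement.
        System.out.println(concatenated.contains("DROP TABLE")); // prints true
    }
}
```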
Second, prepared statements are precompiled and cached by the database itself, which already gives a significant performance improvement over plain Statements.