How do I batch sql statements with package database/sql

Tags:

sql

go

How do I batch sql statements with Go's database/sql package?

In Java I would do it like this:

// Create a prepared statement
String sql = "INSERT INTO my_table VALUES(?)";
PreparedStatement pstmt = connection.prepareStatement(sql);

// Insert 10 rows of data
for (int i = 0; i < 10; i++) {
    pstmt.setString(1, "" + i);
    pstmt.addBatch();
}

// Execute the batch
int[] updateCounts = pstmt.executeBatch();

How would I achieve the same in Go?

asked Sep 18 '12 by barnardh

People also ask

How do you do batching in SQL?

A batch of SQL statements is a group of two or more SQL statements or a single SQL statement that has the same effect as a group of two or more SQL statements. In some implementations, the entire batch statement is executed before any results are available.

What is Tx SQL?

You can execute database transactions using an sql.Tx, which represents a transaction. In addition to the Commit and Rollback methods representing transaction-specific semantics, sql.Tx has all of the methods you use to perform common database operations.
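This transaction-based pattern is the closest analogue in database/sql of the Java PreparedStatement/addBatch approach: prepare one statement inside a transaction and execute it repeatedly, then commit. A minimal sketch, assuming an open *sql.DB and a hypothetical single-column table my_table (these names are illustrative, not from the original question):

```go
package main

import "database/sql"

// batchInsert inserts each value with one prepared statement executed
// repeatedly inside a single transaction, mirroring Java's addBatch /
// executeBatch pattern. Requires a live database connection to run.
func batchInsert(db *sql.DB, values []string) error {
	tx, err := db.Begin()
	if err != nil {
		return err
	}
	stmt, err := tx.Prepare("INSERT INTO my_table VALUES (?)")
	if err != nil {
		tx.Rollback()
		return err
	}
	defer stmt.Close()
	for _, v := range values {
		if _, err := stmt.Exec(v); err != nil {
			tx.Rollback()
			return err
		}
	}
	return tx.Commit()
}
```

Unlike Java's executeBatch, each stmt.Exec here is its own round trip; the transaction ensures atomicity but not a single network call.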


How do you query in Go?

You can query for multiple rows using Query or QueryContext, which return a Rows value representing the query results. Your code iterates over the returned rows using Rows.Next; each iteration calls Scan to copy column values into variables.


2 Answers

Since the db.Exec function is variadic, one option (which actually makes only a single network round trip) is to construct the statement yourself, explode the arguments, and pass them in.

Sample code:

func BulkInsert(unsavedRows []*ExampleRowStruct) error {
    valueStrings := make([]string, 0, len(unsavedRows))
    valueArgs := make([]interface{}, 0, len(unsavedRows)*3)
    for _, post := range unsavedRows {
        valueStrings = append(valueStrings, "(?, ?, ?)")
        valueArgs = append(valueArgs, post.Column1)
        valueArgs = append(valueArgs, post.Column2)
        valueArgs = append(valueArgs, post.Column3)
    }
    stmt := fmt.Sprintf("INSERT INTO my_sample_table (column1, column2, column3) VALUES %s",
        strings.Join(valueStrings, ","))
    _, err := db.Exec(stmt, valueArgs...)
    return err
}

In a simple test I ran, this solution was about 4 times faster at inserting 10,000 rows than the Begin, Prepare, Commit approach presented in the other answer, though the actual improvement will depend a lot on your individual setup, network latencies, etc.
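To make the generated statement easy to inspect (and to keep very large inserts under driver placeholder limits, you would call a builder like this once per chunk), the string-building part of the pattern above can be pulled into a standalone helper. The names here are illustrative, not from the original answer:

```go
package main

import (
	"fmt"
	"strings"
)

// buildInsert returns the multi-row INSERT statement that the bulk-insert
// pattern constructs: one "(?, ?, ...)" group per row, joined with commas.
func buildInsert(table string, cols []string, n int) string {
	row := "(" + strings.Repeat("?, ", len(cols)-1) + "?)"
	rows := make([]string, n)
	for i := range rows {
		rows[i] = row
	}
	return fmt.Sprintf("INSERT INTO %s (%s) VALUES %s",
		table, strings.Join(cols, ", "), strings.Join(rows, ","))
}

func main() {
	// Prints: INSERT INTO t (a, b) VALUES (?, ?),(?, ?)
	fmt.Println(buildInsert("t", []string{"a", "b"}, 2))
}
```

Splitting the row slice into fixed-size chunks and calling db.Exec once per chunk keeps each statement under limits such as MySQL's max_allowed_packet.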

answered Oct 11 '22 by Andrew C


If you’re using PostgreSQL, then pq supports bulk imports.
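pq's bulk-import support wraps PostgreSQL's COPY FROM protocol: you prepare a statement returned by pq.CopyIn inside a transaction, Exec once per row, then Exec with no arguments to flush. A sketch assuming an open *sql.DB on the pq driver; the table and column names are placeholders, not from the source (requires github.com/lib/pq and a live database, so it is not runnable standalone):

```go
package main

import (
	"database/sql"

	"github.com/lib/pq"
)

// copyRows streams rows into a table via PostgreSQL's COPY protocol,
// which is typically much faster than multi-row INSERTs for large loads.
func copyRows(db *sql.DB, rows [][3]string) error {
	txn, err := db.Begin()
	if err != nil {
		return err
	}
	stmt, err := txn.Prepare(pq.CopyIn("my_sample_table", "column1", "column2", "column3"))
	if err != nil {
		txn.Rollback()
		return err
	}
	for _, r := range rows {
		if _, err := stmt.Exec(r[0], r[1], r[2]); err != nil {
			txn.Rollback()
			return err
		}
	}
	// A final Exec with no arguments flushes the buffered COPY data.
	if _, err := stmt.Exec(); err != nil {
		txn.Rollback()
		return err
	}
	if err := stmt.Close(); err != nil {
		txn.Rollback()
		return err
	}
	return txn.Commit()
}
```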

answered Oct 11 '22 by Avi Flax