This is a snippet of the first part of my code:

package main

import (
    "encoding/csv"
    "fmt"
    "os"
)

func main() {
    file, err := os.Open("Account_balances.csv")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer file.Close()

    reader := csv.NewReader(file)
    records, err := reader.ReadAll() // records is a [][]string, one slice per CSV row
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    for i := range records { // equivalent to: for i := 0; i < len(records); i++
        fmt.Println(records[i])
    }
}
I want to write code that saves the CSV data to a database (e.g. SQL Server, SQLite, or PostgreSQL).
There are many ways to convert CSV data into a database table. One is to create a new table and copy all the data from the CSV file into it; however, copying and pasting data becomes extremely cumbersome and time-consuming when the dataset is large.
Importing a CSV file into SQL Server can be done within PopSQL by using either the BULK INSERT or the OPENROWSET(BULK...) command. BULK INSERT is used if you want to import the file as-is, without changing its structure or filtering data out of it.
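For illustration, here is a minimal sketch of running BULK INSERT from Go. It assumes the github.com/denisenkom/go-mssqldb driver, an existing Account_balances table whose columns match the CSV, and a file path that is visible to the SQL Server machine; the connection string is a placeholder:

package main

import (
    "database/sql"
    "fmt"

    _ "github.com/denisenkom/go-mssqldb" // SQL Server driver
)

func main() {
    // Placeholder connection string; adjust user, password, host, and database.
    db, err := sql.Open("sqlserver", "sqlserver://user:password@localhost:1433?database=mydb")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer db.Close()

    // BULK INSERT runs on the server, so the path must be visible
    // to the SQL Server instance, not to this Go program.
    _, err = db.Exec(`BULK INSERT Account_balances
        FROM 'C:\data\Account_balances.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)`)
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
}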
The Go MySQL driver supports loading from file:
See https://github.com/go-sql-driver/mysql#load-data-local-infile-support and https://godoc.org/github.com/go-sql-driver/mysql#RegisterLocalFile.
RegisterLocalFile adds the given file to the file whitelist, so that it can be used by "LOAD DATA LOCAL INFILE". Alternatively, you can allow the use of all local files with the DSN parameter 'allowAllFiles=true':
filePath := "/home/gopher/data.csv"
mysql.RegisterLocalFile(filePath)
_, err := db.Exec("LOAD DATA LOCAL INFILE '" + filePath + "' INTO TABLE foo")
if err != nil {
    // ...
}
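Putting that together, a minimal runnable sketch, assuming a local MySQL server and an existing table foo whose columns match the CSV (the DSN is a placeholder):

package main

import (
    "database/sql"
    "fmt"

    "github.com/go-sql-driver/mysql"
)

func main() {
    // Placeholder DSN; adjust user, password, and database.
    db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/test")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer db.Close()

    filePath := "/home/gopher/data.csv"
    mysql.RegisterLocalFile(filePath) // allow this specific file for LOAD DATA LOCAL INFILE

    _, err = db.Exec("LOAD DATA LOCAL INFILE '" + filePath +
        "' INTO TABLE foo FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
}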
Each DB engine has its own optimized way of importing CSVs. You should use those mechanisms instead of writing your own code for reading CSVs and mass-inserting records; a PostgreSQL sketch follows the references below.
Refs:
MySQL: https://dev.mysql.com/doc/refman/5.7/en/load-data.html
PgSQL: https://www.postgresql.org/docs/current/static/sql-copy.html
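As an example of the PostgreSQL route, here is a minimal sketch that streams the CSV rows through COPY ... FROM STDIN using the github.com/lib/pq driver. The table name account_balances, its columns (account, balance), the absence of a header row, and the connection string are all assumptions:

package main

import (
    "database/sql"
    "encoding/csv"
    "fmt"
    "os"

    "github.com/lib/pq"
)

func main() {
    // Placeholder connection string; adjust for your server.
    db, err := sql.Open("postgres", "postgres://user:password@localhost/mydb?sslmode=disable")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer db.Close()

    file, err := os.Open("Account_balances.csv")
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    defer file.Close()

    txn, err := db.Begin()
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    // pq.CopyIn builds a COPY ... FROM STDIN statement; the column
    // names here are assumptions about the target table.
    stmt, err := txn.Prepare(pq.CopyIn("account_balances", "account", "balance"))
    if err != nil {
        fmt.Println("Error:", err)
        return
    }

    reader := csv.NewReader(file)
    records, err := reader.ReadAll() // assumes the CSV has no header row
    if err != nil {
        fmt.Println("Error:", err)
        return
    }
    for _, rec := range records {
        if _, err := stmt.Exec(rec[0], rec[1]); err != nil {
            fmt.Println("Error:", err)
            return
        }
    }

    // A final Exec with no arguments flushes the COPY buffer.
    if _, err := stmt.Exec(); err != nil {
        fmt.Println("Error:", err)
        return
    }
    if err := stmt.Close(); err != nil {
        fmt.Println("Error:", err)
        return
    }
    if err := txn.Commit(); err != nil {
        fmt.Println("Error:", err)
        return
    }
}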