I am using Go as my backend language and PostgreSQL as my database. I use the "github.com/lib/pq" driver to connect to the database. I have a struct with a lot of fields, and it contains some JSON.
My struct looks like this:
type ApplyLeave1 struct {
    LeaveId         int
    EmpId           string
    SupervisorEmpId string
    LeaveDays       float64
    MDays           float64
    LeaveType       string
    DayType         string
    LeaveFrom       time.Time
    LeaveTo         time.Time
    AppliedDate     time.Time
    LeaveStatus     string
    ResultDate      time.Time
    Certificate     []*CertificateInfo
}

type CertificateInfo struct {
    Id           int64
    FileName     string
    FileType     string
    FileLocation string
}
The struct is big and contains a JSON array, and my database schema mirrors the struct. As far as I have researched, the only way to insert is to write a query and pass every field of my struct one by one, for example like this:
var leave ApplyLeave1
db.Exec("INSERT INTO TABLENAME(leaveid, empid, supervisorid, ....) VALUES($1, $2, $3, $4, ....)", leave.LeaveId, leave.EmpId, leave.SupervisorEmpId, .....)
This is very long to write because my struct is big. Is it possible to insert the whole struct at once? Please also point me to how to insert a JSON array into the database. I appreciate any help. Thanks.
Updated information:
CREATE TABLE IF NOT EXISTS apply_leave1
(
    leaveid serial PRIMARY KEY NOT NULL,
    empid varchar(10) NOT NULL REFERENCES employee(empid),
    leavedays double precision NOT NULL DEFAULT 0,
    mdays double precision NOT NULL DEFAULT 0,
    leavetype varchar(20) NOT NULL DEFAULT '',
    daytype text NOT NULL DEFAULT '',
    leavefrom timestamp with time zone NOT NULL,
    leaveto timestamp with time zone NOT NULL,
    applieddate timestamp with time zone NOT NULL,
    leavestatus varchar(15) NOT NULL DEFAULT '',
    resultdate timestamp with time zone,
    certificatestatus bool NOT NULL DEFAULT FALSE,
    certificate json[]
);
Inside certificate (json[]) I will store all the fields of the CertificateInfo struct shown above. Here I declared certificate as json[], but I don't know which is better for retrieval, json or json[]. I will be doing the following operations, and I don't know whether they are possible with PostgreSQL.
Sorry for being so elaborate. Since I am new to using JSON in Postgres, I have lots of doubts, so please bear with my questions. Thanks, I appreciate any help.
You can use the https://github.com/jmoiron/sqlx library. It has a NamedExec function to which you can pass your struct directly, but first you need to define a db tag for each of your struct fields.
Simplified Example:
package main

import (
    "log"

    "github.com/jmoiron/sqlx"
    _ "github.com/lib/pq"
)

type ApplyLeave1 struct {
    LeaveId         int    `db:"leaveid"`
    EmpId           string `db:"empid"`
    SupervisorEmpId string `db:"supervisorid"`
}

func main() {
    // sqlx.Connect opens the connection and pings it.
    db, err := sqlx.Connect("postgres", "user=foo dbname=bar sslmode=disable")
    if err != nil {
        log.Fatalln(err)
    }

    // The :name placeholders are filled from the db tags on ApplyLeave1.
    query := `INSERT INTO TABLENAME(leaveid, empid, supervisorid)
              VALUES(:leaveid, :empid, :supervisorid)`

    var leave1 ApplyLeave1
    if _, err = db.NamedExec(query, leave1); err != nil {
        log.Fatalln(err)
    }
}
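The example above doesn't cover the Certificate field. A minimal sketch, assuming you store the certificates in a single json (or jsonb) column rather than json[]: give the slice its own type and implement driver.Valuer so the JSON-marshalled value is bound like any other parameter. CertificateList and the json tags are illustrative names, not anything lib/pq requires, and these declarations can live in the same package as the example above:

package main

import (
    "database/sql/driver"
    "encoding/json"
)

// CertificateInfo mirrors the certificate fields from the question.
type CertificateInfo struct {
    Id           int64  `json:"id"`
    FileName     string `json:"file_name"`
    FileType     string `json:"file_type"`
    FileLocation string `json:"file_location"`
}

// CertificateList is a hypothetical wrapper type: implementing driver.Valuer
// lets database/sql (and therefore sqlx's NamedExec) bind it as a parameter.
type CertificateList []*CertificateInfo

// Value marshals the slice to JSON so it can be stored in a json/jsonb column.
func (c CertificateList) Value() (driver.Value, error) {
    b, err := json.Marshal(c)
    if err != nil {
        return nil, err
    }
    return b, nil
}

With that in place you could tag the field `db:"certificate"` and add certificate/:certificate to the INSERT above. For reading rows back you would likewise implement sql.Scanner with json.Unmarshal.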
Another approach is to use the github.com/knq/xo package, which relies on code generation (via go generate).
This package comes with a tool which takes the schema of your target database and creates a set of Go source files containing types and code that map to the DB schema (and back).
Compared to github.com/jmoiron/sqlx, the pros and cons are as follows:
- sqlx is slower because it relies on run-time reflection over the types of the values you submit to it. xo generates static code, which is as fast as you can theoretically get and can make some difference under heavy workloads.
- xo requires a separate generation step, while sqlx doesn't.
- sqlx allows you to also use other tags on your types, so you may have, say, tags for json serialization as well. I'm not sure whether xo tries to preserve your changes done to what it generates.
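If you go the xo route, the generation step is usually hooked up through a go:generate directive in one of your source files. The snippet below is only an assumption for illustration (the package name, connection URL scheme, and -o output flag may differ between xo versions), so check the xo README for the exact syntax:

// Package leave is a hypothetical package that owns the generated models.
package leave

// Illustrative only: the connection URL and flags vary by xo version.
//go:generate xo pgsql://user:password@localhost/yourdb -o models

Running go generate ./... would then regenerate the mapping code whenever the database schema changes.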