I want to store a struct that has a JSON field within it into my database:
type Comp struct {
	CompId         int64           `db:"comp_id" json:"comp_id"`
	StartDate      time.Time       `db:"start_date" json:"start_date"`
	EndDate        time.Time       `db:"end_date" json:"end_date"`
	WeeklySchedule json.RawMessage `db:"weekly_schedule" json:"weekly_schedule"`
}
The schema for the table is:
CREATE TABLE IF NOT EXISTS Tr.Comp(
comp_id SERIAL,
start_date timestamp NOT NULL,
end_date timestamp NOT NULL,
weekly_schedule json NOT NULL,
PRIMARY KEY (comp_id)
);
I am using sqlx and the lib/pq driver in my project, and the following will not execute; instead it panics, saying there is a nil pointer. DB is a global *sqlx.DB:
tx := DB.MustBegin()
compFixture := Comp{
	StartDate:      time.Now(),
	EndDate:        time.Now().AddDate(1, 0, 0),
	WeeklySchedule: json.RawMessage([]byte("{}")),
}
_, err = tx.NamedExec(
	`INSERT INTO Tr.Comp (comp_id, start_date, end_date, weekly_schedule)
	 VALUES (DEFAULT, :start_date, :end_date, :weekly_schedule)
	 RETURNING comp_id;`, compFixture)
if err != nil {
	t.Fatal("Error creating fixture.", err)
}
When I remove weekly_schedule from the schema and the fixture, things run fine. But for some reason, when this field is included, the program panics. Any idea how I should define the weekly_schedule field in both my DB schema and my Go struct?
sqlx has a type JSONText in github.com/jmoiron/sqlx/types that will do what you need. See the doc for JSONText.
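For example, here is a minimal sketch (the connection string is a placeholder, and it assumes the Tr.Comp table from the question) showing the struct field switched to types.JSONText and the same named insert:

package main

import (
	"log"
	"time"

	"github.com/jmoiron/sqlx"
	"github.com/jmoiron/sqlx/types"
	_ "github.com/lib/pq"
)

// Comp uses types.JSONText for the json column; JSONText implements
// driver.Valuer and sql.Scanner, so sqlx and lib/pq can write and read it.
type Comp struct {
	CompId         int64          `db:"comp_id" json:"comp_id"`
	StartDate      time.Time      `db:"start_date" json:"start_date"`
	EndDate        time.Time      `db:"end_date" json:"end_date"`
	WeeklySchedule types.JSONText `db:"weekly_schedule" json:"weekly_schedule"`
}

func main() {
	// Placeholder DSN; adjust for your environment.
	db := sqlx.MustConnect("postgres", "dbname=mydb sslmode=disable")

	fixture := Comp{
		StartDate:      time.Now(),
		EndDate:        time.Now().AddDate(1, 0, 0),
		WeeklySchedule: types.JSONText(`{}`),
	}

	_, err := db.NamedExec(
		`INSERT INTO Tr.Comp (comp_id, start_date, end_date, weekly_schedule)
		 VALUES (DEFAULT, :start_date, :end_date, :weekly_schedule)`, fixture)
	if err != nil {
		log.Fatal("insert failed: ", err)
	}
}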
I don't know how clean of a solution this is, but I ended up making my own data type, JSONRaw. The DB driver sees it as a []byte, but it can still be treated like a json.RawMessage in the Go code.
import (
	"database/sql/driver"
	"encoding/json"
	"errors"
)

type JSONRaw json.RawMessage

// Value implements driver.Valuer so the driver can write the raw JSON bytes.
func (j JSONRaw) Value() (driver.Value, error) {
	byteArr := []byte(j)
	return driver.Value(byteArr), nil
}

// Scan implements sql.Scanner so a json column can be read back into JSONRaw.
func (j *JSONRaw) Scan(src interface{}) error {
	asBytes, ok := src.([]byte)
	if !ok {
		return errors.New("Scan source was not []byte")
	}
	err := json.Unmarshal(asBytes, j)
	if err != nil {
		return errors.New("Scan could not unmarshal to JSONRaw")
	}
	return nil
}

// MarshalJSON returns *m as the JSON encoding of m.
func (m *JSONRaw) MarshalJSON() ([]byte, error) {
	return *m, nil
}

// UnmarshalJSON sets *m to a copy of data.
func (m *JSONRaw) UnmarshalJSON(data []byte) error {
	if m == nil {
		return errors.New("json.RawMessage: UnmarshalJSON on nil pointer")
	}
	*m = append((*m)[0:0], data...)
	return nil
}
MarshalJSON and UnmarshalJSON are a copy/paste reimplementation of the corresponding methods from the encoding/json library.
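With that type in place, the field in the question's struct just switches from json.RawMessage to JSONRaw, and the original NamedExec call works unchanged. A short sketch, reusing the fixture from the question:

type Comp struct {
	CompId         int64     `db:"comp_id" json:"comp_id"`
	StartDate      time.Time `db:"start_date" json:"start_date"`
	EndDate        time.Time `db:"end_date" json:"end_date"`
	WeeklySchedule JSONRaw   `db:"weekly_schedule" json:"weekly_schedule"`
}

// The fixture now satisfies driver.Valuer, so lib/pq can serialize
// weekly_schedule instead of hitting the nil-pointer panic.
compFixture := Comp{
	StartDate:      time.Now(),
	EndDate:        time.Now().AddDate(1, 0, 0),
	WeeklySchedule: JSONRaw(`{}`),
}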