I have an SQL query that joins 2 tables:
SELECT locations.id AS id, locations.name AS name, states.id AS "state.id", states.name AS "state.name" FROM locations INNER JOIN states ON states.id = locations.state_id;
And I have a Location type and a State type defined like this:
type State struct {
	ID   uint   `db:"id"`
	Name string `db:"name"`
}

type Location struct {
	ID      uint   `db:"id"`
	Name    string `db:"name"`
	StateID uint   `db:"state_id"`
	State   *State `db:"state"`
}
I want to be able to scan the results of that query into a Location struct using pgx.
I've tried this but it does not work:
// dbConn is a *pgxpool.Pool and query is the string shown above
rows, err := dbConn.Query(context.Background(), query)
if err != nil {
	log.Fatal(err)
}
locations, err := pgx.CollectRows(rows, pgx.RowToStructByNameLax[Location])
if err != nil {
	log.Fatal(err)
}
If I run that code I get an error: struct doesn't have corresponding row field state.id
I've managed to get around that by manually mapping each row like this:
location := Location{}
state := State{}
if err := row.Scan(&location.ID, &location.Name, &state.ID, &state.Name); err != nil {
	log.Fatal(err)
}
location.State = &state
But I was hoping the state.* columns would be mapped automatically to the Location.State.* fields.
Can something like that be done using pgx, or is it only possible with a library like scany?
As of pgx/v5, the package still has some limitations when scanning into nested structs.
To work around those limitations, change your type declarations as follows:
type State struct {
	ID   uint   `db:"state.id"` // add the "state." prefix to match the column alias
	Name string `db:"state.name"`
}

type Location struct {
	ID      uint   `db:"id"`
	Name    string `db:"name"`
	StateID uint   `db:"state_id"`
	State   // embed State so its fields are visible to RowToStructByNameLax
}
As a side note: while this works well for your INNER JOIN, other join types that can return NULLs (e.g. a LEFT JOIN) may break scanning into non-nullable Go types like uint and string.
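One common way to handle that (a sketch, not the only option; pgtype.Text and friends work too) is to declare the prefixed fields as pointers so a NULL scans as nil:

```go
package main

import "fmt"

// NullableState uses pointer fields so NULLs from a LEFT JOIN scan as nil
// instead of failing against uint/string.
type NullableState struct {
	ID   *uint   `db:"state.id"`
	Name *string `db:"state.name"`
}

// describe shows how calling code must nil-check before dereferencing.
func describe(s NullableState) string {
	if s.ID == nil {
		return "no state"
	}
	return fmt.Sprintf("state %d: %s", *s.ID, *s.Name)
}

func main() {
	fmt.Println(describe(NullableState{})) // zero value models a NULL row half
	id, name := uint(1), "Alabama"
	fmt.Println(describe(NullableState{ID: &id, Name: &name}))
}
```

The trade-off is that every consumer of Location.State must nil-check the fields before use.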