I'm trying to make a simple tool which parses JSON-formatted lines in a file and performs an INSERT
operation into a database.
I have a struct which looks like this:
type DataBlob struct {
    ....
    Datetime time.Time `json:"datetime, string"`
    ....
}
And parsing code which looks like this:
scanner := bufio.NewScanner(file)
// Loop through all lines in the file
for scanner.Scan() {
    var t DataBlob
    // Decode the line, parse the JSON
    dec := json.NewDecoder(strings.NewReader(scanner.Text()))
    if err := dec.Decode(&t); err != nil {
        panic(err)
    }
    // Perform the database operation
    executionString := "INSERT INTO observations (datetime) VALUES ($1)"
    _, err := db.Exec(executionString, t.Datetime)
    if err != nil {
        panic(err)
    }
}
My JSON file has lines, each containing a datetime value that looks like this:
{ "datetime": 1465793854 }
When the datetime is formatted as a Unix timestamp, the unmarshaller complains:
panic: parsing time "1465793854" as ""2006-01-02T15:04:05Z07:00"": cannot parse "1465793854" as """
In the script that generates the JSON (also written in Go), I tried simply printing the String representation of the time.Time type, producing the following:
{ "datetime": "2016-06-13 00:23:34 -0400 EDT" }
To which the unmarshaller complains when I go to parse it:
panic: parsing time ""2016-06-13 00:23:34 -0400 EDT"" as ""2006-01-02T15:04:05Z07:00"": cannot parse " 00:23:34 -0400 EDT"" as "T"
If I instead treat this timestamp (which looks pretty standard) as a string and avoid the unmarshalling problem, Postgres complains when I try to perform the insertion:
panic: pq: invalid input syntax for type timestamp: "2016-06-13 00:23:34 -0400 EDT"
This is frustrating on a number of levels, but mainly because if I serialize a time.Time type, I would think it should still be understood at the other side of the process.
How can I go about parsing this timestamp to perform the database insertion? Apologies for the lengthy question and thanks for your help!
We can parse a time string using the time.Parse function, which takes the string and the layout describing its format; if the string matches the layout, it returns a Go time.Time value.
Format returns a textual representation of the time value formatted according to layout, which defines the format by showing how the reference time, defined to be Mon Jan 2 15:04:05 -0700 MST 2006 would be displayed if it were the value; it serves as an example of the desired output.
As of ISO 8601-1:2019, the basic format is T[hh][mm][ss] and the extended format is T[hh]:[mm]:[ss]. Earlier versions omitted the T (representing time) in both formats. [hh] refers to a zero-padded hour between 00 and 24. [mm] refers to a zero-padded minute between 00 and 59.
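As an illustration, the string from the question's second attempt can be parsed by writing a layout that shows how the reference time would look in that format (a minimal sketch; this layout string is an assumption matching that output, not part of the original code):
parsed, err := time.Parse("2006-01-02 15:04:05 -0700 MST", "2016-06-13 00:23:34 -0400 EDT")
if err != nil {
    panic(err)
}
fmt.Println(parsed.Format(time.RFC3339)) // 2016-06-13T00:23:34-04:00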
JSON unmarshalling of time.Time expects the date string to be in RFC 3339 format.
So in your Go program that generates the JSON, instead of simply printing the time.Time value, use Format to print it in RFC 3339 format:
t.Format(time.RFC3339)
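For instance, if the generator writes each line by hand, it could look like this (a sketch; the fmt.Fprintf call and fileWriter are illustrative, not taken from the original script):
fmt.Fprintf(fileWriter, "{ \"datetime\": %q }\n", t.Format(time.RFC3339))
// emits e.g. { "datetime": "2016-06-13T00:23:34-04:00" }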
if I serialize a Time.time type, I would think it should still be understood at the other side of the process
If you serialize with the encoding/json marshaller instead of printing the value yourself, it outputs the date in RFC 3339 format, so the other side of the process will understand it. You can do that as well:
d := DataBlob{Datetime: t}
enc := json.NewEncoder(fileWriter)
enc.Encode(d)
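As a side note, Encoder.Encode writes the JSON value followed by a newline, which fits the one-object-per-line format that the scanner loop in the question expects.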
For reference, if you need custom unmarshalling for time types, you can create your own type with an UnmarshalJSON method. Example with a Timestamp type:
type Timestamp struct {
    time.Time
}

// UnmarshalJSON decodes an int64 timestamp into a time.Time object
func (p *Timestamp) UnmarshalJSON(bytes []byte) error {
    // 1. Decode the bytes into an int64
    var raw int64
    err := json.Unmarshal(bytes, &raw)
    if err != nil {
        fmt.Printf("error decoding timestamp: %s\n", err)
        return err
    }
    // 2. Convert the Unix timestamp (seconds since the epoch) to a time.Time
    p.Time = time.Unix(raw, 0)
    return nil
}
Then use the type in your struct:
type DataBlob struct {
    ....
    Datetime Timestamp `json:"datetime"`
    ....
}