I'm writing a web server in Go.
On one of the pages, the user can upload a file, and I would like to be able to handle zip files.
In the archive/zip package, I only see two functions that let me read from a zip archive:
func OpenReader(name string) (*ReadCloser, error)
func NewReader(r io.ReaderAt, size int64) (*Reader, error)
I would like to avoid writing the file to disk and reading it back, but if I want to use the second function, I need to know the size of the uploaded file before calling it.
Question
I will split my question into two parts:
What would be the idiomatic way to read the unzipped content of a zip file uploaded through a standard multipart/form-data HTML form?
How can I get the actual size of a file uploaded through an HTML form?
func(req *http.Request) {
	f, h, err := req.FormFile("fileTag")
	if err != nil {
		panic(err)
	}
	var fileSize int64 = ?? // zip.NewReader wants an int64; how do I get it?
	unzipper, err := zip.NewReader(f, fileSize)
}
You can look for the file size in the FormFile's Header (which is a textproto.MIMEHeader):
h.Header.Get("Content-Length")
If there is no Content-Length for the file, you can read it into a buffer first to get the size:
var buff bytes.Buffer
fileSize, err := buff.ReadFrom(f)
Other options are to seek to the end, as you put in your answer, or to get the concrete reader out of the interface. A multipart File will be an *io.SectionReader if it's in memory, or an *os.File if it was written to a temp file:

switch f := f.(type) {
case *io.SectionReader:
	fileSize = f.Size()
case *os.File:
	if s, err := f.Stat(); err == nil {
		fileSize = s.Size()
	}
}
Here is a way I found to get the size:

func(req *http.Request) {
	f, h, err := req.FormFile("fileTag")
	if err != nil {
		panic(err)
	}
	fileSize, err := f.Seek(0, io.SeekEnd) // seek to the end to get the size
	if err != nil {
		panic(err)
	}
	_, err = f.Seek(0, io.SeekStart) // rewind before reading
	if err != nil {
		panic(err)
	}
	unzipper, err := zip.NewReader(f, fileSize)
}
I don't find this solution very elegant or idiomatic.
Isn't there a cleaner way to handle this case?