
Golang upload Http request FormFile to Amazon S3

I'm creating a microservice to handle attachment uploads to Amazon S3. What I'm trying to achieve is to accept a file and store it directly in my S3 bucket. My current function:

func upload_handler(w http.ResponseWriter, r *http.Request) {
    file, header, err := r.FormFile("attachment")
    if err != nil {
        fmt.Fprintln(w, err)
        return
    }
    defer file.Close()

    fileSize, err := file.Seek(0, 2) // 2 = from end
    if err != nil {
        panic(err)
    }
    fmt.Println("File size : ", fileSize)

    bytes := make([]byte, fileSize)
    // read into buffer
    buffer := bufio.NewReader(file)
    _, err = buffer.Read(bytes)

    auth := aws.Auth{
        AccessKey: "XXXXXXXXXXX",
        SecretKey: "SECRET_KEY_HERE",
    }
    client := s3.New(auth, aws.EUWest)
    bucket := client.Bucket("attachments")

    err = bucket.Put(header.Filename, bytes, header.Header.Get("Content-Type"), s3.ACL("public-read"))
    if err != nil {
        fmt.Println(err)
        os.Exit(1)
    }
}

The problem is that the files stored in S3 are all corrupted. After a quick check, it seems the file payload is not being read correctly as bytes.

How do I convert the file to bytes and store it correctly in S3?

Hamza asked Mar 15 '23

1 Answer

Use ioutil.ReadAll:

bs, err := ioutil.ReadAll(file)
// ...
err = bucket.Put(
    header.Filename, 
    bs, 
    header.Header.Get("Content-Type"), 
    s3.ACL("public-read"),
)

Read is a lower-level function which has subtle behavior:

Read reads data into p. It returns the number of bytes read into p. It calls Read at most once on the underlying Reader, hence n may be less than len(p). At EOF, the count will be zero and err will be io.EOF.

So what was probably happening was some subset of the file data was being written to S3 along with a bunch of 0s.

ioutil.ReadAll works by calling Read repeatedly, filling a dynamically growing buffer, until it reaches the end of the file (so there's no need for the bufio.Reader either).

Also, the Put function will have issues with large files (using ReadAll means the entire file must fit in memory), so you may want to use PutReader instead:

// rewind first: the earlier file.Seek(0, 2) left the offset at EOF
if _, err := file.Seek(0, 0); err != nil {
    panic(err)
}
err = bucket.PutReader(
    header.Filename,
    file,
    fileSize,
    header.Header.Get("Content-Type"),
    s3.ACL("public-read"),
)
Caleb answered Mar 23 '23