I'm using the AWS SDK for Go to send large backup files (2 GB to 10 GB).
When the process starts there is huge memory consumption. I know this is because the code reads the whole file into a buffer, but I'm really new to Go and I don't know how to change this.
This is the code I'm using to read the file and send to AWS S3 Uploader:
```go
file, err := os.Open(file_to_upload)
file_name := getFileName(file_to_upload)
if err != nil {
    fmt.Println(err)
    os.Exit(1)
}
defer file.Close()

fileInfo, _ := file.Stat()
var size int64 = fileInfo.Size()
buffer := make([]byte, size)
file.Read(buffer)
fileBytes := bytes.NewReader(buffer)
fileType := http.DetectContentType(buffer)

upparams := &s3manager.UploadInput{
    Bucket:      &bucket,
    Key:         &file_name,
    Body:        fileBytes,
    ACL:         aws.String("private"),
    ContentType: aws.String(fileType),
    Metadata: map[string]*string{
        "Key": aws.String("MetadataValue"), // required
    },
}

result, err := uploader.Upload(upparams, func(u *s3manager.Uploader) {
    if partsize != 0 {
        u.PartSize = int64(partsize) * 1024 * 1024
    }
    u.LeavePartsOnError = false
    u.Concurrency = parallel
})
```
What I've tested so far:

- Changed `u.Concurrency` from 5 to 3: CPU dropped from 26% to 21%; memory usage was the same.
- Changed `u.Concurrency` from 5 to 2: CPU dropped from 26% to 20%; memory usage was the same.
- Changed `u.Concurrency` from 5 to 3 and `u.PartSize` to 100 MB: CPU dropped from 26% to 16%; memory usage was the same.
Time isn't the problem here; memory consumption is. I want to tune this to use as few resources as possible. How can I approach that?
There's no reason to read the entire file into memory. Just provide the `Body` field with the file itself:
```go
upparams := &s3manager.UploadInput{
    Bucket:      &bucket,
    Key:         &file_name,
    // *os.File is an io.Reader
    Body:        file,
    ACL:         aws.String("private"),
    ContentType: aws.String(fileType),
    Metadata: map[string]*string{
        "Key": aws.String("MetadataValue"), // required
    },
}
```
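One detail to watch: `fileType` in your code comes from `http.DetectContentType(buffer)`, which no longer exists once you drop the full-file buffer. `http.DetectContentType` only looks at the first 512 bytes anyway, so a minimal sketch (assuming the same `file` handle and the `io` package imported) is to read just that prefix and seek back to the start before uploading:

```go
// Sketch: detect the content type from the first 512 bytes only,
// then rewind so the uploader streams the file from the beginning.
head := make([]byte, 512)
n, err := file.Read(head)
if err != nil && err != io.EOF {
    fmt.Println(err)
    os.Exit(1)
}
fileType := http.DetectContentType(head[:n])

// Rewind to the start of the file before handing it to the uploader.
if _, err := file.Seek(0, io.SeekStart); err != nil {
    fmt.Println(err)
    os.Exit(1)
}
```

With the file passed as `Body`, the uploader reads it part by part, so memory usage is governed by roughly `PartSize` times `Concurrency` rather than the size of the whole file.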