I am running JMeter on AWS EC2, and the results are written to a CSV file.
I need to upload this CSV file to an AWS S3 bucket.
Since I create a number of EC2 instances dynamically and run JMeter on each of them, it is better to automate this process.
So I want to write a shell script (as user data) that runs JMeter and uploads the result CSV file to the S3 bucket from each EC2 instance.
How can I write a script for this?
You can transfer data between EC2 instances and Amazon S3 buckets by using the supported AWS CLI commands against the appropriate S3 endpoints.
(Recommended) Upload the file using the high-level aws s3 commands, such as aws s3 cp. Other aws s3 commands that upload objects into an S3 bucket (for example, aws s3 sync or aws s3 mv) also automatically perform a multipart upload when the object is large.
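As a minimal sketch of the commands above (the bucket name and paths are assumptions, not from the question):

```shell
#!/bin/bash
# Assumed bucket name; replace with your own.
BUCKET="my-jmeter-results"
DEST="s3://${BUCKET}/results/"

# Upload a single file; aws s3 cp performs a multipart upload
# automatically when the object is large.
aws s3 cp results.csv "$DEST"

# Upload a whole directory of result files.
aws s3 sync /tmp/results/ "$DEST"
```

Both commands require the AWS CLI to be installed and credentials available (for example, via an IAM instance role).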
Consider using command-line S3 clients:
- S3 command line tools
Also go through some of these sites:
- Shell Script To Transfer Files From Amazon S3 Bucket
- aws command line tools
- python script to upload file to s3
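Putting it together, a user-data script along these lines could run JMeter and upload the result CSV. This is a sketch under several assumptions: JMeter is installed at /opt/jmeter, the test plan is at /opt/tests/test.jmx, the AWS CLI is installed, the bucket name is my-jmeter-results, and the instance has an IAM role allowing s3:PutObject:

```shell
#!/bin/bash
# Hypothetical user-data script: run JMeter in non-GUI mode and
# upload the result CSV to S3. Paths and bucket name are assumptions.

BUCKET="my-jmeter-results"

# Use the instance ID to give each instance's result file a unique key.
INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
RESULT="/tmp/results-${INSTANCE_ID}.csv"

# -n: non-GUI mode, -t: test plan, -l: result log file (CSV by default).
/opt/jmeter/bin/jmeter -n -t /opt/tests/test.jmx -l "$RESULT"

# Upload the CSV; aws s3 cp handles multipart uploads automatically.
aws s3 cp "$RESULT" "s3://${BUCKET}/results/${INSTANCE_ID}.csv"
```

Naming the object after the instance ID keeps results from dynamically created instances from overwriting each other.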