
AWS CLI - is there a way to extract tar.gz from S3 to home without storing the tar.gz?

To elaborate,

There is a tar.gz file on my AWS S3, let's call it example.tar.gz.

So, what I want to do is download the extracted contents of example.tar.gz to /var/home/.

One way to do it is to simply download the tar.gz, extract it, then delete the tar.gz.

However, I don't want to use disk space for the tar.gz file itself; I only want to store the extracted contents.

Is this possible?

Thanks!

asked Sep 15 '17 by Karl Young

People also ask

How do I transfer files from S3 to local using AWS CLI?

You can use cp to copy files from an S3 bucket to your local system. Use the following command: $ aws s3 cp s3://bucket/folder/file.txt .
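If you need more than a single object, the same cp command also accepts --recursive; a minimal sketch, where the bucket and prefix names below are placeholders:

aws s3 cp s3://bucket/folder/ ./local-folder/ --recursive    # copy every object under the prefix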

How do I unzip a file on AWS S3?

If you head to the Properties tab of your S3 bucket, you can set up an Event Notification for all object “create” events (or just PutObject events). As the destination, you can select the Lambda function where you will write your code to unzip and gzip files.
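The same wiring can also be done from the AWS CLI rather than the console; a rough sketch, assuming a hypothetical bucket example-bucket and an already-deployed Lambda function named unzip-fn (the account ID and region in the ARN are placeholders):

# allow S3 to invoke the function
aws lambda add-permission \
  --function-name unzip-fn \
  --statement-id s3-invoke \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::example-bucket

# subscribe the function to object-created events on the bucket
aws s3api put-bucket-notification-configuration \
  --bucket example-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:unzip-fn",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'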


1 Answer

What you need is the following:

aws s3 cp s3://example-bucket/file.tar.gz - | tar -xz

This streams file.tar.gz from S3 and pipes it straight into tar, which extracts it into the current directory. The archive itself is never written to disk, so there are no temporary files, no extra storage used, and nothing to clean up after this one command.
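If you want to check what the archive contains before extracting anything, the same stream can be fed to tar's list mode instead (bucket and file names as in the example above):

aws s3 cp s3://example-bucket/file.tar.gz - | tar -tz    # list archive contents without extracting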

Make sure you keep the trailing - (dash) after the S3 path: it tells aws s3 cp to write the object to standard output so it can be piped into tar.
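To extract into a specific directory such as the /var/home/ mentioned in the question, rather than the current directory, tar's -C option can be added (the bucket and file names here are the question's placeholders):

aws s3 cp s3://example-bucket/example.tar.gz - | tar -xz -C /var/home/    # extract directly into /var/home/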

answered Sep 23 '22 by mostafazh