
Equivalent of "du" command on a Amazon S3 bucket

I'm looking for a way to recursively get the size of every folder in an Amazon S3 bucket that contains many nested folders.

The perfect example is the Linux du --si command:

12M ./folder1
50M ./folder2
50M ./folder2/subfolder1
etc...

I'm also open to any graphical tool. Is there any command or AWS API for that?

asked Mar 21 '17 by Sylvain



2 Answers

Use the AWS CLI:

aws s3 ls s3://bucket --recursive --human-readable --summarize

This lists every object with its size and prints an object count and total size at the end.
answered Sep 25 '22 by franklinsijo


s3cmd du -H s3://bucket-name

This prints the total size of the bucket in human-readable form. If you want the sizes of subfolders, list the folders in the bucket (s3cmd ls s3://bucket-name) and iterate over them.
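The iteration described above can be scripted. A rough sketch, assuming s3cmd marks folder entries with a DIR column in its ls output and using the placeholder bucket-name:

```shell
#!/bin/sh
# Placeholder bucket; replace with your own.
BUCKET=s3://bucket-name

# "s3cmd ls" prints folder entries as "DIR s3://bucket/prefix/";
# take the last field of each DIR line and run "s3cmd du" on it.
s3cmd ls "$BUCKET" \
  | awk '$1 == "DIR" { print $NF }' \
  | while read -r prefix; do
      s3cmd du -H "$prefix"
    done
```

Note this only reports one level of folders; for deeper nesting you would recurse into each prefix the same way.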

answered Sep 24 '22 by Bruno Gonzalez