I'm looking for a way to recursively get the size of all the folders in an Amazon S3 bucket that has many nested folders.
The perfect example is the Linux du --si command:
12M ./folder1
50M ./folder2
50M ./folder2/subfolder1
etc...
I'm also open to any graphical tool. Is there any command or AWS API for that?
Use the AWS CLI:
aws s3 ls s3://bucket --recursive --human-readable --summarize
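The --summarize flag only prints one grand total for the whole bucket, so to get a per-folder breakdown similar to du you can post-process the recursive listing. A minimal sketch, assuming the default date / time / size / key column layout of aws s3 ls --recursive (bucket is a placeholder):

aws s3 ls s3://bucket --recursive \
  | awk '{ split($4, path, "/"); sizes[path[1]] += $3 }      # sum the size column per top-level folder
         END { for (f in sizes) printf "%.1fM\t%s\n", sizes[f]/1024/1024, f }'

This prints an approximate megabyte total for each top-level folder, roughly matching the du output shown above; objects stored at the bucket root will show up under their own file name.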
s3cmd du -H s3://bucket-name
This command tells you the size of the bucket in human-readable form. If you want to know the sizes of subfolders, list the folders in the bucket (s3cmd ls s3://bucket-name) and then iterate over them, as sketched below.
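A minimal sketch of that iteration, assuming s3cmd ls marks each folder with a DIR column followed by its s3:// URL (bucket-name is a placeholder):

for folder in $(s3cmd ls s3://bucket-name | awk '/DIR/ { print $2 }'); do
  s3cmd du -H "$folder"    # human-readable total for each top-level folder
done

Point s3cmd ls at one of the folder URLs instead of the bucket root to descend another level.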