
How to list all files in an S3 folder using Fog in Ruby

How do I list all the files in a specific S3 "directory" using Fog?

I know that S3 doesn't store files in folders, but I need a way to limit the returned files to a specific "folder" instead of retrieving the entire list in the bucket.

Asked Apr 11 '13 by Gerry Shaw

People also ask

How do I see how many files are in an S3 bucket?

Open the AWS S3 console and click on your bucket's name. In the Objects tab, click the top row checkbox to select all files and folders or select the folders you want to count the files for. Click on the Actions button and select Calculate total size.
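With Fog you can also get a count programmatically: fetch the directory with a prefix and count its files, just as in the answer below. Since S3 "folders" are only shared key prefixes, the underlying idea can be shown with plain Ruby (the keys here are made up for illustration):

```ruby
# S3 has no real directories; counting the files in a "folder"
# means counting keys that start with that folder's prefix.
keys = [
  "photos/2023/a.jpg",
  "photos/2023/b.jpg",
  "photos/2024/c.jpg",
  "docs/readme.txt"
]

prefix = "photos/2023/"
count = keys.count { |k| k.start_with?(prefix) }
# count => 2
```

With Fog, the equivalent would be `connection.directories.get(bucket, prefix: path).files.count`, assuming valid AWS credentials.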

How can you download an S3 bucket including all folders and files?

aws s3 sync s3://mybucket . will download all the objects in mybucket to the current directory. This will download all of your files using a one-way sync. It will not delete any existing files in your current directory unless you specify --delete , and it won't change or delete any files on S3.

What is a "_$folder$" file in S3?

The "_$folder$" files are placeholders. Apache Hadoop creates these files when you use the -mkdir command to create a folder in an S3 bucket. Hadoop doesn't create the folder until you PUT the first object. If you delete the "_$folder$" files before you PUT at least one object, Hadoop can't create the folder.


1 Answer

Use the prefix option on the directories.get method. Example:

```ruby
def get_files(path, options)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: options[:key],
    aws_secret_access_key: options[:secret]
  )
  # Only objects whose keys start with `path` are returned.
  connection.directories.get(options[:bucket], prefix: path).files.map do |file|
    file.key
  end
end
```
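Note that `file.key` returns the full object key, prefix included. If you only want the trailing filename, strip the prefix off each key; a plain-Ruby sketch (the keys here are made up for illustration):

```ruby
# Keys come back like "photos/2023/a.jpg"; delete_prefix (Ruby 2.5+)
# trims the "folder" part, leaving just the filename.
keys = ["photos/2023/a.jpg", "photos/2023/b.jpg"]
prefix = "photos/2023/"
names = keys.map { |k| k.delete_prefix(prefix) }
# names => ["a.jpg", "b.jpg"]
```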
Answered Oct 28 '22 by Gerry Shaw