I'm trying to turn the objects in a bucket into an organised list of folders and files.
In the documentation, S3 Listing Keys Hierarchically Using Prefix and Delimiter, Amazon states that when there are other directories in the currently selected directory:
Amazon S3 groups these keys and returns a single CommonPrefixes element
I have the following code:
$iterator = $client->getIterator(
    'ListObjects',
    array(
        'Bucket'    => $bucket,
        'Prefix'    => $dir,
        'Delimiter' => '/'
    )
);
which does a good job: it retrieves every object in the bucket (unlike a single low-level ListObjects call, which returns at most 1,000 objects) and sorts them alphabetically.
If I tell this iterator to return an array, like so:
$objects = $iterator->toArray();
I can then use that array in a foreach loop:
foreach ($objects as $object) {
echo $object['Key'] . "<br/>\n";
}
which gives me the keys of all the files. It doesn't show the other directories inside this directory, though.
Calling $iterator->get('CommonPrefixes') returns null, and any other operations on the $iterator variable crash the code. How can I get access to the full set of responses in the ListBucketResult?
The getIterator() method has a third parameter for options specific to the iterator object. The S3 ListObjects iterator is actually a concrete class, and its docs specify some custom options: return_prefixes, sort_results, and names_only (see http://docs.aws.amazon.com/aws-sdk-php/latest/class-Aws.S3.Iterator.ListObjectsIterator.html). Use return_prefixes to get the CommonPrefixes intermingled with the objects. Try this:
$iterator = $client->getIterator(
    'ListObjects',
    array(
        'Bucket'    => $bucket,
        'Prefix'    => $dir,
        'Delimiter' => '/'
    ),
    array(
        'return_prefixes' => true,
    )
);
foreach ($iterator as $object) {
    if (isset($object['Prefix'])) {
        // Common prefix (i.e. a "directory")
        echo $object['Prefix'] . "<br/>\n";
    } else {
        // Object (i.e. a "file")
        echo $object['Key'] . "<br/>\n";
    }
}
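If you want the organised folders-and-files list from the original question rather than a flat printout, you can partition the iterator's items into two arrays. This is a minimal sketch: splitPrefixesAndKeys() is a hypothetical helper name, and it assumes each item carries either a 'Prefix' entry (a common prefix) or a 'Key' entry (an object), exactly as in the loop above. The example data below stands in for the real S3 iterator.

```php
<?php
// Hypothetical helper: split iterator items into folders (common
// prefixes) and files (object keys). Works on any iterable of
// arrays shaped like the ListObjects results above.
function splitPrefixesAndKeys($items)
{
    $folders = array();
    $files   = array();
    foreach ($items as $item) {
        if (isset($item['Prefix'])) {
            $folders[] = $item['Prefix'];   // e.g. "photos/2013/"
        } else {
            $files[] = $item['Key'];        // e.g. "photos/readme.txt"
        }
    }
    return array($folders, $files);
}

// Static sample data standing in for the S3 iterator's output:
list($folders, $files) = splitPrefixesAndKeys(array(
    array('Prefix' => 'photos/2013/'),
    array('Key'    => 'photos/readme.txt'),
));
```

Because the helper only looks at the array shape, you can pass it the real $iterator directly in place of the sample array.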