I am trying to get all the files from an Amazon S3 bucket's subfolder and make them downloadable on a web page.
I have a bucket called images. Inside that bucket I have some other folders. Now I am trying to get all the files inside one of those subfolders and show them on a page. S3 bucket structure:

/images
/images/test1/
/images/test2/
/images/test1/1
/images/test1/2
/images/test1/1/item
I tried the following, but couldn't get the expected result.
// Target file's full path: images/test1/1/item
$bucketName = 'images';
$source = '/test1/1/item';
$image = $this->s3->getBucket($bucketName);
foreach ($image as $key => $data) {
    $k = $data['name'];
    print_r($k); // this gives me the full list of everything inside the images bucket
}
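If this is the widely used standalone S3.php class (its getBucket() returns the name/size/time/hash array seen above), getBucket() also accepts a key prefix as its second argument, which restricts the listing to one subfolder. A minimal sketch under that assumption:

// Sketch: list only objects under test1/1/, assuming the standalone S3.php
// class where getBucket($bucket, $prefix) filters keys by prefix
$objects = $this->s3->getBucket('images', 'test1/1/');
foreach ($objects as $key => $data) {
    print_r($data['name']); // only keys starting with test1/1/
}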
Now if I want to get objects from the bucket, I can use getObject, which I tried like this:
$result = $this->s3->getObject($image); // I am confused about this one
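If it is the same S3.php-style class, getObject() expects the bucket name and an object key rather than the listing array, and the file contents come back in the response's body property; a minimal sketch under that assumption:

// Sketch: fetch a single object by bucket + key (S3.php-style signature assumed)
$response = $this->s3->getObject('images', 'test1/1/item');
echo $response->body; // raw file contents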
Any suggestions will be appreciated. Thanks.
After using the answer below from Josue Ibarra, I got something like this:
Array
(
    [tableuploads/1/emaillists/15.10.2013-18.03.23-emailList.csv] => Array
        (
            [file_name] => 15.10.2013-18.03.23-emailList.csv
            [file_folder] => emaillists
            [file_size] => 64
            [created_on] => 2013-10-15 19:03:26
            [s3_link] => http://s3.amazonaws.com/webtools_sharing/emaillists/15.10.2013-18.03.23-emailList.csv
            [md5_hash] => 4809ae0b75d3517b69f69b53ba0b2959
        )

    [tableuploads/1/emaillists/15.10.2013-18.04.32-emailList.csv] => Array
        (
            [file_name] => 15.10.2013-18.04.32-emailList.csv
            [file_folder] => emaillists
            [file_size] => 64
            [created_on] => 2013-10-15 19:04:45
            [s3_link] => http://s3.amazonaws.com/webtools_sharing/emaillists/15.10.2013-18.04.32-emailList.csv
            [md5_hash] => 14094e133779619ddfcfc008d16ce75b
        )

    [tableuploads/2/emaillists/15.10.2013-18.03.23-emailList.csv] => Array
        (
            [file_name] => 15.10.2013-18.03.23-emailList.csv
            [file_folder] => emaillists
            [file_size] => 64
            [created_on] => 2013-10-15 19:03:26
            [s3_link] => http://s3.amazonaws.com/webtools_sharing/emaillists/15.10.2013-18.03.23-emailList.csv
            [md5_hash] => 4809ae0b75d3517b69f69b53ba0b2959
        )

    [tableuploads/2/emaillists/15.10.2013-18.04.32-emailList.csv] => Array
        (
            [file_name] => 15.10.2013-18.04.32-emailList.csv
            [file_folder] => emaillists
            [file_size] => 64
            [created_on] => 2013-10-15 19:04:45
            [s3_link] => http://s3.amazonaws.com/webtools_sharing/emaillists/15.10.2013-18.04.32-emailList.csv
            [md5_hash] => 14094e133779619ddfcfc008d16ce75b
        )
)
I want something like this:

http://s3.amazonaws.com/webtools_sharing/tableuploads/1/emaillists/15.10.2013-18.03.23-emailList.csv
http://s3.amazonaws.com/webtools_sharing/tableuploads/2/emaillists/15.10.2013-18.03.23-emailList.csv

These need to be downloadable links. Please point me to a relevant link or give me some ideas on how I can do it.
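For what it's worth, since the array keys in the output above are the full S3 keys, links of that form can be built from the keys directly; a minimal sketch, assuming the array above is in $list:

// Sketch: build the full-path URL from each array key (the full S3 key);
// $list is assumed to hold the array shown above
foreach ($list as $key => $row) {
    echo "http://s3.amazonaws.com/webtools_sharing/{$key}\n";
}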
I made this function a while back:
public function list_s3_bucket($bucket_name)
{
    // initialize the data array
    $data = array();
    $bucket_content = $this->s3->getBucket($bucket_name);
    foreach ($bucket_content as $key => $value) {
        // ignore s3 "folders" (keys ending with a slash)
        if (preg_match("/\/$/", $key)) continue;
        // explode the path into an array
        $file_path = explode('/', $key);
        $file_name = end($file_path);
        // end() left the array pointer on the last element,
        // so prev() returns the immediate parent folder
        $file_folder = prev($file_path);
        $s3_url = "https://s3.amazonaws.com/{$bucket_name}/{$key}";
        $data[$key] = array(
            'file_name'   => $file_name,
            's3_key'      => $key,
            'file_folder' => $file_folder,
            'file_size'   => $value['size'],
            'created_on'  => date('Y-m-d H:i:s', $value['time']),
            's3_link'     => $s3_url,
            'md5_hash'    => $value['hash']);
    }
    return $data;
}
It returns an array whose keys are the full S3 keys (paths), so you can do:
$list = $this->your_model->list_s3_bucket($bucket_name);
foreach ($list as $key => $row) {
    // getObject() takes the bucket and the key; the file contents are in ->body
    force_download($this->s3->getObject($bucket_name, $row['s3_key'])->body, $row['file_name']);
    // or you can use this url:
    print($row['s3_link']);
}
function force_download($data, $file_name)
{
    // send the raw bytes with headers that force a download prompt
    header("Content-type: application/octet-stream");
    header("Content-Disposition: attachment; filename=\"{$file_name}\"");
    echo $data;
}
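One caveat: force_download() sends headers, so it can only serve one file per request. A minimal sketch of a per-file download endpoint, assuming the same S3.php-style wrapper and a hypothetical key query parameter:

// Sketch of a per-file download endpoint; ?key= is a hypothetical parameter
$key = $_GET['key']; // assumed input; validate it before using in real code
$object = $this->s3->getObject($bucket_name, $key); // S3.php-style: (bucket, key)
force_download($object->body, basename($key));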