A user can download query results in CSV format. The file is small (a few KB), but the contents are important.
The first approach is to stream the CSV directly to the response body using PHP's php://output stream:
// typical headers for a CSV attachment
$headers = [
    'Content-Type'        => 'text/csv',
    'Content-Disposition' => 'attachment; filename="file.csv"',
];

$callback = function () use ($result, $columns) {
    $file = fopen('php://output', 'w');
    fputcsv($file, $columns);
    foreach ($result as $res) {
        fputcsv($file, [$res->from_user, $res->to_user, $res->message, $res->date_added]);
    }
    fclose($file);
};

return response()->stream($callback, 200, $headers);
The second approach is to create a new folder in Laravel's storage system, set it to private, and download the file from there. You could even delete the file after the download:
// config/filesystems.php, under 'disks'
'csv' => [
    'driver' => 'local',
    'root' => storage_path('csv'),
    'visibility' => 'private',
],
Here is the create/download code:
$file = fopen(storage_path('csv/file.csv'), 'w'); // same root as the 'csv' disk
fputcsv($file, $columns);
foreach ($result as $res) {
    fputcsv($file, [$res->from_user, $res->to_user, $res->message, $res->date_added]);
}
fclose($file);

return response()->make(Storage::disk('csv')->get('file.csv'), 200, $headers);
Alternatively, this return will delete the file immediately after the download:
return response()->download(Storage::disk('csv')->path('file.csv'))
->deleteFileAfterSend(true);
What would be more secure? What is the better approach? I am currently leaning towards the second approach, using the storage disk.
Option 1

If you were dealing with files that were tens or hundreds of MB, I'd be thinking differently, but for a small file of a few KB there is no reason to write to disk at all.
Let's think about all the options.
Option 1 is a good solution because you are not storing the file, which makes it more secure than the others. But timeouts can be a problem under high traffic.
Option 2 is also a good solution as long as you delete the file afterwards, but you need to create files with unique names so that parallel downloads don't collide (see the sketch after this list).
Option 3 is like Option 2, but if you are using Laravel, don't use it (think about what happens when two people are downloading at the same time and the first completed download deletes the file).
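A minimal sketch of the unique-name idea, reusing the asker's 'csv' disk and $headers (the UUID filename scheme is my illustration, not code from the original post):

use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;

// One file per request, so two simultaneous downloads never share a name.
$filename = Str::uuid() . '.csv';

$file = fopen(storage_path('csv/' . $filename), 'w');
fputcsv($file, $columns);
foreach ($result as $res) {
    fputcsv($file, [$res->from_user, $res->to_user, $res->message, $res->date_added]);
}
fclose($file);

// Read the contents, remove the file, then send the response (Option 2 with delete).
$contents = Storage::disk('csv')->get($filename);
Storage::disk('csv')->delete($filename);

return response()->make($contents, 200, $headers);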
After this explanation: if you are running a single server, work on Option 1 to make it more secure; but if you are using microservices, work on Option 2.
I can suggest one more thing to make it secure: create a unique, hashed URL. For example, encrypt a timestamp with Laravel and verify it before serving the download, so people can't download the file again from their download history.
https://example.com/download?hash={crypt(timestamp+1min)}
If it is not downloaded within 1 minute, the URL expires.
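For what it's worth, Laravel's built-in signed URLs implement exactly this expiring-link pattern, so you don't have to hand-roll the hashing; a minimal sketch (the route name and parameters are my assumptions):

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\URL;

// Generate a link that stops validating after 1 minute.
$url = URL::temporarySignedRoute('csv.download', now()->addMinute(), ['user' => $userId]);

// routes/web.php -- reject expired or tampered links before sending the file.
Route::get('/download', function (Request $request) {
    abort_unless($request->hasValidSignature(), 403);
    // ...build and return the CSV response here...
})->name('csv.download');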
I think the answer depends on your current architecture and the size of the file to download.
The 1st approach is applicable when:
the files are small (less than 10 MB) -- thanks @tanerkay
you have a simple architecture (e.g. a single server)
Reasons:
no download failures -- no need to retry
it keeps things simple
no files = no backups, no rsync, and no additional places to steal them from
The 2nd approach is applicable when:
your files are big (10+ MB)
you already have a microservices architecture with multiple load balancers -- keep the similarity
you have millions of users trying to download -- you simply can't serve them without load balancing and parallel downloads
Reasons:
The second approach is definitely more scalable, and therefore more stable under heavy load, which also makes it more dependable. Microservices take more time to build, but they scale better under heavy load.
Using separate file storage allows you, in the future, to move to a dedicated file server with its own load balancer, queue manager, and dedicated access control.
If the content is important, it usually means that receiving it is very important to the user. But direct output with headers can hang, hit a timeout, and so on. Keeping the file around until it has actually been downloaded is a much more reliable way of delivering it, I think.
Still, I would consider an expiration time instead of, or in addition to, deleting on successful download -- the download process can fail, or the file can be lost (so ensure at least an hour of availability), or, vice versa, the user may only try to download it after a year, or never -- why keep the file for more than N days?
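A sketch of that N-day expiry using Laravel's scheduler (N = 7 and the daily cadence are my assumptions, not from the answer):

// app/Console/Kernel.php
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Support\Facades\Storage;

protected function schedule(Schedule $schedule): void
{
    $schedule->call(function () {
        $disk = Storage::disk('csv');
        // Remove any export that has been sitting around for more than 7 days.
        foreach ($disk->files() as $file) {
            if ($disk->lastModified($file) < now()->subDays(7)->getTimestamp()) {
                $disk->delete($file);
            }
        }
    })->daily();
}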