I am trying to integrate S3 file storage into my NodeJS application. This tutorial explaining how to upload directly to S3 is very good, but it's not suitable for my needs, as I want the files to only be accessible via my web app's API. I don't want the files to be publicly available at their S3 URLs; I just want them to be available through, for example, /api/files/<user_id>/<item_id>/<filename>.
The reason I want downloads to go through my API is so that I can check that the user is permitted to view this particular file.
The reason I want uploads to go through my server is so that I know which <item_id> to file the upload under, as this will be the same as the item's MongoDB _id property. I can't do this if I upload the file to S3 before the item has a Mongo _id in the first place.
I've looked but couldn't find a straightforward tutorial for how to stream files from S3 to the client and vice versa through my NodeJS application.
Thank you
A combination of an express middleware (to check the authorization of the user making the request) and the use of the Node AWS SDK should do the trick.
Here is a full example using multer for the upload.
var express = require('express');
var app = express();
var multer = require('multer');
var upload = multer({
    dest: "tmp/"
});
var fs = require('fs');
var AWS = require('aws-sdk');

// Configure AWS SDK here
var s3 = new AWS.S3({
    params: {
        Bucket: 'xxx'
    }
});

/**
 * Authentication middleware
 *
 * It will be called for any routes starting with /files
 */
app.use("/files", function (req, res, next) {
    var authorized = true; // use custom logic here
    if (!authorized) {
        return res.status(403).end("not authorized");
    }
    next();
});

// Route for the upload
app.post("/files/upload", upload.single("form-field-name"), function (req, res) {
    var fileInfo = req.file; // metadata added by multer: path, originalname, ...
    var fileStream = fs.createReadStream(fileInfo.path);
    var options = {
        Bucket: 'xxx',
        Key: 'yyy/' + fileInfo.originalname,
        Body: fileStream
    };
    s3.upload(options, function (err) {
        // Remove the temporary file created by multer
        fs.unlinkSync(fileInfo.path); // ideally use the async version
        if (err) {
            return res.status(500).end("Upload to s3 failed");
        }
        res.status(200).end("File uploaded");
    });
});

// Route for the download
app.get("/files/download/:name", function (req, res) {
    var fileName = req.params.name;
    if (!fileName) {
        return res.status(400).end("missing file name");
    }
    var options = {
        Bucket: 'xxx',
        Key: 'yyy/' + fileName
    };
    res.attachment(fileName); // sets Content-Disposition so the browser downloads the file
    s3.getObject(options).createReadStream().pipe(res);
});

app.listen(3000);
Obviously this is only partially tested and lacks proper error handling, but hopefully it gives you a rough idea of how to implement it.
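For reference, here is a minimal sketch of how you could exercise the upload route from another Node process. It assumes Node 18+ (for the global fetch, FormData and Blob) and that the server above is listening on localhost:3000; the file name and contents are placeholders:
// Test client for the upload route above (assumptions: Node 18+,
// server listening on localhost:3000, placeholder file contents).
var form = new FormData();
// the field name must match the one passed to upload.single()
form.append("form-field-name", new Blob(["hello world"]), "hello.txt");

fetch("http://localhost:3000/files/upload", { method: "POST", body: form })
    .then(function (res) { return res.text(); })
    .then(function (body) { console.log(body); }); // "File uploaded" on success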
You can use the S3 REST API. It allows you to make signed requests to GET or PUT your bucket's objects directly from your backend.
The principle is similar to the one described in your link: your backend uses the AWS JS SDK to create a signed URL for the object, and you are free to do any checks you want before or after requesting something from S3 in your Express routes.
Here is a simple GET example (it is not fully functional, just the main idea):
...
[assume that you are in an express route with req/res objects]
...
var aws = require('aws-sdk'),
    request = require('request'); // or any HTTP client that can stream responses

// configure the SDK before instantiating the S3 client
aws.config.region = 'your_region';
aws.config.credentials = {
    accessKeyId: 'your_key',
    secretAccessKey: 'your_secret'
};

var s3 = new aws.S3();

s3.getSignedUrl('getObject', {Bucket: 'your_bucket', Key: 'your_file_name', Expires: 3600}, function (error, url) {
    if (error || !url) {
        // error while creating the URL
        res.status(500).end();
    } else {
        // make a request to the signed URL to get the file and pipe the res to the client
        request({
            url: url
        }).pipe(res);
    }
});
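The same approach works for uploads. Here is a minimal sketch of the PUT counterpart, under the same assumptions (inside an Express route, with placeholder bucket and key names):
// PUT counterpart: create a signed upload URL and hand it to the caller,
// who can then PUT the file body directly to it. Values are placeholders.
s3.getSignedUrl('putObject', {Bucket: 'your_bucket', Key: 'your_file_name', Expires: 3600}, function (error, url) {
    if (error || !url) {
        // error while creating the URL
        return res.status(500).end();
    }
    res.json({uploadUrl: url});
});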
You will find more examples from Amazon here.