I am using the fluent-ffmpeg library with Node.js to transcode videos from the Flash video (.flv) format to the MP4 format at multiple resolutions (1080p, etc.). Once the transcoding is complete, I would like to move the transcoded video to an S3 bucket.
I pull the original .flv file from a source S3 bucket and pass the stream to the ffmpeg constructor function. The issue is that after the transcoding completes, I don't know how to get a stream of the MP4 data to send to S3.
Here is the code I have so far:
var params = {
    Bucket: process.env.SOURCE_BUCKET,
    Key: fileName
};

s3.getObject(params, function (err, data) {
    if (err) console.log(err, err.stack); // an error occurred

    var format = ffmpeg(data)
        .size('854x480')
        .videoCodec('libx264')
        .format('flv')
        .toFormat('mp4')
        .on('end', function () {
            // Ideally, I would like to do the uploading here
            var params = {
                Body: // {This is my confusion, how do I get the stream to add here?}
                Bucket: process.env.TRANSCODED_BUCKET,
                Key: fileName
            };
            s3.putObject(params, function (err, data) {
            });
        })
        .on('error', function (err) {
            console.log('an error happened: ' + err.message);
        });
});
For the code above, where can I get the transcoded stream to add to the "Body" property of the params object?
Update:
Here is a revision of what I am trying to do:
var MemoryStream = require('memorystream');
var outputStream = new MemoryStream();

var proc = ffmpeg(currentStream)
    .size('1920x1080')
    .videoCodec('libx264')
    .format('avi')
    .toFormat('mp4')
    .output(outputStream)
    // setup event handlers
    .on('end', function () {
        uploadFile(outputStream, "").then(function () {
            resolve();
        });
    })
    .on('error', function (err) {
        console.log('an error happened: ' + err.message);
    });
I would like to avoid copying the file from S3 to the local filesystem; rather, I would prefer to process the file in memory and upload it back to S3 when finished. Would fluent-ffmpeg allow this scenario?
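For reference, the uploadFile call in the snippet above would be something like a thin promise wrapper around the AWS SDK v2 s3.upload method (which, unlike putObject, accepts a stream of unknown length as Body). A hypothetical sketch:

// Hypothetical helper, not shown in the snippet above:
// wraps s3.upload in a promise so the 'end' handler can chain .then()
function uploadFile(body, key) {
    return new Promise(function (resolve, reject) {
        s3.upload({
            Body: body,
            Bucket: process.env.TRANSCODED_BUCKET,
            Key: key
        }, function (err, data) {
            if (err) reject(err);
            else resolve(data);
        });
    });
}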
You don't seem to be saving the output of the transcoding anywhere. Save the transcoded video to a file on your local filesystem (e.g. an .mp4 file) using output(), then provide that new file's contents to putObject. According to the putObject documentation, the Body parameter accepts:

Body — (Buffer, Typed Array, Blob, String, ReadableStream) Object data.
Here's some revised sample code:
var fs = require('fs');
var path = require('path');

// Generate a filename for the transcoded `.mp4` version
var mp4FileName = fileName.substring(0, fileName.length - path.extname(fileName).length) + '.mp4';

// Perform transcoding, save the new video under the new file name
ffmpeg(data)
    .size('854x480')
    .videoCodec('libx264')
    .toFormat('mp4')
    .output(mp4FileName)
    .on('end', function () {
        // Provide a `ReadableStream` of the new video as `Body` for `putObject`
        var params = {
            Body: fs.createReadStream(mp4FileName),
            Bucket: process.env.TRANSCODED_BUCKET,
            Key: mp4FileName
        };
        s3.putObject(params, function (err, data) {
        });
    })
    .on('error', function (err) {
        console.log('an error happened: ' + err.message);
    })
    .run();
Note: you may be able to create an output stream from fluent-ffmpeg and upload that stream to AWS S3 if you prefer, but this will complicate the logic and error handling.
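For example, here is a minimal, untested sketch of that streaming approach. It assumes the AWS SDK v2 s3.upload method (which, unlike putObject, accepts a Body stream of unknown length), fluent-ffmpeg's pipe() method, and an inputStream obtained from S3 (e.g. via s3.getObject(params).createReadStream()). Note that MP4 is not normally streamable, so piped MP4 output needs fragmented-MP4 flags:

var stream = require('stream');

// PassThrough stream bridges ffmpeg's piped output into the S3 upload
var passThrough = new stream.PassThrough();

ffmpeg(inputStream)
    .size('854x480')
    .videoCodec('libx264')
    .toFormat('mp4')
    // MP4 normally requires a seekable output; fragmented MP4 can be piped
    .outputOptions('-movflags frag_keyframe+empty_moov')
    .on('error', function (err) {
        console.log('an error happened: ' + err.message);
    })
    .pipe(passThrough, { end: true });

// s3.upload can read from a stream whose final length is unknown
s3.upload({
    Body: passThrough,
    Bucket: process.env.TRANSCODED_BUCKET,
    Key: fileName
}, function (err, data) {
    if (err) console.log(err, err.stack);
});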