I have to transcode videos from WebM to MP4 when they're uploaded to Firebase Storage. I have a code demo here that works, but if the uploaded video is too large, Cloud Functions will time out on me before the conversion is finished. I know it's possible to increase the timeout limit for the function, but that seems messy, since I can never confirm the process will take less time than the timeout limit.
Is there some way to stop Firebase from timing out without just increasing the maximum timeout limit?
If not, is there a way to complete time-consuming processes (like video conversion) while still having each one kicked off by a Cloud Functions trigger?
If completing time-consuming work in Cloud Functions isn't really something that exists, is there some way to speed up fluent-ffmpeg's conversion without sacrificing much quality? (I realize this part is a lot to ask. I plan on lowering the quality if I absolutely have to, since the whole reason the WebMs are being converted to MP4 is playback on iOS devices. I've sketched the sort of encoder tuning I mean after the code below.)
For reference, here's the main portion of the demo I mentioned. As I said before, the full code can be seen here, but the section copied below is the part that creates the Promise that makes sure the transcoding finishes. The full code is only seventy-something lines, so it should be relatively easy to read through if needed.
const functions = require('firebase-functions');
const mkdirp = require('mkdirp-promise');
const gcs = require('@google-cloud/storage')();
const Promise = require('bluebird');
const ffmpeg = require('fluent-ffmpeg');
const ffmpeg_static = require('ffmpeg-static');
(There's a bunch of text parsing code here, followed by this next chunk of code inside an onChange event)
function promisifyCommand (command) {
  return new Promise((resolve, reject) => {
    command
      .on('end', () => resolve())             // transcoding finished cleanly
      .on('error', (error) => reject(error))  // surface ffmpeg errors as a rejection
      .run();
  });
}
return mkdirp(tempLocalDir).then(() => {
  console.log('Directory created');
  // Download the uploaded webm from the bucket into local temp space
  const bucket = gcs.bucket(object.bucket);
  return bucket.file(filePath).download({destination: tempLocalFile}).then(() => {
    console.log('File downloaded to convert. Location:', tempLocalFile);
    const cmd = ffmpeg({source: tempLocalFile})
      .setFfmpegPath(ffmpeg_static.path)
      .inputFormat(fileExtension)
      .output(tempLocalMP4File);
    return promisifyCommand(cmd).then(() => {
      // Getting here takes forever, because video transcoding takes forever!
      console.log('mp4 created at', tempLocalMP4File);
      return bucket.upload(tempLocalMP4File, {destination: MP4FilePath}).then(() => {
        console.log('mp4 uploaded at', MP4FilePath);
      });
    });
  });
});
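(For question 3: my understanding is that the main speed knobs for an H.264 encode are the libx264 preset and the CRF value, set through fluent-ffmpeg's outputOptions. Something like the snippet below is what I had in mind; the specific values are guesses on my part, not settings I've validated.)
const cmd = ffmpeg({source: tempLocalFile})
  .setFfmpegPath(ffmpeg_static.path)
  .inputFormat(fileExtension)
  .videoCodec('libx264')     // H.264 so iOS can play it
  .audioCodec('aac')
  .outputOptions([
    '-preset veryfast',      // trade some compression efficiency for a lot of speed
    '-crf 23',               // quality target; higher numbers mean smaller, lower-quality output
    '-movflags +faststart'   // lets playback start before the whole file downloads
  ])
  .output(tempLocalMP4File);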
From the Cloud Functions documentation on time limits: in Cloud Functions (2nd gen), the maximum timeout duration is 60 minutes (3600 seconds) for HTTP functions and 9 minutes (540 seconds) for event-driven functions. By default, the memory allocated for a function is 256 MB or 256 MiB, depending on the Cloud Functions product version.
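For reference, raising those limits on a 1st gen function is done with the firebase-functions runWith() option. A sketch only: the demo's onChange trigger is from an older SDK, and onFinalize is its closest 1st gen equivalent; 540 seconds is the 1st gen maximum for event-driven functions, so a long transcode can still time out.
const functions = require('firebase-functions');

exports.transcodeVideo = functions
  .runWith({
    timeoutSeconds: 540,  // 1st gen event-driven maximum
    memory: '2GB'         // higher memory tiers also come with faster CPUs, which helps ffmpeg
  })
  .storage.object()
  .onFinalize((object) => {
    // ...same download / transcode / upload flow as the demo...
  });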
Cloud Functions for Firebase is not well suited (and not supported) for long-running tasks that can go beyond the maximum timeout. Your only real chance at using only Cloud Functions to perform very heavy compute operations is to find a way to split up the work into multiple function invocations, then join the results of all that work into a final product. For something like video transcoding, that sounds like a very difficult task.
Instead, consider using a function to trigger a long-running task in App Engine or Compute Engine.
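One common shape for that hand-off is to have the function publish a Pub/Sub message that a worker on App Engine or Compute Engine subscribes to. A minimal sketch, assuming the @google-cloud/pubsub client; the topic name and message shape here are made up for illustration:
const functions = require('firebase-functions');
const {PubSub} = require('@google-cloud/pubsub');

const pubsub = new PubSub();

// The function only enqueues work, so it returns in well under a second;
// the worker that subscribes to the topic can take as long as it needs.
exports.enqueueTranscode = functions.storage.object().onFinalize((object) => {
  if (!object.name.endsWith('.webm')) {
    return null; // only hand off webm uploads
  }
  // 'transcode-requests' is a hypothetical topic name for this sketch.
  return pubsub.topic('transcode-requests').publishMessage({
    json: {bucket: object.bucket, name: object.name}
  });
});
The worker then downloads the file, runs the same fluent-ffmpeg pipeline with no function deadline hanging over it, and uploads the resulting MP4 back to the bucket.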