I'm building a React Native app for both Android and iOS; the back-end API is written in Node.js.
Users may upload video from their phones. Once uploaded, the user and their friends will be able to view the video, so the videos need to be stored in a format which is playable on both Android and iOS.
My question relates to the conversion of video uploaded by the user. I developed a similar app a couple of years ago; I used node-fluent-ffmpeg, which provides a nice API for interacting with FFmpeg.
In the previous project (which was a web app), I converted the uploaded videos into two files, one .mp4 and one .webm. If a user uploaded an .mp4 I would skip the .mp4 step, and likewise if they uploaded a .webm.
This was kind of slow. Now I've come across the same requirement years later, and after some research I think I was wrong to convert the videos in the last project.
I've read that I can simply use FFmpeg to change the container format of the videos, which is a much faster process than re-encoding them from scratch.
The video conversion code I used last time went something along the lines of:
var convertVideo = function (source, format, output, success, failure, progress) {
    var converter = ffmpeg(source);
    // WebM gets Vorbis audio; MP4 gets AAC.
    var audioCodec = "libvorbis";
    if (format.indexOf("mp4") !== -1) {
        audioCodec = "aac";
    }
    converter.format(format)
        .withVideoBitrate(1024)
        .withAudioCodec(audioCodec)
        .on('end', success)
        .on('progress', progress)
        .on('error', failure);
    converter.save(output);
};
Usage:
Convert to mp4:
convertVideo("PATH_TO_VIDEO", "mp4", "foo.mp4", () => {console.log("success");});
Convert to webm:
convertVideo("PATH_TO_VIDEO", "webm", "foo.webm", () => {console.log("success");});
Can anyone point out a code smell here regarding the performance of this operation? Is this code doing a lot more than it should to achieve cross-platform compatibility between iOS and Android?
It might be worth mentioning that support for older OS versions is not a big deal in this project.
What is the difference between codec and container/format?
You should understand the difference between a codec (e.g. H.264, VP9) and a container format (e.g. MP4, WebM). The container just stores the encoded video and audio streams. Usually, you can change between containers by stream-copying (ffmpeg -i input -c copy output), but for historical reasons you'll find that some containers don't accept some codecs, or that some players may not handle a codec within a container (e.g. only recent software will be able to read VP9 video in MP4). Have a look at this overview of container formats to see which codecs are supported.
What are the constraints imposed by different mobile OSes?
For targeting the iOS and Android platforms, you then need to check whether a given video file is compatible with the supported codecs/formats.
These may change over time, of course, but generally the common denominator is H.264 video with AAC audio in an MP4 container.
The specific constraints depend on the device, obviously, and on the installed operating system version. Not all of these specifics are mentioned in the iOS/Android documentation. You should definitely do a few trials and, if unsure, re-encode the video.
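As a sketch, such a check can run on ffprobe-style metadata (node-fluent-ffmpeg exposes this via ffmpeg.ffprobe). The whitelist below, H.264 video plus AAC audio in MP4, is an assumption taken from the common denominator above; tighten or extend it after your own trials.

```javascript
// Decide whether an upload needs re-encoding, given ffprobe-style
// metadata: { format: { format_name }, streams: [{ codec_type, codec_name }] }.
function needsReencode(probe) {
  const video = probe.streams.find((s) => s.codec_type === 'video');
  const audio = probe.streams.find((s) => s.codec_type === 'audio');
  // ffprobe reports MP4 files with format_name "mov,mp4,m4a,3gp,3g2,mj2".
  const containerOk = probe.format.format_name.split(',').includes('mp4');
  const videoOk = Boolean(video) && video.codec_name === 'h264';
  const audioOk = !audio || audio.codec_name === 'aac';
  return !(containerOk && videoOk && audioOk);
}
```

An H.264/AAC MP4 passes through untouched, while a VP9/Opus WebM is flagged for re-encoding.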
So, what codec/format should I encode with?
Apple has invested a lot in the MPEG ecosystem and traditionally has better support for H.264 and H.265 (HEVC); they don't support VP8 and VP9 in WebM. Thus, if you have a VP8/VP9 video, and you want it to be viewable cross-platform, re-encode it to H.264.
How should I do the actual encoding?
Make sure you use a high enough bitrate to not add further artifacts to already lossy video. You should not just do a one-pass target bitrate encode as you're doing now. Instead, do two-pass encoding to increase the quality and efficiency of the encode (although it takes longer). You can also use a constant quality mode if you do not care about a particular file size (such as CRF for libx264). Read the FFmpeg H.264 encoding guide for more info.
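For illustration, a constant-quality encode could be assembled like this; the CRF value, preset, and audio bitrate are assumptions to tune, and lower CRF means higher quality and larger files.

```javascript
// Build ffmpeg arguments for a CRF-based H.264/AAC encode, as an
// alternative to the one-pass target-bitrate approach in the question.
function buildCrfArgs(input, output, options) {
  const crf = (options && options.crf) || 23;             // 18-28 is a sane range for libx264
  const preset = (options && options.preset) || 'medium'; // slower presets compress better
  return [
    '-i', input,
    '-c:v', 'libx264', '-crf', String(crf), '-preset', preset,
    '-c:a', 'aac', '-b:a', '128k',
    output,
  ];
}
```

With node-fluent-ffmpeg, the same options can be passed through .outputOptions(['-crf', '23', '-preset', 'medium']).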
What about the future?
Note that almost all the big players in the tech industry, except Apple, have joined the Alliance for Open Media. They're developing the successor to VP9, called "AV1", which will gain support in all major browsers (Chrome, Firefox, Edge) and on Android.
H.265 / HEVC seems like a good choice too, but encoding with x265, for example, is currently still very slow compared to x264, the most popular open-source H.264 encoder.
WebM and MP4 are only containers, not the actual codecs being used. Typically WebM will contain VP8 or VP9, and MP4 will most of the time contain H.264. You can change containers with a stream copy (ffmpeg option -vcodec copy), but you can't change codecs without re-encoding the entire stream. When you run a copy command, you also cannot resize the video or change the bitrate. Copy is exactly what it sounds like: you are copying the underlying frames exactly and wrapping them in a different container.
I will question why you want to encode to WebM at all. Both iOS and Android will play (almost all) MP4 videos without issue. If you want to enforce a certain format, you could inspect the incoming video and enforce certain standards (e.g. H.264 baseline video and AAC audio, no larger than 1080p). If an incoming stream does not conform to this standard, encode the video, specifying codec, bitrate, and size. If it does fit within your standards, just run -vcodec copy -acodec copy. With your specific library that would be .audioCodec('copy').videoCodec('copy').
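Sketched with node-fluent-ffmpeg, that copy-or-encode decision might look like the following. The probe shape follows ffprobe's JSON output, the 1080p/H.264/AAC standards mirror the example above, and the target bitrate is an assumption.

```javascript
// True when the probed upload already fits the suggested standards:
// H.264 video, AAC (or no) audio, height no larger than 1080.
function withinStandards(probe) {
  const video = probe.streams.find((s) => s.codec_type === 'video');
  const audio = probe.streams.find((s) => s.codec_type === 'audio');
  return Boolean(
    video &&
    video.codec_name === 'h264' &&
    video.height <= 1080 &&
    (!audio || audio.codec_name === 'aac')
  );
}

function processUpload(probe, source, output) {
  // Lazy require keeps the decision logic usable without the dependency installed.
  const ffmpeg = require('fluent-ffmpeg');
  const command = ffmpeg(source);
  if (withinStandards(probe)) {
    command.videoCodec('copy').audioCodec('copy'); // conforming input: just rewrap
  } else {
    command.videoCodec('libx264').audioCodec('aac').size('?x1080').videoBitrate('2000k');
  }
  return command.save(output);
}
```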