I have a lot of images with different sizes (e.g. 1024x768 and 900x942) and a 30-second audio file (audio.mp3), and I need to create a video from them.
I'm trying it now with: result%d.png (1 to 4) and audio.mp3
ffmpeg -y -i result%d.png -i audio.mp3 -r 30 -b 2500k -vframes 900 -acodec libvo_aacenc -ab 160k video.mp4
The video video.mp4 is 30 seconds long, but the first 3 images are shown very quickly, while the last image stays on screen until the end of the audio.
Each image needs to be shown for an equal amount of time until the end of the audio. Does anyone know how to do this?
The number of images will vary.
FFmpeg version: 3.2.1-1
Ubuntu 16.04.1
Imagine you have an MP3 audio file named wow.mp3. In that case, the following command will print the duration of the MP3 in seconds:
ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 wow.mp3
Once you have the duration in seconds (say I got 11.36 seconds), and since I have 3 images, I want to show each image for 11.36/3 ≈ 3.79 seconds. Then use the following (a script that automates this for any number of images is sketched after the option list below):
ffmpeg -y -framerate 1/3.79 -start_number 1 -i ./swissGenevaLake_%d.jpg -i ./wow.mp3 -c:v libx264 -r 25 -pix_fmt yuv420p -c:a aac -strict experimental -shortest output.mp4
Here the images are ./swissGenevaLake_1.jpg, ./swissGenevaLake_2.jpg, and ./swissGenevaLake_3.jpg.
-framerate 1/3.79 means each image is shown for 3.79 seconds.
-start_number 1 means start with image number one, i.e. ./swissGenevaLake_1.jpg
-c:v libx264: video codec H.264
-r 25: output video framerate 25
-pix_fmt yuv420p: output video pixel format.
-c:a aac: encode the audio using aac
-shortest: end the video as soon as the audio is done.
output.mp4: output file name
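Since the number of images varies, a small shell sketch like the one below can compute the per-image duration automatically and then run the same command. The file names image_1.jpg ... image_N.jpg and audio.mp3 are assumptions; adjust them to your own files.
#!/bin/bash
# Sketch: show every numbered image for an equal share of the audio's length.
# Assumes images named image_1.jpg ... image_N.jpg and an audio file audio.mp3.
AUDIO=audio.mp3
COUNT=$(ls image_*.jpg | wc -l)   # how many images we have
DURATION=$(ffprobe -v error -show_entries format=duration -of default=noprint_wrappers=1:nokey=1 "$AUDIO")   # audio length in seconds
PER_IMAGE=$(echo "$DURATION / $COUNT" | bc -l)   # seconds each image should stay on screen
ffmpeg -y -framerate "1/$PER_IMAGE" -start_number 1 -i image_%d.jpg -i "$AUDIO" -c:v libx264 -r 25 -pix_fmt yuv420p -c:a aac -shortest output.mp4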
Disclaimer: I have not tested merging images of multiple sizes.
References:
https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
https://trac.ffmpeg.org/wiki/Encode/AAC
http://trac.ffmpeg.org/wiki/FFprobeTips
For creating a video from n images + audio:
Step 1)
Create a video from the images:
Process proc = Runtime.getRuntime().exec(ffmpeg + " -y -r " + duration + " -i " + imagePath + " -c:v libx264 -r 15 -pix_fmt yuv420p -vf fps=90 " + imageVideoPath);
// Drain FFmpeg's stderr so the process cannot block on a full pipe buffer
InputStream stderr = proc.getErrorStream();
InputStreamReader isr = new InputStreamReader(stderr);
BufferedReader br = new BufferedReader(isr);
String line = null;
while ((line = br.readLine()) != null)
{
    //System.out.println(line);
}
// Wait for FFmpeg to finish before moving on to Step 2
int exitVal = proc.waitFor();
proc.destroy();
Where duration = number of images / duration of the audio in seconds, i.e. how many images you want shown per second.
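For example, 10 images over a 30-second audio track gives duration = 10/30 ≈ 0.33, i.e. roughly one image every 3 seconds.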
Step 2)
Merge the image-only video with the audio:
Process proc4VideoAudio = Runtime.getRuntime().exec(ffmpeg + " -i " + imageVideoPath + " -i " + audioPath + " -map 0:0 -map 1:0 " + videoPath);
// Drain stderr again so this process cannot block either
InputStream stderr1 = proc4VideoAudio.getErrorStream();
InputStreamReader isr1 = new InputStreamReader(stderr1);
BufferedReader br1 = new BufferedReader(isr1);
String line1 = null;
while ((line1 = br1.readLine()) != null)
{
    //System.out.println(line1);
}
// Wait for the mux to finish, then clean up
int exitVal1 = proc4VideoAudio.waitFor();
proc4VideoAudio.destroy();
Both Step 1 and Step 2 can now be run in sequence. If you only want to launch the commands, the Runtime.getRuntime().exec(..) call is enough; the code below it drains FFmpeg's output and waits for the process, which keeps the two steps synchronized.
** Also note that creating the video from images and audio with a single FFmpeg command gives you the same problem you described, or else the solution is static and only works for a fixed number of images for a given audio file.
Here imagePath, imageVideoPath, videoPath, and audioPath are all Strings (file paths).
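Run manually, the two steps correspond roughly to the following commands (a sketch; the file names image_%d.jpg, images.mp4, audio.mp3, and video.mp4 are placeholders, and 0.33 is just an example rate of one image every 3 seconds):
ffmpeg -y -r 0.33 -i image_%d.jpg -c:v libx264 -r 15 -pix_fmt yuv420p -vf fps=90 images.mp4
ffmpeg -i images.mp4 -i audio.mp3 -map 0:0 -map 1:0 video.mp4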