I'm trying to concatenate two videos with ffmpeg. Nothing fancy; I just want one video that consists of video A immediately followed by video B.
I've tried the command from "How to concatenate (join, merge) media files" on a freshly built, otherwise working install of ffmpeg 1.2.1 on Fedora 17, but the run fails with the error shown below:
$ ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4
ffmpeg version N-54271-g7f866c1 Copyright (c) 2000-2013 the FFmpeg developers
built on Jun 29 2013 11:05:42 with gcc 4.7.2 (GCC) 20120921 (Red Hat 4.7.2-2)
configuration: --enable-gpl --enable-nonfree --enable-pthreads --enable-libx264 --enable-libfaac --extra-cflags=-I/usr/local/include --extra-ldflags=-L/usr/local/lib
libavutil 52. 37.101 / 52. 37.101
libavcodec 55. 17.100 / 55. 17.100
libavformat 55. 10.100 / 55. 10.100
libavdevice 55. 2.100 / 55. 2.100
libavfilter 3. 77.101 / 3. 77.101
libswscale 2. 3.100 / 2. 3.100
libswresample 0. 17.102 / 0. 17.102
libpostproc 52. 3.100 / 52. 3.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'v1221-MTQxMzcyNTIxODU2.mov':
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2013-03-28 20:34:59
encoder : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
encoder-eng : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
Duration: 00:00:05.34, start: 0.000000, bitrate: 15837 kb/s
Stream #0:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
Metadata:
creation_time : 2013-03-28 20:34:59
handler_name : Core Media Data Handler
Stream #0:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15512 kb/s, 29.81 fps, 30 tbr, 600 tbn, 1200 tbc
Metadata:
creation_time : 2013-03-28 20:34:59
handler_name : Core Media Data Handler
Input #1, mov,mp4,m4a,3gp,3g2,mj2, from 'v1224-MTQxMzcyNTIxODg5.mov':
Metadata:
major_brand : qt
minor_version : 0
compatible_brands: qt
creation_time : 2013-03-28 20:36:28
encoder : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
encoder-eng : Mac OS X v10.8.3 (CMA 914, CM 926.87, x86_64)
Duration: 00:00:04.13, start: 0.000000, bitrate: 15689 kb/s
Stream #1:0(und): Audio: aac (mp4a / 0x6134706D), 44100 Hz, mono, fltp, 221 kb/s
Metadata:
creation_time : 2013-03-28 20:36:28
handler_name : Core Media Data Handler
Stream #1:1(und): Video: h264 (Main) (avc1 / 0x31637661), yuv420p, 1920x1080 [SAR 1:1 DAR 16:9], 15446 kb/s, 29.79 fps, 30 tbr, 600 tbn, 1200 tbc
Metadata:
creation_time : 2013-03-28 20:36:28
handler_name : Core Media Data Handler
Stream specifier ':0' in filtergraph description [0:0] [0:1] [1:0] [1:1] concat=n=2:v=1:a=1 [v] [a] matches no streams.
A few other things to note: I've tried several different orderings of the stream specifiers ahead of the concat filter in -filter_complex; even on ones that were cited as working, I get the "matches no streams" message. I'm also aware that the concat demuxer (see its documentation) can easily join files whose properties match, that is, the same height, width, pixel format, codecs, and so on, but I'd like to understand why the concat filter fails here.
run(new String[] { "ffmpeg", "-i", "'concat:" + ts1 + "|" + ts2 + "'", "-vcodec", "copy", "-acodec", "copy", "-absf", "aac_adtstoasc", output });
For each input, specify the video stream first and then the audio stream(s). Because your video stream is stream 1 for each of your inputs, and your audio stream is stream 0, that would be:
ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:1] [0:0] [1:1] [1:0]
concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4
Or better yet, the following command should work regardless of the order of the original streams, and will take the first audio stream if there is more than one:
ffmpeg -i video_a.mov -i video_b.mov -filter_complex '[0:v] [0:a:0] [1:v] [1:a:0]
concat=n=2:v=1:a=1 [v] [a]' -map '[v]' -map '[a]' output.mp4
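For completeness, since the question also mentions the concat demuxer: when the clips' parameters genuinely match (as they appear to here, both being 1920x1080 yuv420p H.264 with mono 44.1 kHz AAC), that route can join them without re-encoding. A minimal sketch, assuming the same two input files; the list-file name is arbitrary:

# list.txt simply names the clips in playback order
$ cat list.txt
file 'video_a.mov'
file 'video_b.mov'
# -f concat reads the list; -c copy remuxes the streams without re-encoding
$ ffmpeg -f concat -i list.txt -c copy output.mov

Stream copy only works because both clips share identical codec parameters; if they differed, the concat filter approach above, which re-encodes, would be the way to go.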