
FFMPEG Can't Display The Duration Of a Video

I'm trying to use ffmpeg to capture frames from a video file, but I can't even get the duration of the video. Every time I try to access it with pFormatCtx->duration I get 0. I know the pointer is initialized and contains the correct duration, because if I call av_dump_format(pFormatCtx, 0, videoName, 0); I actually get the duration along with other information about the video. This is what I get when I use av_dump_format(pFormatCtx, 0, videoName, 0);:

Input #0, avi, from 'futurama.avi':
Duration: 00:21:36.28, start: 0.000000, bitrate: 1135 kb/s
Stream #0.0: Video: mpeg4 (Advanced Simple Profile), yuv420p, 512x384
[PAR 1:1 DAR 4:3], 25 tbr, 25 tbn, 25 tbc
Stream #0.1: Audio: ac3, 48000 Hz, stereo, s16, 192 kb/s 

I don't understand why av_dump_format can display the duration but I can't. I checked the function definition; to display the duration, the function also reads pFormatCtx->duration. And it's not just the duration: other member variables also don't hold the proper data when I access them in main.cpp.

Here's my code:

#include <iostream>

extern "C" {
    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>
}

using std::cout;
using std::endl;

int main(int argc, char *argv[]) {
    AVFormatContext *pFormatCtx = NULL;

    const char videoName[] = "futurama.avi";

    // Register all formats and codecs.
    av_register_all();
    cout << "Opening the video file";
    // Open video file
    int ret = avformat_open_input(&pFormatCtx, videoName, NULL, NULL);
    if (ret != 0) {
        cout << "Couldn't open the video file." << ret ;
        return -1;
    }
    if(avformat_find_stream_info(pFormatCtx, 0) < 0) {
        cout << "problem with stream info";
        return -1;
    }

    av_dump_format(pFormatCtx, 0, videoName, 0);
    cout << pFormatCtx->bit_rate << endl; // different value each time, not initialized properly.
    cout << pFormatCtx->duration << endl; // 0
    return 0;
}

I don't know if it helps but, I use QtCreator on Ubuntu and linked the libraries statically.

asked Sep 24 '12 by Malkavian

3 Answers

The duration property is in time_base units, not milliseconds or seconds. The conversion to milliseconds is straightforward:

double time_base =  (double)video_stream->time_base.num / (double)video_stream->time_base.den;
double duration = (double)video_stream->duration * time_base * 1000.0;

The duration is now in milliseconds; take the floor or ceiling to get a whole number of milliseconds, whichever you prefer.

answered Sep 25 '22 by Tim Goss


The difference between av_open_input_file() and avformat_open_input() is probably that the latter does not read the stream information, hence the duration is not initialized. Calling avformat_find_stream_info() fixed the issue for me.

I took the code snippet that calculates and displays the duration from http://ffmpeg.org/doxygen/trunk/dump_8c_source.html#l00480 (note that the line number can, and probably will, change in newer versions) and added some initialisation code. It works for me; hope it helps.

#include <libavutil/avutil.h>
#include <libavformat/avformat.h>

int main()
{
    const char* file = "sample.mpg";
    AVFormatContext* formatContext = NULL;

    av_register_all();

    // Open video file and read stream information
    if (avformat_open_input(&formatContext, file, NULL, NULL) != 0)
        return 1;
    if (avformat_find_stream_info(formatContext, NULL) < 0)
        return 1;

    // Lower log level since av_log() prints at AV_LOG_ERROR by default
    av_log_set_level(AV_LOG_INFO);

    av_log(NULL, AV_LOG_INFO, "  Duration: ");
    if (formatContext->duration != AV_NOPTS_VALUE) {
        int hours, mins, secs, us;
        int64_t duration = formatContext->duration + 5000;
        secs  = duration / AV_TIME_BASE;
        us    = duration % AV_TIME_BASE;
        mins  = secs / 60;   
        secs %= 60;          
        hours = mins / 60;   
        mins %= 60;
        av_log(NULL, AV_LOG_INFO, "%02d:%02d:%02d.%02d\n", hours, mins, secs, (100 * us) / AV_TIME_BASE);
    } 

    return 0;
}

To compile,

gcc -o duration duration.c -lavformat -lavutil
answered Sep 22 '22 by Baris Demiray


How to get duration information (and more) from ffmpeg

I messed around with ffmpeg a while ago and found the learning curve to be pretty steep. So even though the OP asked this question months ago, I'll post some code in case others here on SO are looking to do something similar. The Open() function below is complete, but it has many asserts and lacks proper error handling.

Right off, one immediate difference I see is that I used av_open_input_file instead of avformat_open_input. I also didn't use av_dump_format.

Calculating the duration can be tricky, especially with H.264 and MPEG-2; see how durationSec is calculated below.

Note: This example also uses the JUCE C++ Utility Library.

Note2: This code is a modified version of the ffmpeg tutorial.

void VideoCanvas::Open(const char* videoFileName)
{       
    Logger::writeToLog(String(L"Opening video file ") + videoFileName);
    Close();

    AVCodec *pCodec;

    // register all formats and codecs
    av_register_all();  

    // open video file
    int ret = av_open_input_file(&pFormatCtx, videoFileName, NULL, 0, NULL);
    if (ret != 0) {
        Logger::writeToLog("Unable to open video file: " + String(videoFileName));
        Close();
        return;
    }

    // Retrieve stream information
    ret = av_find_stream_info(pFormatCtx);
    jassert(ret >= 0);

    // Find the first video stream
    videoStream = -1;
    audioStream = -1;
    for(int i=0; i<pFormatCtx->nb_streams; i++) {
        if (pFormatCtx->streams[i]->codec->codec_type==AVMEDIA_TYPE_VIDEO && videoStream < 0) {
            videoStream = i;            
        }
        if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO && audioStream < 0) {
            audioStream = i;
        }
    } // end for i
    jassert(videoStream != -1);
    jassert(audioStream != -1);

    // Get a pointer to the codec context for the video stream
    pCodecCtx=pFormatCtx->streams[videoStream]->codec;
    jassert(pCodecCtx != nullptr);

    /**
      * This is the fundamental unit of time (in seconds) in terms
      * of which frame timestamps are represented. For fixed-fps content,
      * timebase should be 1/framerate and timestamp increments should be
      * identically 1.
      * - encoding: MUST be set by user.
      * - decoding: Set by libavcodec.
      */
    AVRational avr = pCodecCtx->time_base;
    Logger::writeToLog("time_base = " + String(avr.num) + "/" + String(avr.den));

    /**
     * For some codecs, the time base is closer to the field rate than the frame rate.
     * Most notably, H.264 and MPEG-2 specify time_base as half of frame duration
     * if no telecine is used ...
     *
     * Set to time_base ticks per frame. Default 1, e.g., H.264/MPEG-2 set it to 2.
     */
    ticksPerFrame = pCodecCtx->ticks_per_frame;
    Logger::writeToLog("ticks_per_frame = " + String(pCodecCtx->ticks_per_frame));

    durationSec = static_cast<double>(pFormatCtx->streams[videoStream]->duration) * static_cast<double>(ticksPerFrame) / static_cast<double>(avr.den);
    double fH = durationSec / 3600.;
    int     H = static_cast<int>(fH);
    double fM = (fH - H) * 60.;
    int     M = static_cast<int>(fM);
    double fS = (fM - M) * 60.;
    int     S = static_cast<int>(fS);

    Logger::writeToLog("Video stream duration = " + String(H) + "H " + String(M) + "M " + String(fS, 3) + "S");

    // calculate frame rate based on time_base and ticks_per_frame
    frameRate = static_cast<double>(avr.den) / static_cast<double>(avr.num * pCodecCtx->ticks_per_frame);
    Logger::writeToLog("Frame rate = " + String(frameRate) );

    // audio codec context
    if (audioStream != -1) {
        aCodecCtx = pFormatCtx->streams[audioStream]->codec;

        Logger::writeToLog("Audio sample rate = " + String(aCodecCtx->sample_rate));
        Logger::writeToLog("Audio channels    = " + String(aCodecCtx->channels));       
    }
    jassert(aCodecCtx != nullptr);

    // format:
    // The "S" in "S16SYS" stands for "signed", the 16 says that each sample is 16 bits long, 
    // and "SYS" means that the endian-order will depend on the system you are on. This is the
    // format that avcodec_decode_audio2 will give us the audio in.

    // open the audio codec
    if (audioStream != -1) {
        aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
        if (!aCodec) {
            Logger::writeToLog(L"Unsupported codec ID = " + String(aCodecCtx->codec_id) );
            Close();
            return;  // TODO: should we just play video if audio codec doesn't work?
        }
        avcodec_open(aCodecCtx, aCodec);
    }


    // Find the decoder for the video stream
    pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
    if(pCodec == nullptr) {
        jassert(false);
        // fprintf(stderr, "Unsupported codec!\n");
        //return -1; // Codec not found
    }

    // Open video codec
    ret = avcodec_open(pCodecCtx, pCodec);
    jassert(ret >= 0);

    // Allocate video frame
    pFrame=avcodec_alloc_frame();
    jassert(pFrame != nullptr);

    // Allocate an AVFrame structure
    pFrameRGB=avcodec_alloc_frame();
    jassert(pFrameRGB != nullptr);

    int numBytes = avpicture_get_size(PIX_FMT_RGB32, pCodecCtx->width, pCodecCtx->height);
    jassert(numBytes != 0);
    buffer=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
    jassert(buffer != nullptr);

    // note: the pixel format here is RGB, but sws_getContext() needs to be PIX_FMT_BGR24 to match (BGR)
    // this might have to do w/ endian-ness....make sure this is platform independent
    if (m_image != nullptr) delete m_image;
    m_image = new Image(Image::ARGB, pCodecCtx->width, pCodecCtx->height, true);

    int dstW = pCodecCtx->width; // don't rescale
    int dstH = pCodecCtx->height;
    Logger::writeToLog(L"Video width = " + String(dstW));
    Logger::writeToLog(L"Video height = " + String(dstH));

    // this should only have to be done once
    img_convert_ctx = sws_getContext(pCodecCtx->width, pCodecCtx->height, pCodecCtx->pix_fmt, dstW, dstH, PIX_FMT_RGB32, SWS_FAST_BILINEAR, NULL, NULL, NULL);
    jassert(img_convert_ctx != nullptr);  

    setSize(pCodecCtx->width, pCodecCtx->height);

} // Open()
answered Sep 21 '22 by Matthew Kraus