
Creating GIF from QImages with ffmpeg

Tags: c++, ffmpeg, gif, qt

I would like to generate a GIF from QImages using ffmpeg, all of it programmatically (C++). I'm working with Qt 5.6 and a recent build of ffmpeg (build git-0a9e781, 2016-06-10).

I'm already able to convert these QImages to .mp4 and it works. I tried to use the same principle for the GIF, changing the pixel format and the codec. The GIF should be generated from two pictures (displayed 1 second each) at 15 FPS.

## INITIALIZATION
#####################################################################

// Filepath : "C:/Users/.../qt_temp.Jv7868.gif"  
// Allocating an AVFormatContext for an output format...
avformat_alloc_output_context2(&formatContext, NULL, NULL, filepath);

...

// Adding the video streams using the default format codecs and initializing the codecs.
stream = avformat_new_stream(formatContext, *codec);

AVCodecContext * codecContext = avcodec_alloc_context3(*codec);

codecContext->codec_id  = codecId;
codecContext->bit_rate  = 400000;
...
codecContext->pix_fmt   = AV_PIX_FMT_BGR8;

...

// Opening the codec...
avcodec_open2(codecContext, codec, NULL);

...

frame = allocPicture(codecContext->width, codecContext->height, codecContext->pix_fmt);
tmpFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_RGBA);

...

avformat_write_header(formatContext, NULL);

## ADDING A NEW FRAME
#####################################################################

// Getting in parameter the QImage: newFrame(const QImage & image)
const qint32 width  = image.width();
const qint32 height = image.height();

// Converting QImage into AVFrame
for (qint32 y = 0; y < height; y++) {
    const uint8_t * scanline = image.scanLine(y);

    for (qint32 x = 0; x < width * 4; x++) {
        tmpFrame->data[0][y * tmpFrame->linesize[0] + x] = scanline[x];
    }
}

...

// Scaling...
if (codecContext->pix_fmt != AV_PIX_FMT_BGRA) {
    if (!swsCtx) {
        swsCtx = sws_getContext(codecContext->width, codecContext->height,
                                AV_PIX_FMT_BGRA,
                                codecContext->width, codecContext->height,
                                codecContext->pix_fmt,
                                SWS_BICUBIC, NULL, NULL, NULL);
    }

    sws_scale(swsCtx,
              (const uint8_t * const *)tmpFrame->data,
              tmpFrame->linesize,
              0,
              codecContext->height,
              frame->data,
              frame->linesize);
}
frame->pts = nextPts++;

...

int gotPacket = 0;
AVPacket packet = {0};

av_init_packet(&packet);
avcodec_encode_video2(codecContext, &packet, frame, &gotPacket);

if (gotPacket) {
    av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
    packet.stream_index = stream->index;

    av_interleaved_write_frame(formatContext, &packet);
}
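
For the timing mentioned above (two pictures, one second each, at 15 FPS), the intent is roughly the following, assuming stream->time_base = {1, 15} as implied by the 15 FPS target; imageA and imageB are just placeholder names for the two source pictures, and newFrame() is the routine shown above:

// With stream->time_base = {1, 15}, one tick = 1/15 s.
// Holding a picture on screen for one second therefore means encoding it 15 times.
for (const QImage & image : { imageA, imageB }) {
    for (int i = 0; i < 15; ++i) {
        newFrame(image);   // sets frame->pts = nextPts++ internally
    }
}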

But when I try to modify the video codec and pixel format to match the GIF specifications, I run into issues. I tried several codecs such as AV_CODEC_ID_GIF and AV_CODEC_ID_RAWVIDEO, but none of them seems to work. During the initialization phase, avcodec_open2() always returns errors such as:

Specified pixel format rgb24 is invalid or not supported
Could not open video codec:  gif

EDIT 17/06/2016

Digging a little bit more, avcodec_open2() returns -22:

#define EINVAL          22      /* Invalid argument */
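
For reference, here is how such a return code can be turned into a readable message (just a sketch around the call above):

// Sketch: decode the ffmpeg error code returned by avcodec_open2().
char errbuf[AV_ERROR_MAX_STRING_SIZE] = {0};
const int errorCode = avcodec_open2(codecContext, codec, NULL);   // returns -22 here
if (errorCode < 0) {
    av_strerror(errorCode, errbuf, sizeof(errbuf));
    qDebug() << "avcodec_open2 failed:" << errbuf;                // "Invalid argument"
}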

EDIT 22/06/2016

Here are the flags used to compile ffmpeg:

"FFmpeg/Libav configuration: --disable-static --enable-shared --enable-gpl --enable-version3 --disable-w32threads --enable-nvenc --enable-avisynth --enable-bzlib --enable-fontconfig --enable-frei0r --enable-gnutls --enable-iconv --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libfreetype --enable-libgme --enable-libgsm --enable-libilbc --enable-libmodplug --enable-libmfx --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopenjpeg --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvo-amrwbenc --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx264 --enable-libx265 --enable-libxavs --enable-libxvid --enable-libzimg --enable-lzma --enable-decklink --enable-zlib"

Did I miss a crucial one for GIF?

EDIT 27/06/2016

Thanks to Gwen, I have a first output: I set codecContext->pix_fmt to AV_PIX_FMT_BGR8. However, I'm still facing issues with the generated GIF: it doesn't play, and the encoding appears to fail.

[Image: GIF generated on the command line with ffmpeg (left) vs. GIF generated programmatically (right)]

It looks like some options are not defined, or maybe the conversion between QImage and AVFrame is wrong? I updated the code above. It represents a lot of code, so I tried to keep it short; don't hesitate to ask for more details.

End of EDIT

I'm not really familiar with ffmpeg; any kind of help would be highly appreciated. Thank you.

asked Jun 16 '16 by Sierra


2 Answers

GIF only supports 256-color bitmaps (8 bits per pixel). This may be the reason why you get the Specified pixel format rgb24 is invalid or not supported error.

The pixel format you need to use is AV_PIX_FMT_PAL8 (8 bit with RGB32 palette).
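
You can also ask the GIF encoder itself which pixel formats it accepts by walking its pix_fmts list (a quick sketch; on recent builds this should print 8-bit formats such as rgb8, bgr8 and pal8, but not rgb24):

// Sketch: list the pixel formats supported by the GIF encoder.
AVCodec * gifCodec = avcodec_find_encoder(AV_CODEC_ID_GIF);
if (gifCodec && gifCodec->pix_fmts) {
    for (const AVPixelFormat * p = gifCodec->pix_fmts; *p != AV_PIX_FMT_NONE; ++p) {
        qDebug() << av_get_pix_fmt_name(*p);
    }
}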

answered by Gwen


Here is a way to convert QImages into a GIF using ffmpeg. I tried to be as clear as possible and removed the error handling.

First, initializing ffmpeg:

AVOutputFormat  * outputFormat  = Q_NULLPTR;
AVFormatContext * formatContext = Q_NULLPTR;

avformat_alloc_output_context2(&formatContext, NULL, NULL, filePath.data()); // i.e. filePath="C:/Users/.../qt_temp.Jv7868.gif", so the GIF muxer is selected

// Adding the video streams using the default format codecs and initializing the codecs...
outputFormat = formatContext->oformat;
if (outputFormat->video_codec != AV_CODEC_ID_NONE) {
    // Finding a registered encoder with a matching codec ID...
    *codec = avcodec_find_encoder(outputFormat->video_codec);

    // Adding a new stream to a media file...
    stream = avformat_new_stream(formatContext, *codec);
    stream->id = formatContext->nb_streams - 1;


    AVCodecContext * codecContext = avcodec_alloc_context3(*codec);

    switch ((*codec)->type) {
    case AVMEDIA_TYPE_VIDEO:
        codecContext->codec_id  = outputFormat->video_codec; // here, outputFormat->video_codec should be AV_CODEC_ID_GIF
        codecContext->bit_rate  = 400000;

        codecContext->width     = 1240;
        codecContext->height    = 874;

        codecContext->pix_fmt   = AV_PIX_FMT_RGB8;

        ...

        // Timebase: this is the fundamental unit of time (in seconds) in terms of which frame
        // timestamps are represented. For fixed-fps content, timebase should be 1/framerate
        // and timestamp increments should be identical to 1.
        stream->time_base       = (AVRational){1, fps}; // i.e. fps=1
        codecContext->time_base = stream->time_base;

        // Emit 1 intra frame every 12 frames at most
        codecContext->gop_size  = 12;

        if (codecContext->codec_id == AV_CODEC_ID_H264) {
            av_opt_set(codecContext->priv_data, "preset", "slow", 0);
        }
        break;
    }

    if (formatContext->oformat->flags & AVFMT_GLOBALHEADER) {
        codecContext->flags |= AV_CODEC_FLAG_GLOBAL_HEADER;
    }
}

avcodec_open2(codecContext, codec, NULL);

// Here we need 3 frames. Basically, the QImage is first extracted as AV_PIX_FMT_BGRA.
// We then need to convert it to AV_PIX_FMT_RGB8, which is required by the .gif format.
// If we do that directly, there will be some artefacts and bad effects... to prevent that,
// we FIRST convert AV_PIX_FMT_BGRA into AV_PIX_FMT_YUV420P, THEN into AV_PIX_FMT_RGB8.
frame = allocPicture(codecContext->width, codecContext->height, codecContext->pix_fmt); // here, codecContext->pix_fmt should be AV_PIX_FMT_RGB8
tmpFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_BGRA);
yuvFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_YUV420P);

avcodec_parameters_from_context(stream->codecpar, codecContext);

av_dump_format(formatContext, 0, filePath.data(), 1);

if (!(outputFormat->flags & AVFMT_NOFILE)) {
    avio_open(&formatContext->pb, filePath.data(), AVIO_FLAG_WRITE);
}

// Writing the stream header, if any...
avformat_write_header(formatContext, NULL);
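
allocPicture() is not shown above; it is essentially the alloc_picture() helper from FFmpeg's muxing example, i.e. something like this (sketch, give or take the buffer alignment):

static AVFrame * allocPicture(int width, int height, AVPixelFormat pixelFormat)
{
    AVFrame * picture = av_frame_alloc();
    if (!picture) {
        return Q_NULLPTR;
    }

    picture->format = pixelFormat;
    picture->width  = width;
    picture->height = height;

    // Allocate the buffers for the frame data.
    if (av_frame_get_buffer(picture, 32) < 0) {
        av_frame_free(&picture);
        return Q_NULLPTR;
    }

    return picture;
}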

Then the main part, adding a QImage (received from a loop for example):

// -> parameter: QImage image
const qint32 width  = image.width();
const qint32 height = image.height();

// When we pass a frame to the encoder, it may keep a reference to it internally;
// make sure we do not overwrite it here!
av_frame_make_writable(tmpFrame);

// Converting QImage to an AV_PIX_FMT_BGRA AVFrame
// (assumes a 32-bit QImage format such as QImage::Format_ARGB32; see the note after this block)
for (qint32 y = 0; y < height; y++) {
    const uint8_t * scanline = image.scanLine(y);

    for (qint32 x = 0; x < width * 4; x++) {
        tmpFrame->data[0][y * tmpFrame->linesize[0] + x] = scanline[x];
    }
}

// Make sure to clear the frame. It prevents a bug that displays only the
// first captured frame on the GIF export.
if (frame) {
    av_frame_free(&frame);
    frame = Q_NULLPTR;
}
frame = allocPicture(codecContext->width, codecContext->height, codecContext->pix_fmt);

if (yuvFrame) {
    av_frame_free(&yuvFrame);
    yuvFrame = Q_NULLPTR;
}
yuvFrame = allocPicture(codecContext->width, codecContext->height, AV_PIX_FMT_YUV420P);

// Converting BGRA -> YUV420P...
if (!swsCtx) {
    swsCtx = sws_getContext(width, height,
                            AV_PIX_FMT_BGRA,
                            width, height,
                            AV_PIX_FMT_YUV420P,
                            swsFlags, NULL, NULL, NULL);
}

// ...then converting YUV420P -> RGB8 (the native GIF pixel format)
if (!swsGIFCtx) {
    swsGIFCtx = sws_getContext(width, height,
                                AV_PIX_FMT_YUV420P,
                                codecContext->width, codecContext->height,
                                codecContext->pix_fmt,
                                this->swsFlags, NULL, NULL, NULL);
}

// This double scaling prevents some artifacts in the GIF and significantly
// improves the display quality
sws_scale(swsCtx,
          (const uint8_t * const *)tmpFrame->data,
          tmpFrame->linesize,
          0,
          codecContext->height,
          yuvFrame->data,
          yuvFrame->linesize);
sws_scale(swsGIFCtx,
          (const uint8_t * const *)yuvFrame->data,
          yuvFrame->linesize,
          0,
          codecContext->height,
          frame->data,
          frame->linesize);

...

AVPacket packet;
int gotPacket = 0;

av_init_packet(&packet);

// Packet data will be allocated by the encoder
packet.data = NULL;
packet.size = 0;

frame->pts = nextPts++; // nextPts starts at 0
avcodec_encode_video2(codecContext, &packet, frame, &gotPacket);

if (gotPacket) {
    // Rescale output packet timestamp values from codec to stream timebase
    av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
    packet.stream_index = stream->index;

    // Write the compressed frame to the media file.
    av_interleaved_write_frame(formatContext, &packet);

    av_packet_unref(&packet);
}
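
A note on the BGRA copy above: it assumes the QImage uses a 32-bit format (on little-endian machines, QImage::Format_ARGB32 is stored as B,G,R,A bytes, which matches AV_PIX_FMT_BGRA). If the incoming images may use another format, it is safer to convert them first, for example:

// Sketch: force a 32-bit layout before copying into the BGRA tmpFrame.
QImage converted = image;
if (converted.format() != QImage::Format_ARGB32
        && converted.format() != QImage::Format_RGB32) {
    converted = converted.convertToFormat(QImage::Format_ARGB32);
}
// 'converted' can then be copied scanline by scanline as above, e.g.:
// memcpy(tmpFrame->data[0] + y * tmpFrame->linesize[0],
//        converted.constScanLine(y), converted.width() * 4);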

Closing ffmpeg:

// Retrieving delayed frames if any...
// Note: mainly used for video generation, it might be useless for .gif.
for (int gotOutput = 1; gotOutput;) {
    avcodec_encode_video2(codecContext, &packet, NULL, &gotOutput);

    if (gotOutput) {
        // Rescale output packet timestamp values from codec to stream timebase
        av_packet_rescale_ts(&packet, codecContext->time_base, stream->time_base);
        packet.stream_index = stream->index;

        // Write the compressed frame to the media file.
        av_interleaved_write_frame(formatContext, &packet);
        av_packet_unref(&packet);
    }
}

av_write_trailer(formatContext);

avcodec_free_context(&codecContext);
av_frame_free(&frame);
av_frame_free(&tmpFrame);
av_frame_free(&yuvFrame);
sws_freeContext(swsCtx);
sws_freeContext(swsGIFCtx);

if (!(outputFormat->flags & AVFMT_NOFILE)) {
    // Closing the output file...
    avio_closep(&formatContext->pb);
}

avformat_free_context(formatContext);

I don't think it is the easiest way, but at least it worked for me. I'm leaving the question open; please feel free to comment on, improve, or answer this.

answered by Sierra