
Encode buffer captured by OpenGL in C

I am trying to use OpenGL to capture my screen's back buffer and then H.264-encode the buffer using FFmpeg's libavcodec library. The issue I'm having is that I would like to encode the video as AV_PIX_FMT_YUV420P, but the back-buffer capture function provided by OpenGL, glReadPixels(), only supports formats like GL_RGB. As you can see below, I try to use FFmpeg's sws_scale() function to convert from RGB to YUV, but the following code crashes at the sws_scale() line. Any ideas on how I can encode the OpenGL back buffer?

// CAPTURE BACK BUFFER USING OPENGL
    int width = 1280, height = 720;
    BYTE* pixels = (BYTE *) malloc(sizeof(BYTE));
    glReadPixels(0, 720, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);

//CREATE FFMPEG VARIABLES
    avcodec_register_all();

    AVCodec *codec;
    AVCodecContext *context;
    struct SwsContext *sws;
    AVPacket packet;
    AVFrame *frame;

    codec = avcodec_find_encoder(AV_CODEC_ID_H264);
    context = avcodec_alloc_context3(codec);
    context->dct_algo = FF_DCT_FASTINT;
    context->bit_rate = 400000;
    context->width = width;
    context->height = height;
    context->time_base.num = 1;
    context->time_base.den = 30;
    context->gop_size = 1;
    context->max_b_frames = 1;
    context->pix_fmt = AV_PIX_FMT_YUV420P;

    avcodec_open2(context, codec, NULL);

// CONVERT TO YUV AND ENCODE
    frame = av_frame_alloc();
    int frame_size = avpicture_get_size(AV_PIX_FMT_YUV420P, width, height);
    uint8_t *frame_buffer = (uint8_t *) malloc(frame_size);
    avpicture_fill((AVPicture *) frame, frame_buffer, AV_PIX_FMT_YUV420P, width, height);
    sws = sws_getContext(width, height, AV_PIX_FMT_RGB32, width, height, AV_PIX_FMT_YUV420P, SWS_FAST_BILINEAR, 0, 0, 0);

    uint8_t *in_data[1] = {(uint8_t *) pixels};
    int in_linesize[1] = {width * 4};


// PROGRAM CRASHES HERE


    sws_scale(sws, in_data, in_linesize, 0, height, frame->data, frame->linesize);

    av_init_packet(&packet);
    packet.data = NULL;
    packet.size = 0;
    int success;

    avcodec_encode_video2(context, &packet, frame, &success);
asked Jan 04 '20 by M. Ying

1 Answer

Your pixels buffer is too small; you malloc only one BYTE instead of width*height*4 bytes:

BYTE* pixels = (BYTE *) malloc(width*height*4);

Your glReadPixels call is also incorrect:

  • Passing y=720 causes it to read outside the window. Remember that OpenGL's coordinate system has the y-axis pointing upwards.
  • AV_PIX_FMT_RGB32 expects four bytes per pixel, whereas GL_RGB writes three bytes per pixel, so you need GL_RGBA or GL_BGRA.
  • Of the two, I'm pretty sure it should be GL_BGRA: AV_PIX_FMT_RGB32 treats each pixel as a 32-bit integer, so on a little-endian machine blue comes first in memory. OpenGL treats each channel as a byte, so GL_BGRA matches that layout.

To summarize, try:

glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, pixels);

Additionally, because the OpenGL y-axis points upwards while the ffmpeg y-axis points downwards, you may need to flip the image vertically. It can be done with the following trick:

uint8_t *in_data[1] = {(uint8_t *) pixels + (height-1)*width*4}; // address of the last line
int in_linesize[1] = {- width * 4}; // negative stride
answered Sep 25 '22 by Yakov Galka