FFmpeg: How to get AVPacket from OSX CMSampleBufferRef

I'm new to AV technology and have been trying to combine FFmpeg with Apple's CoreVideo framework to process webcam captures.

First, I have webcam captures from CoreVideo (delivered via the AVCaptureVideoDataOutputSampleBufferDelegate callback), represented as a CMSampleBufferRef:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
  fromConnection:(AVCaptureConnection *)connection {}
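For context, the raw pixel data can be pulled out of the CMSampleBufferRef inside that callback using CoreVideo's pixel-buffer accessors. A minimal sketch, assuming a single-plane packed format such as kCVPixelFormatType_32BGRA (the function name `processSampleBuffer` is illustrative):

```c
#include <CoreMedia/CoreMedia.h>
#include <CoreVideo/CoreVideo.h>

void processSampleBuffer(CMSampleBufferRef sampleBuffer) {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // the base address is only valid while the buffer is locked
    CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width         = CVPixelBufferGetWidth(imageBuffer);
    size_t height        = CVPixelBufferGetHeight(imageBuffer);

    // baseAddress / bytesPerRow / width / height can now be handed to
    // FFmpeg without ever writing a temporary file

    CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);
}
```

Note that the delegate may deliver planar formats (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), in which case the per-plane accessors (CVPixelBufferGetBaseAddressOfPlane and friends) are needed instead.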

From that, without storing it as a temporary file, I would like to turn it into an FFmpeg AVPacket so I can process it.

Does anyone know which FFmpeg APIs I should be looking into?

asked Mar 11 '26 16:03 by wao813

1 Answer

Assuming you have access to the buffer's raw data, you first need to create an AVPicture, fill it with the raw data, and then encode the frame.

You may also need to check the frame's pixel format (e.g. YUV422, YUV420, ARGB, ...), since the encoder expects the data in a specific layout.

int result;
AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
AVCodecContext *codec_context = avcodec_alloc_context3(codec);

// the encoder must know the frame geometry, pixel format and time base
// before it is opened
codec_context->width = frame_width;
codec_context->height = frame_height;
codec_context->pix_fmt = AV_PIX_FMT_YUV420P;
codec_context->time_base = (AVRational){1, 30};

result = avcodec_open2(codec_context, codec, NULL);
assert(result >= 0);

uint8_t *frame_bytes; // this should hold the frame raw data
AVFrame *frm = av_frame_alloc();
frm->width = frame_width;
frm->height = frame_height;
frm->format = AV_PIX_FMT_YUV420P;

// avpicture_fill returns the size of the image data on success,
// or a negative error code
result = avpicture_fill((AVPicture *) frm, frame_bytes, AV_PIX_FMT_YUV420P,
                        frame_width, frame_height);
assert(result >= 0);

AVPacket pkt;
int got_packet;

av_init_packet(&pkt);
pkt.data = NULL; // let the encoder allocate the packet buffer
pkt.size = 0;
result = avcodec_encode_video2(codec_context, &pkt, frm, &got_packet);
assert(!result);
answered Mar 14 '26 05:03 by Mohamed El-Sayed