I'm new to AV technology and have been trying to combine FFmpeg with Apple's CoreVideo framework to process webcam captures.
First, I have webcam captures from CoreVideo (delivered through AVCaptureVideoDataOutputSampleBufferDelegate), represented as CMSampleBuffer:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {}
From that, without storing it as a temporary file, I would like to turn it into FFmpeg's AVPacket so I can process it.
Does anyone know which FFmpeg APIs I should be looking into?
Assuming you have access to the buffer's raw data, you first need to allocate an AVFrame, fill it with the raw data, and then encode the frame.
You also need to check the frame's pixel format (e.g. YUV422, YUV420P, ARGB, ...), since the layout and size of the raw data depend on it.
int result;
AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_H264);
AVCodecContext *codec_context = avcodec_alloc_context3(codec);
codec_context->width = frame_width;
codec_context->height = frame_height;
codec_context->pix_fmt = AV_PIX_FMT_YUV420P;     // must match your capture data
codec_context->time_base = (AVRational){1, 30};  // e.g. 30 fps
result = avcodec_open2(codec_context, codec, NULL);
assert(result >= 0);

uint8_t *frame_bytes; // this should hold the frame raw data
AVFrame *frm = av_frame_alloc();
frm->width = frame_width;
frm->height = frame_height;
frm->format = codec_context->pix_fmt;
result = avpicture_fill((AVPicture *)frm, frame_bytes,
                        codec_context->pix_fmt, frame_width, frame_height);
assert(result >= 0); // returns the image byte size, negative on error

AVPacket pkt;
int got_packet;
av_init_packet(&pkt);
result = avcodec_encode_video2(codec_context, &pkt, frm, &got_packet);
assert(!result);
Note that avpicture_fill() and avcodec_encode_video2() are deprecated in current FFmpeg; their replacements are av_image_fill_arrays() and the avcodec_send_frame() / avcodec_receive_packet() pair.
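On the capture side, here is a rough, untested sketch of getting at the raw bytes inside the delegate callback from the question. It assumes the sample buffers are CVPixelBuffer-backed and that you configured the output's videoSettings with a packed pixel format such as kCVPixelFormatType_32BGRA; planar formats need CVPixelBufferGetBaseAddressOfPlane() / CVPixelBufferGetBytesPerRowOfPlane() per plane instead.

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Get the pixel buffer backing this sample and lock it for CPU reads.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow   = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t width         = CVPixelBufferGetWidth(pixelBuffer);
    size_t height        = CVPixelBufferGetHeight(pixelBuffer);

    // ... copy/convert (baseAddress, bytesPerRow, width, height) into the
    // raw buffer you hand to FFmpeg, row by row, since bytesPerRow may be
    // padded beyond width * bytes-per-pixel ...

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```

If the capture format does not match the encoder's pixel format, convert with libswscale (sws_scale) before filling the AVFrame.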