I have an AVPlayerLayer which I would like to create an OpenGL texture from. I'm comfortable with OpenGL textures, and even comfortable with converting a CGImageRef into an OpenGL texture. It seems to me the code below should work, but I get just plain black. What am I doing wrong? Do I need to set any properties on the CALayer / AVPlayerLayer first?
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
int width = (int)[layer bounds].size.width;
int height = (int)[layer bounds].size.height;

CGContextRef context = CGBitmapContextCreate(NULL,
                                             width,
                                             height,
                                             8,
                                             width * 4,
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

if(context == NULL) {
    ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 1");
    return;
}
[[layer presentationLayer] renderInContext:context];
CGImageRef cgImage = CGBitmapContextCreateImage(context);

int bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;
if(bytesPerPixel == 3) bytesPerPixel = 4;

GLubyte *pixels = (GLubyte *)malloc(width * height * bytesPerPixel);

CGContextRelease(context);
context = CGBitmapContextCreate(pixels,
                                width,
                                height,
                                CGImageGetBitsPerComponent(cgImage),
                                width * bytesPerPixel,
                                CGImageGetColorSpace(cgImage),
                                kCGImageAlphaPremultipliedLast);
if(context == NULL) {
    ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 2");
    free(pixels);
    return;
}

CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), cgImage);
int glMode;
switch(bytesPerPixel) {
    case 1:
        glMode = GL_LUMINANCE;
        break;
    case 3:
        glMode = GL_RGB;
        break;
    case 4:
    default:
        glMode = GL_RGBA;
        break;
}
if(texture.bAllocated() == false || texture.getWidth() != width || texture.getHeight() != height) {
    NSLog(@"getTextureFromLayer: allocating texture %i, %i\n", width, height);
    texture.allocate(width, height, glMode, true);
}
// test texture
// for(int i=0; i<width*height*4; i++) pixels[i] = ofRandomuf() * 255;
texture.loadData(pixels, width, height, glMode);
CGContextRelease(context);
CFRelease(cgImage);
free(pixels);
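(If it helps with debugging: one check I could add just before the free(pixels) above — a sketch, not something that's in my code yet — is logging the first pixel, to see whether renderInContext: wrote anything at all:)

// sanity check (sketch): if renderInContext: drew nothing, this logs 0 0 0 0
NSLog(@"getTextureFromLayer: first pixel: %i %i %i %i",
      pixels[0], pixels[1], pixels[2], pixels[3]);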
P.S. The variable 'texture' is a C++ OpenGL (ES-compatible) texture object which I know works. If I uncomment the 'test texture' for-loop, which fills the texture with random noise, I can see the noise, so the problem is definitely earlier in the chain.
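For reference, this is roughly the raw-GL path I assume allocate / loadData wrap — a minimal sketch using standard GLES calls, not the actual openFrameworks source:

// assumed upload path (sketch): allocate storage once, then sub-image each frame
GLuint texId;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// texture.allocate(width, height, GL_RGBA, true): reserve storage
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

// texture.loadData(pixels, width, height, GL_RGBA): upload pixels
glBindTexture(GL_TEXTURE_2D, texId);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                GL_RGBA, GL_UNSIGNED_BYTE, pixels);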
In response to Nick Weaver's reply, I tried a different approach, and now I'm always getting NULL back from copyNextSampleBuffer with status == 3 (AVAssetReaderStatusFailed). Am I missing something?
AVPlayer *videoPlayer;
AVPlayerLayer *videoLayer;
AVAssetReader *videoReader;
AVAssetReaderTrackOutput *videoOutput;

videoPlayer = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:[NSString stringWithUTF8String:videoPath.c_str()]]];

if(videoPlayer == nil) {
    NSLog(@"videoPlayer == nil ERROR LOADING %s\n", videoPath.c_str());
} else {
    NSLog(@"videoPlayer: %@", videoPlayer);
    videoLayer = [[AVPlayerLayer playerLayerWithPlayer:videoPlayer] retain];
    videoLayer.frame = [ThreeDView instance].bounds;
    // [[ThreeDView instance].layer addSublayer:videoLayer]; // test to see if it's loading and running

    AVAsset *asset = videoPlayer.currentItem.asset;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (NSString *)kCVPixelBufferPixelFormatTypeKey,
                              nil];

    videoReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
    videoOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:[tracks objectAtIndex:0] outputSettings:settings];
    [videoReader addOutput:videoOutput];
    [videoReader startReading];
}
if(videoPlayer == nil) {
    ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoPlayer == nil");
    return;
}

if(videoOutput == nil) {
    ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoOutput == nil");
    return;
}

CMSampleBufferRef sampleBuffer = [videoOutput copyNextSampleBuffer];
if(sampleBuffer == NULL) {
    ofLog(OF_LOG_ERROR, "Shot::drawVideo: sampleBuffer == NULL, status: %i", (int)videoReader.status);
    return;
}
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

CVPixelBufferLockBaseAddress(imageBuffer, 0);
unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);

int width  = (int)CVPixelBufferGetWidth(imageBuffer);
int height = (int)CVPixelBufferGetHeight(imageBuffer);

if(videoTexture.bAllocated() == false || videoTexture.getWidth() != width || videoTexture.getHeight() != height) {
    NSLog(@"Shot::drawVideo() allocating texture %i, %i\n", width, height);
    videoTexture.allocate(width, height, GL_RGBA, true);
}

videoTexture.loadData(pixels, width, height, GL_BGRA);

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CFRelease(sampleBuffer); // release only after we're done with imageBuffer, which the sample buffer owns
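One gap I notice in my own code: I pass error:nil to initWithAsset:, so when the reader fails I never see the underlying reason. A sketch of how the failure could be surfaced, using AVAssetReader's documented error property and the BOOL return value of startReading:

NSError *error = nil;
videoReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
if(videoReader == nil) {
    NSLog(@"Shot::drawVideo: failed to create AVAssetReader: %@", error);
}
if(![videoReader startReading]) {
    NSLog(@"Shot::drawVideo: startReading failed: %@", videoReader.error);
}

// later, when copyNextSampleBuffer returns NULL:
if(videoReader.status == AVAssetReaderStatusFailed) {
    NSLog(@"Shot::drawVideo: reader failed: %@", videoReader.error);
}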
I think "iOS4: how do I use video file as an OpenGL texture?" will be helpful for your question.
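In case it's useful: once you have a locked kCVPixelFormatType_32BGRA CVPixelBuffer, the pixels can go straight to GL via the APPLE_texture_format_BGRA8888 extension, roughly like this (a sketch; texId is assumed to be a texture you've already created):

// sketch: upload a locked 32BGRA CVPixelBuffer straight into a GL texture;
// GL_BGRA_EXT comes from the APPLE_texture_format_BGRA8888 extension on iOS
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
GLubyte *base = (GLubyte *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t w = CVPixelBufferGetWidth(pixelBuffer);
size_t h = CVPixelBufferGetHeight(pixelBuffer);
// note: rows can be padded; check CVPixelBufferGetBytesPerRow(pixelBuffer) == w * 4
glBindTexture(GL_TEXTURE_2D, texId);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, base);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);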