Eventually, I'm looking to create a shader that can convert the video to black and white for me (and then apply some other effects that I'm not sure I should disclose), simply because doing this on the CPU gets me around 1 frame per second.
Anyway, for right now, I simply wish to display the video frames to the screen. I can draw triangles to the screen, so I know that my OpenGL view is working correctly, and I'm getting NSLogs from the
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
method. This is where I'm attempting to do all the drawing. Unfortunately, I'm doing something wrong, and the camera frame isn't drawing.
Here's my simple vertex shader:
attribute vec4 position;
attribute vec4 inputTextureCoordinate;
varying vec2 textureCoordinate;
void main()
{
    gl_Position = position;
    textureCoordinate = inputTextureCoordinate.xy;
}
(I know that it's being compiled and is working, again because of the primitives I'm able to render.)
Here's my simplistic fragment shader:
varying highp vec2 textureCoordinate;
uniform sampler2D videoFrame;
void main()
{
    gl_FragColor = texture2D(videoFrame, textureCoordinate);
}
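Eventually I'd swap something like this in for the black-and-white conversion I mentioned at the top. This is just an untested sketch on my part, using the standard Rec. 601 luminance weights:

varying highp vec2 textureCoordinate;
uniform sampler2D videoFrame;
void main()
{
    // sample the camera frame, then collapse RGB down to a single luminance value
    lowp vec4 color = texture2D(videoFrame, textureCoordinate);
    highp float luminance = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(luminance), 1.0);
}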
And... here's where I'm attempting to put it all together:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    //NSLog(@"Frame...");
    CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(cameraFrame, 0);
    int bufferHeight = CVPixelBufferGetHeight(cameraFrame);
    int bufferWidth = CVPixelBufferGetWidth(cameraFrame);

    // these have to be set
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // this is necessary for non-power-of-two textures
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // upload the camera frame as the image for the currently bound texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, bufferWidth, bufferHeight, 0, GL_BGRA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(cameraFrame));

    static const GLfloat squareVertices[] = {
        -1.0f, -1.0f,
         1.0f, -1.0f,
        -1.0f,  1.0f,
         1.0f,  1.0f,
    };
    static const GLfloat textureVertices[] = {
        1.0f, 1.0f,
        1.0f, 0.0f,
        0.0f, 1.0f,
        0.0f, 0.0f,
    };

    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, videoFrameTexture);

    // update uniform values
    glUniform1i(videoFrameUniform, 0);

    glVertexAttribPointer(0, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(1);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
    glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
}
Doesn't work right.
Any help would be extremely appreciated, I'm out of ideas and at a loss here. Thanks in advance!
Edit: Here's the code I use to set up the view, load OpenGL, and start the capture session.
- (id)initWithFrame:(CGRect)frame {
    NSLog(@"Yo.");
    self = [super initWithFrame:frame];
    if (self) {
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)[super layer];
        [eaglLayer setOpaque:YES];
        [eaglLayer setFrame:[self bounds]];
        [eaglLayer setContentsScale:2.0];

        glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        if (!glContext || ![EAGLContext setCurrentContext:glContext]) {
            [self release];
            return nil;
        }

        // enable 2D textures
        glEnable(GL_TEXTURE_2D);

        // generate the frame and render buffers at the pointer locations of the frameBuffer and renderBuffer variables
        glGenFramebuffers(1, &frameBuffer);
        glGenRenderbuffers(1, &renderBuffer);
        // bind the frame and render buffers; they can now be modified or consumed by later OpenGL calls
        glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
        // allocate storage for the renderbuffer from the layer (for offscreen rendering you'd use glRenderbufferStorage() instead)
        [glContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:eaglLayer];
        // attach the renderbuffer to the framebuffer
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderBuffer);
        // set up the coordinate system
        glViewport(0, 0, frame.size.width, frame.size.height);

        //|||||||||||||||--Remove this stuff later--||||||||||||||//
        // create the vertex and fragment shaders
        GLuint vertexShader, fragmentShader;
        vertexShader = glCreateShader(GL_VERTEX_SHADER);
        fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
        // get their source paths, then the source itself, stored as C strings
        NSString *vertexShaderPath = [[NSBundle mainBundle] pathForResource:@"testShader" ofType:@"vsh"];
        NSString *fragmentShaderPath = [[NSBundle mainBundle] pathForResource:@"testShader" ofType:@"fsh"];
        const GLchar *vertexSource = (GLchar *)[[NSString stringWithContentsOfFile:vertexShaderPath encoding:NSUTF8StringEncoding error:nil] UTF8String];
        const GLchar *fragmentSource = (GLchar *)[[NSString stringWithContentsOfFile:fragmentShaderPath encoding:NSUTF8StringEncoding error:nil] UTF8String];
        NSLog(@"\n--- Vertex Source ---\n%s\n--- Fragment Source ---\n%s", vertexSource, fragmentSource);
        // associate the source strings with each shader
        glShaderSource(vertexShader, 1, &vertexSource, NULL);
        glShaderSource(fragmentShader, 1, &fragmentSource, NULL);
        // compile the vertex shader, check for errors
        glCompileShader(vertexShader);
        GLint compiled;
        glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &compiled);
        if (!compiled) {
            GLint infoLen = 0;
            glGetShaderiv(vertexShader, GL_INFO_LOG_LENGTH, &infoLen);
            GLchar *infoLog = (GLchar *)malloc(sizeof(GLchar) * infoLen);
            glGetShaderInfoLog(vertexShader, infoLen, NULL, infoLog);
            NSLog(@"\n--- Vertex Shader Error ---\n%s", infoLog);
            free(infoLog);
        }
        // compile the fragment shader, check for errors
        glCompileShader(fragmentShader);
        glGetShaderiv(fragmentShader, GL_COMPILE_STATUS, &compiled);
        if (!compiled) {
            GLint infoLen = 0;
            glGetShaderiv(fragmentShader, GL_INFO_LOG_LENGTH, &infoLen);
            GLchar *infoLog = (GLchar *)malloc(sizeof(GLchar) * infoLen);
            glGetShaderInfoLog(fragmentShader, infoLen, NULL, infoLog);
            NSLog(@"\n--- Fragment Shader Error ---\n%s", infoLog);
            free(infoLog);
        }
        // create a program and attach both shaders
        testProgram = glCreateProgram();
        glAttachShader(testProgram, vertexShader);
        glAttachShader(testProgram, fragmentShader);
        // bind some attribute locations...
        glBindAttribLocation(testProgram, 0, "position");
        glBindAttribLocation(testProgram, 1, "inputTextureCoordinate");
        // link and use the program, make sure it worked :P
        glLinkProgram(testProgram);
        glUseProgram(testProgram);
        GLint linked;
        glGetProgramiv(testProgram, GL_LINK_STATUS, &linked);
        if (!linked) {
            GLint infoLen = 0;
            glGetProgramiv(testProgram, GL_INFO_LOG_LENGTH, &infoLen);
            GLchar *infoLog = (GLchar *)malloc(sizeof(GLchar) * infoLen);
            glGetProgramInfoLog(testProgram, infoLen, NULL, infoLog);
            NSLog(@"%s", infoLog);
            free(infoLog);
        }
        videoFrameUniform = glGetUniformLocation(testProgram, "videoFrame");

#if !TARGET_IPHONE_SIMULATOR
        // holds any error from the capture setup
        NSError *error = nil;
        // create a new capture session, set the preset, create and add the video camera input
        AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
        [captureSession setSessionPreset:AVCaptureSessionPreset640x480];
        AVCaptureDevice *videoCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCamera error:&error];
        [captureSession addInput:videoInput];
        // set up the data output object, tell it to discard late video frames for no lag
        AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
        dataOutput.alwaysDiscardsLateVideoFrames = YES;
        // create a new dispatch queue for the sample buffer callbacks
        dispatch_queue_t queue;
        queue = dispatch_queue_create("cameraQueue", NULL);
        [dataOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);
        // ask the video data output for BGRA frames
        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [dataOutput setVideoSettings:videoSettings];
        // add the data output
        [captureSession addOutput:dataOutput];
        // start the capture session running
        [captureSession startRunning];
#endif
        //|||||||||||||||--Remove this stuff later--||||||||||||||//

        // draw the view
        [self drawView];
    }
    return self;
}
- (void)drawView {
    // set the clear color, then clear the buffer
    glClearColor(0.2f, 0.589f, 0.12f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    [glContext presentRenderbuffer:GL_RENDERBUFFER];
}
This sample application of mine has three shaders that perform various levels of processing and display of camera frames to the screen. I explain how this application works in my post here, as well as in the OpenGL ES 2.0 session of my class on iTunes U.
In fact, the shaders here look to be direct copies of the ones I used for the direct display of video frames, so you're probably already using that application as a template. I assume that my sample application runs just fine on your device?
If so, then there has to be some difference between the starting sample and your application. You appear to have simplified my sample by pulling some of the code I had in the -drawFrame method into the end of your delegate method, which should work fine, so that's not the problem. I'll assume that the rest of your OpenGL setup is identical to what I had in that sample, so the scene is configured properly.
Looking through my code and comparing it to what you've posted, all that I can see that is different is a missing glUseProgram() call in your code. If you've properly compiled and linked the shader program in code away from what you've shown here, you just need to employ glUseProgram() somewhere before you update the uniform values.
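For example, using the variable names from your code, the drawing section of your delegate method would start with something like:

glUseProgram(testProgram);

// update uniform values
glUniform1i(videoFrameUniform, 0);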
Also, you're binding the renderbuffer, but you may need

[context presentRenderbuffer:GL_RENDERBUFFER];

after your last line there to make sure the contents get to the screen (where context is your EAGLContext instance).
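Putting both fixes together, the end of your delegate method would look something like this (glContext being the EAGLContext variable from your setup code):

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
glBindRenderbuffer(GL_RENDERBUFFER, renderBuffer);
[glContext presentRenderbuffer:GL_RENDERBUFFER];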