
Is there a way to get the brightness level of a camera stream on iOS?

I am using the iPhone/iPad camera to get a video stream and running recognition on it, but changes in lighting have a negative impact on robustness. I have tested different settings under different lighting conditions and can get it to work, but what I really need is for the settings to adjust at run time.

I can run a simple brightness check on each frame, but the camera's auto-adjustment throws my results off. I can watch for sharp changes and re-run the checks then, but gradual changes would throw my results off as well.
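For reference, the simple per-frame check I mean is along these lines (a minimal sketch, assuming the video output is configured for kCVPixelFormatType_420YpCbCr8BiPlanarFullRange so that plane 0 holds the luma bytes):

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Rough average of the luma (Y) plane of one frame: 0 = black, 255 = white.
static float averageLuma(CMSampleBufferRef sampleBuffer) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *luma      = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t width       = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height      = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    uint64_t sum = 0;
    for (size_t y = 0; y < height; y++) {
        for (size_t x = 0; x < width; x++) {
            sum += luma[y * bytesPerRow + x];
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return (float)sum / (float)(width * height);
}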

Ideally I'd like to access the camera/EXIF data for the stream and see what it registers as the unfiltered brightness. Is there a way to do this?

(I am targeting devices running iOS 5 and above.)

Thank you

asked Dec 04 '12 by Hugo Scott-Slade



2 Answers

Available in iOS 4.0 and above: it's possible to get the EXIF information from the CMSampleBufferRef.

// Import ImageIO and add the framework to your project.
#import <ImageIO/CGImageProperties.h>

In your sample buffer delegate, toll-free bridging gives you an NSDictionary of results from Core Media's CMGetAttachment:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Toll-free bridge the CFDictionaryRef attachment; __bridge keeps ARC from taking ownership.
    NSDictionary *dict = (__bridge NSDictionary *)CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
    NSLog(@"EXIF attachments: %@", dict);
}
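From that dictionary, the metered brightness for the frame is one key lookup away; a minimal follow-on sketch (the autoBrightness property name is an example, not part of the API):

// Given the `dict` obtained inside the callback above:
NSNumber *brightnessValue = [dict objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue];
if (brightnessValue != nil) {
    self.autoBrightness = [brightnessValue floatValue]; // EXIF BrightnessValue metered by the camera for this frame
}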
answered Nov 15 '22 by markturnip


Complete code, as used in my own app:

- (void)setupAVCapture {

    //-- Setup capture session.
    _session = [[AVCaptureSession alloc] init];
    [_session beginConfiguration];

    //-- Set preset session size.
    [_session setSessionPreset:AVCaptureSessionPreset1920x1080];

    //-- Create a video device and an input from that device. Add the input to the capture session.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error)
        assert(0);

    [_session addInput:input];

    //-- Create the output for the capture session.
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; // Probably want to set this to NO when recording

    //-- Set to YUV420.
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]]; // Necessary for manual preview

    // Set dispatch to be on the main thread so OpenGL can do things with the data
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];

    [_session startRunning];
}
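For completeness, the object owning this code has to adopt the sample buffer delegate protocol and expose the properties the delegate method below writes to; a minimal sketch (the class name is an assumption, the property names match the code):

#import <AVFoundation/AVFoundation.h>

@interface CameraController : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *session;   // backs the _session ivar used above
@property (nonatomic, assign) float autoBrightness;
@property (nonatomic, assign) float lumaThreshold;
@end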

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
                                                                 sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc]
                              initWithDictionary:(__bridge NSDictionary*)metadataDict];
    CFRelease(metadataDict);
    NSDictionary *exifMetadata = [[metadata
                                   objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    self.autoBrightness = [[exifMetadata
                         objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    float oldMin = -4.639957; // dark
    float oldMax = 4.639957; // light
    if (self.autoBrightness > oldMax) oldMax = self.autoBrightness; // adjust oldMax if brighter than expected oldMax

    self.lumaThreshold = ((self.autoBrightness - oldMin) * ((3.0 - 1.0) / (oldMax - oldMin))) + 1.0;

    NSLog(@"brightnessValue %f", self.autoBrightness);
    NSLog(@"lumaThreshold %f", self.lumaThreshold);
}
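To make the remapping concrete: a mid-range reading of autoBrightness = 0 gives lumaThreshold = ((0 - (-4.639957)) * (2.0 / 9.279914)) + 1.0 = 2.0, the midpoint of the 1.0 to 3.0 output range, while readings at oldMin and oldMax map to 1.0 and 3.0 respectively.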

The lumaThreshold variable is sent as a uniform to my fragment shader, which multiplies the Y sampler texture by it to get the ideal luminosity for the brightness of the environment. Right now it uses the back camera; I'll probably switch to the front camera, since I'm only changing the "brightness" of the screen to adjust for indoor/outdoor viewing, and the user's eyes face the front camera (not the back one).
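For reference, handing lumaThreshold to the shader is one uniform upload per draw; a minimal sketch, assuming a compiled GL ES 2.0 program handle _program and a uniform declared as lumaThreshold in the fragment shader (both names are assumptions matching the description above):

#import <OpenGLES/ES2/gl.h>

- (void)uploadLumaThreshold {
    // Called from the draw loop before issuing the draw call.
    glUseProgram(_program);
    GLint lumaThresholdLocation = glGetUniformLocation(_program, "lumaThreshold");
    glUniform1f(lumaThresholdLocation, self.lumaThreshold);
}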

answered Nov 15 '22 by James Bush