
Capture all NSWindows as active images like Mission Control in Mac OS X

I'm looking to aggregate live representations of all windows. Much like Mission Control (Exposé), I want to extremely quickly access the image buffer of any given NSWindow or screen. Ideally, I want to composite these live images in my own OpenGL context so I can manipulate them (scale and move the windows screen captures around).

Things that are too slow:

  • CGDisplayCreateImage
  • CGWindowListCreateImage
  • CGDisplayIDToOpenGLDisplayMask & CGLCreateContext & CGBitmapContextCreate

Any other ideas? I'm trying to achieve 60 fps capture/composite/output but the best I can get with any of these methods is ~5 fps (on a retina display capturing the entire screen).

Skyler asked Mar 25 '14




1 Answer

Unfortunately, I haven't found a way to quickly capture the framebuffers of individual windows, but I figured out the next best thing: a method for quickly capturing a live view of the entire screen(s) into OpenGL.

AVFoundation Setup

// Create a capture session that grabs the main display at up to 60 fps.
_session = [[AVCaptureSession alloc] init];
_session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureScreenInput *input = [[AVCaptureScreenInput alloc] initWithDisplayID:kCGDirectMainDisplay];
input.minFrameDuration = CMTimeMake(1, 60); // cap capture at 60 fps
[_session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setAlwaysDiscardsLateVideoFrames:YES]; // drop frames rather than fall behind
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:output];

[_session startRunning];
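One piece the setup above leaves implicit is the creation of `_textureCache`, which the frame callback uses and the cleanup step releases. A minimal sketch of that one-time setup, assuming you pull the `CGLContextObj` and `CGLPixelFormatObj` from your own `NSOpenGLContext` (the `cglContext` and `cglPixelFormat` names here are placeholders for your OpenGL setup):

```objc
// One-time creation of the Core Video OpenGL texture cache, bound to
// your GL context. cglContext/cglPixelFormat come from your own setup,
// e.g. [glContext CGLContextObj] and [pixelFormat CGLPixelFormatObj].
CVOpenGLTextureCacheRef _textureCache;
CVReturn err = CVOpenGLTextureCacheCreate(kCFAllocatorDefault,
                                          NULL,            // cache attributes
                                          cglContext,      // your CGLContextObj
                                          cglPixelFormat,  // your CGLPixelFormatObj
                                          NULL,            // texture attributes
                                          &_textureCache);
if (err != kCVReturnSuccess) {
    NSLog(@"CVOpenGLTextureCacheCreate failed: %d", err);
}
```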

On Each AVCaptureVideoDataOutput Frame

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  const size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
  const size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);

  // Wrap the (already GPU-resident) pixel buffer as an OpenGL texture.
  // No CPU copy is involved.
  CVOpenGLTextureRef texture;
  CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, &texture);
  CVOpenGLTextureCacheFlush(_textureCache, 0); // evict textures from earlier frames

  // Manipulate and draw the texture however you want...
  const GLenum target = CVOpenGLTextureGetTarget(texture); // typically GL_TEXTURE_RECTANGLE_ARB
  const GLuint name = CVOpenGLTextureGetName(texture);

  // ...

  glEnable(target);
  glBindTexture(target, name);

  CVOpenGLTextureRelease(texture);
}

Cleanup

[_session stopRunning];
CVOpenGLTextureCacheRelease(_textureCache);

The big difference between this and other implementations that get the AVCaptureVideoDataOutput image into OpenGL as a texture is that those typically use CVPixelBufferLockBaseAddress, CVPixelBufferGetBaseAddress, glTexImage2D, and CVPixelBufferUnlockBaseAddress. The problem with that approach is that it's terribly redundant and slow: CVPixelBufferLockBaseAddress makes sure the memory it's about to hand you is not GPU memory, so it copies it all into general-purpose CPU memory. This is bad, because we'd then just be copying it straight back to the GPU with glTexImage2D.
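For contrast, the slower CPU round-trip warned about above looks roughly like this (a sketch only: the pixel buffer is assumed to be BGRA, and `tex` is assumed to be a texture name you created earlier):

```objc
// The slow path: forces a GPU -> CPU copy on lock, then copies the
// pixels right back to the GPU with glTexImage2D. Avoid this.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex); // 'tex' assumed pre-created
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA,
             (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
             (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
             0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, base);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```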

So, we can take advantage of the fact that the CVPixelBuffer is already in GPU memory with CVOpenGLTextureCacheCreateTextureFromImage.

I hope this helps someone else... the CVOpenGLTextureCache suite is poorly documented, and its iOS counterpart, CVOpenGLESTextureCache, is only slightly better.

60fps at 20% CPU capturing the 2560x1600 desktop!

Skyler answered Sep 29 '22