
I'm looking to aggregate live representations of all windows. Much like Mission Control (Exposé), I want extremely fast access to the image buffer of any given NSWindow or screen. Ideally, I want to composite these live images in my own OpenGL context so I can manipulate them (scale and move the window captures around).

Things that are too slow:

  • CGDisplayCreateImage
  • CGWindowListCreateImage
  • CGDisplayIDToOpenGLDisplayMask & CGLCreateContext & CGBitmapContextCreate

Any other ideas? I'm trying to achieve 60 fps capture/composite/output but the best I can get with any of these methods is ~5 fps (on a retina display capturing the entire screen).
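For reference, the kind of straightforward capture loop that tops out around 5 fps for me looks roughly like this (a sketch; the CoreGraphics calls are the real APIs listed above, the upload step is elided):

```objc
// One frame of the CGWindowListCreateImage path that proved too slow.
// Grabbing the whole screen forces a GPU-to-CPU readback every frame.
CGImageRef frame = CGWindowListCreateImage(CGRectInfinite,
                                           kCGWindowListOptionOnScreenOnly,
                                           kCGNullWindowID,
                                           kCGWindowImageDefault);
// ...upload `frame` into a GL texture, composite, present...
CGImageRelease(frame);
```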


1 Answer


Unfortunately I haven't found a way to quickly capture the framebuffer of an individual window, but I've come up with the next best thing. This is a fast way to capture a live view of the entire screen into OpenGL:

AVFoundation setup

_session = [[AVCaptureSession alloc] init];
_session.sessionPreset = AVCaptureSessionPresetPhoto;
AVCaptureScreenInput *input = [[AVCaptureScreenInput alloc] initWithDisplayID:kCGDirectMainDisplay];
input.minFrameDuration = CMTimeMake(1, 60);
[_session addInput:input];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setAlwaysDiscardsLateVideoFrames:YES];
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:output];
[_session startRunning];
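One thing the setup above leaves out is the creation of the _textureCache used in the delegate below. A minimal sketch of how it can be created, assuming an already-configured NSOpenGLContext held in an ivar I'm calling _glContext (that name is my assumption, not part of the original answer):

```objc
// Create the CVOpenGLTextureCache used by the capture delegate.
// _glContext is an assumed, pre-existing NSOpenGLContext ivar.
CGLContextObj cglContext = [_glContext CGLContextObj];
CGLPixelFormatObj cglPixelFormat = [[_glContext pixelFormat] CGLPixelFormatObj];
CVReturn err = CVOpenGLTextureCacheCreate(kCFAllocatorDefault,
                                          NULL,            // cache attributes
                                          cglContext,
                                          cglPixelFormat,
                                          NULL,            // texture attributes
                                          &_textureCache);
NSAssert(err == kCVReturnSuccess, @"Could not create the texture cache");
```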

On every AVCaptureVideoDataOutput frame

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  const size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
  const size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);

  CVOpenGLTextureRef texture;
  CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, &texture);
  CVOpenGLTextureCacheFlush(_textureCache, 0);

  // Manipulate and draw the texture however you want...
  const GLenum target = CVOpenGLTextureGetTarget(texture);
  const GLuint name = CVOpenGLTextureGetName(texture);

  // ...

  glEnable(target);
  glBindTexture(target, name);

  CVOpenGLTextureRelease(texture);
}
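The target returned by CVOpenGLTextureGetTarget is typically GL_TEXTURE_RECTANGLE_ARB, whose texture coordinates are in pixels rather than 0–1. A sketch of what the "draw the texture however you want" step might look like with fixed-function GL (this goes where that comment sits, before the texture is released; it is my illustration, not part of the original answer):

```objc
// Draw the captured frame as a textured quad. Rectangle textures use
// pixel coordinates, so ask Core Video for the clean-aperture corners.
GLfloat lowerLeft[2], lowerRight[2], upperRight[2], upperLeft[2];
CVOpenGLTextureGetCleanTexCoords(texture, lowerLeft, lowerRight,
                                 upperRight, upperLeft);
glBegin(GL_QUADS);
glTexCoord2fv(lowerLeft);  glVertex2f(-1.0f, -1.0f);
glTexCoord2fv(lowerRight); glVertex2f( 1.0f, -1.0f);
glTexCoord2fv(upperRight); glVertex2f( 1.0f,  1.0f);
glTexCoord2fv(upperLeft);  glVertex2f(-1.0f,  1.0f);
glEnd();
```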

Cleanup

[_session stopRunning];
CVOpenGLTextureCacheRelease(_textureCache);

The biggest difference between this and some other implementations that feed AVCaptureVideoDataOutput images into OpenGL as textures is that those may use CVPixelBufferLockBaseAddress, CVPixelBufferGetBaseAddress, glTexImage2D, and CVPixelBufferUnlockBaseAddress. The problem with that approach is that it is typically horribly redundant and slow. CVPixelBufferLockBaseAddress will make sure the memory it's about to hand you is not GPU memory, and will copy it all into general-purpose CPU memory. That's bad! After all, we'd just be uploading it straight back to the GPU with glTexImage2D.
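For contrast, the redundant CPU round-trip described above looks like this (a sketch of the pattern being argued against; _slowTexture is an assumed GLuint texture name):

```objc
// The slow path: locking the pixel buffer forces a copy into CPU
// memory, and then those same bytes are re-uploaded to the GPU.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, _slowTexture);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA,
             (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
             (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
             0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, base);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```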

So, instead, we can use the CVPixelBuffer where it already lives by taking advantage of CVOpenGLTextureCacheCreateTextureFromImage.

I hope this helps someone else… the CVOpenGLTextureCache suite is hardly documented, and its iOS counterpart, CVOpenGLESTextureCache, is only slightly better documented.

Capturing the 2560x1600 desktop at 60 fps with 20% CPU!

Answered 2014-03-28T22:57:18.667