Unfortunately, I haven't found a way to quickly capture the framebuffer of a single window, but I came up with the next best thing. Here's a way to quickly capture a live view of the entire screen into OpenGL:
AVFoundation setup
_session = [[AVCaptureSession alloc] init];
_session.sessionPreset = AVCaptureSessionPresetPhoto;

// Use the main display itself as the capture input, at up to 60fps.
AVCaptureScreenInput *input = [[AVCaptureScreenInput alloc] initWithDisplayID:kCGDirectMainDisplay];
input.minFrameDuration = CMTimeMake(1, 60);
[_session addInput:input];

// Deliver raw frames to our sample buffer delegate, dropping frames that arrive late.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
[output setAlwaysDiscardsLateVideoFrames:YES];
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:output];

[_session startRunning];
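One piece not shown above: the _textureCache used in the frame callback below has to be created against your OpenGL context before frames start arriving. A minimal sketch, assuming this code lives in an NSOpenGLView subclass (adapt the context and pixel format accessors to wherever you keep your CGL context):

CGLContextObj cglContext = [[self openGLContext] CGLContextObj];
CGLPixelFormatObj cglPixelFormat = [[self pixelFormat] CGLPixelFormatObj];
CVReturn err = CVOpenGLTextureCacheCreate(kCFAllocatorDefault,
                                          NULL,            // cache attributes
                                          cglContext,
                                          cglPixelFormat,
                                          NULL,            // texture attributes
                                          &_textureCache);
if (err != kCVReturnSuccess) {
    NSLog(@"CVOpenGLTextureCacheCreate failed: %d", err);
}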
On every AVCaptureVideoDataOutput frame
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
// Grab the frame's pixel buffer and wrap it in an OpenGL texture via the
// texture cache, with no CPU-side copy.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
const size_t bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
const size_t bufferHeight = CVPixelBufferGetHeight(pixelBuffer);
CVOpenGLTextureRef texture;
CVOpenGLTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, &texture);
CVOpenGLTextureCacheFlush(_textureCache, 0);
// Manipulate and draw the texture however you want...
const GLenum target = CVOpenGLTextureGetTarget(texture);
const GLuint name = CVOpenGLTextureGetName(texture);
// ...
glEnable(target);
glBindTexture(target, name);
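// A minimal sketch of one way to draw the frame right here, between the bind
// above and the release below; this is an assumption, not part of the original
// code, and it requires a legacy fixed-function GL context that has been made
// current on this queue. CVOpenGLTextureGetCleanTexCoords fills in texture
// coordinates that already account for the target (usually
// GL_TEXTURE_RECTANGLE_ARB, with pixel-space coordinates) and any vertical flip.
GLfloat lowerLeft[2], lowerRight[2], upperRight[2], upperLeft[2];
CVOpenGLTextureGetCleanTexCoords(texture, lowerLeft, lowerRight, upperRight, upperLeft);
glBegin(GL_QUADS);
glTexCoord2fv(lowerLeft);  glVertex2f(-1.0f, -1.0f);
glTexCoord2fv(lowerRight); glVertex2f( 1.0f, -1.0f);
glTexCoord2fv(upperRight); glVertex2f( 1.0f,  1.0f);
glTexCoord2fv(upperLeft);  glVertex2f(-1.0f,  1.0f);
glEnd();
glBindTexture(target, 0);
glDisable(target);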
CVOpenGLTextureRelease(texture);
}
Cleanup
[_session stopRunning];
CVOpenGLTextureCacheRelease(_textureCache);
The biggest difference between this and some other implementations of getting AVCaptureVideoDataOutput images into OpenGL as textures is that they tend to use CVPixelBufferLockBaseAddress, CVPixelBufferGetBaseAddress, glTexImage2D, and CVPixelBufferUnlockBaseAddress. The problem with that approach is that it is usually very redundant and slow. CVPixelBufferLockBaseAddress makes sure the memory it is about to hand you is not GPU memory, copying all of it into general-purpose CPU memory. That's bad! After all, we were just going to hand it straight back to the GPU with glTexImage2D.
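For contrast, a rough sketch of that copy-based path inside the same delegate callback (the _slowTexture name is a hypothetical texture object created elsewhere, not something from the code above):

CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, _slowTexture); // hypothetical texture created elsewhere
glPixelStorei(GL_UNPACK_ROW_LENGTH, (GLint)(bytesPerRow / 4)); // assumes 32-bit BGRA pixels
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGBA,
             (GLsizei)width, (GLsizei)height, 0,
             GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, baseAddress);
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

Every frame this locks the buffer, pulls it into CPU memory, and re-uploads it, which is exactly the round trip we want to avoid.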
So instead, we leave the CVPixelBuffer where it is and hand it straight to CVOpenGLTextureCacheCreateTextureFromImage.
I hope this helps someone else… the CVOpenGLTextureCache family isn't documented particularly well, while its iOS counterpart, CVOpenGLESTextureCache, is documented slightly better.
Capturing a 2560x1600 desktop at 60fps with 20% CPU!