
In the GLCameraRipple example, the AVCaptureVideoDataOutput is set up so that a callback (captureOutput:didOutputSampleBuffer:fromConnection:) is invoked whenever a new frame arrives from the iPhone camera.

However, if I put a sleep(1) at the beginning of the drawInRect: method (the one used for the OpenGL drawing), this callback gets called only once per second instead of 30 times per second.

Can anyone tell me why the frame rate of the iPhone camera is linked to the frame rate of the OpenGL draw call?


Update: Steps to reproduce


1 Answer


When AVCaptureVideoDataOutput invokes the delegate method captureOutput:didOutputSampleBuffer:fromConnection: so that the programmer can edit or record the images coming from the camera, that method is called on the main thread. In general, code that interacts with the user interface should run on the main thread, which is why OpenGL pairs well with AVCaptureVideoDataOutput: both the camera-frame method and the screen-drawing method run on the main thread.
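A minimal sketch of that setup, along the lines of what GLCameraRipple does (the `session` variable and the `self` delegate are assumed to exist in the surrounding capture-setup code): because the delegate queue passed here is the main queue, the frame callback and the OpenGL draw call share one thread, so blocking one stalls the other.

```objc
// Sketch: wire the video output so frames are delivered on the main thread.
// Assumes an already-configured AVCaptureSession named `session`, and that
// `self` adopts AVCaptureVideoDataOutputSampleBufferDelegate.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.alwaysDiscardsLateVideoFrames = YES;   // drop frames we can't keep up with
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_get_main_queue()];
if ([session canAddOutput:videoOutput]) {
    [session addOutput:videoOutput];
}
```

Dispatching the delegate callbacks to a background serial queue instead of dispatch_get_main_queue() would decouple frame delivery from the drawing loop, at the cost of having to hand the processed frames back to the main thread for display.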

If the iPhone cannot keep up with the processing done in captureOutput:didOutputSampleBuffer:fromConnection:, the AVCaptureVideoDataOutput class can discard frames: if handling one frame takes longer than 1/30 second, the next frame is dropped. You can collect information about those dropped frames with the captureOutput:didDropSampleBuffer:fromConnection: method.
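A hedged sketch of that delegate method (the log message is illustrative; the method itself is part of AVCaptureVideoDataOutputSampleBufferDelegate):

```objc
// Sketch: observe frames that AVCaptureVideoDataOutput discarded because the
// previous frame was still being processed (e.g. while drawInRect: slept).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"Dropped a camera frame: the main thread was still busy");
}
```

In the sleep(1) experiment from the question, you would see roughly 29 of every 30 frames arrive through this method instead of the normal one.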

Answered 2013-07-01T10:25:49.073