On Android, it is possible to make the camera write its output directly to an OpenGL texture (of type GL_TEXTURE_EXTERNAL_OES), avoiding buffers on the CPU altogether.
Is such a thing possible on iOS?
The output you get from the camera on iOS is a `CMSampleBufferRef`, which has a `CVPixelBufferRef` inside it (see the documentation here). As of iOS 5, the CoreVideo framework provides `CVOpenGLESTextureCache`, which lets you create an OpenGL ES texture directly from a `CVPixelBufferRef`, avoiding any copies.
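As a rough sketch of how that fits together, assuming a capture session configured to deliver `kCVPixelFormatType_32BGRA` frames and an existing `EAGLContext` named `context` (both are assumptions, not shown here):

```swift
import CoreMedia
import CoreVideo
import OpenGLES

// Create the texture cache once, tied to your EAGLContext.
var textureCache: CVOpenGLESTextureCache?
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, context, nil, &textureCache)

// Called for each frame, e.g. from the AVCaptureVideoDataOutput delegate.
func texture(from sampleBuffer: CMSampleBuffer) -> CVOpenGLESTexture? {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
          let cache = textureCache else { return nil }

    let width  = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    var texture: CVOpenGLESTexture?
    // Wraps the camera's pixel buffer in a GL texture without copying pixels.
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        GLenum(GL_TEXTURE_2D), GL_RGBA,
        GLsizei(width), GLsizei(height),
        GLenum(GL_BGRA), GLenum(GL_UNSIGNED_BYTE),
        0, &texture)
    return texture
}
```

You would then bind the result with `glBindTexture(CVOpenGLESTextureGetTarget(t), CVOpenGLESTextureGetName(t))` before drawing. This only runs on an iOS device with a live GL context, so treat it as an outline rather than drop-in code.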
Check out the RosyWriter sample on Apple's developer site; it demonstrates exactly this.