I am writing an iOS video application that uses OpenGL ES 2.0 for image processing.
My video input and output format is YUV 4:2:0, which is the native pixel format on most devices since the iPhone 3GS. On A5 and later processors I simply create a luminance texture and a chrominance texture and attach them to an offscreen framebuffer. I create my textures like this:
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             _videoTextureCache,
                                             pixelBuffer,
                                             NULL,
                                             GL_TEXTURE_2D,
                                             GL_RED_EXT,
                                             (int)CVPixelBufferGetWidthOfPlane(pixelBuffer, 0),
                                             (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 0),
                                             GL_RED_EXT,
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &lumaTexture);
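For reference, attaching that texture to the offscreen framebuffer is a single glFramebufferTexture2D call; a minimal sketch, assuming the framebuffer object has already been generated and bound:

// Sketch: attach the luma texture from the texture cache as the color
// attachment of the (already bound) offscreen framebuffer.
glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture), CVOpenGLESTextureGetName(lumaTexture));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(lumaTexture), 0);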
I then attach it to my program, for example:
glActiveTexture([self getTextureUnit:textureUnit]);
glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
if (uniform != -1)
{
    glUniform1i(uniforms[uniform], textureUnit);
}
In my shader I can then simply do:
gl_FragColor.r = texture2D(SamplerY, textureRead).r;
to write the luminance value into the buffer and save the resulting video frame to disk.
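The full fragment shader is little more than that single line; a minimal sketch, assuming textureRead is the varying handed over from the vertex shader:

// Sketch of the luma pass-through fragment shader.
varying highp vec2 textureRead;   // texture coordinate from the vertex shader
uniform sampler2D SamplerY;       // luminance plane

void main()
{
    gl_FragColor.r = texture2D(SamplerY, textureRead).r;
}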
Unfortunately, I have run into a problem on the iPhone 4: it does not have an A5 processor, so GL_RED_EXT is not supported.
I then tried to figure out a way to write to a single-channel luma buffer in OpenGL ES, but keep running into problems. I tried simply changing GL_RED_EXT to GL_LUMINANCE, but found out that it isn't possible to render to GL_LUMINANCE.
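Concretely, the variant I tried was the same texture-cache call with GL_LUMINANCE substituted for both format arguments; the texture is created, but it cannot be used as a render target:

// Sketch: the same texture-cache call with GL_LUMINANCE instead of GL_RED_EXT.
// The texture is created fine, but it is not color-renderable.
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             _videoTextureCache,
                                             pixelBuffer,
                                             NULL,
                                             GL_TEXTURE_2D,
                                             GL_LUMINANCE,
                                             (int)CVPixelBufferGetWidthOfPlane(pixelBuffer, 0),
                                             (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 0),
                                             GL_LUMINANCE,
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &lumaTexture);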
I then tried registering a color attachment and a depth attachment as:
GLuint colorRenderbuffer;
glGenRenderbuffers(1, &colorRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, colorRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_RGB8_OES,
                      (int)CVPixelBufferGetWidthOfPlane(renderData.destinationPixelBuffer, 0),
                      (int)CVPixelBufferGetHeightOfPlane(renderData.destinationPixelBuffer, 0));
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                          GL_RENDERBUFFER, colorRenderbuffer);

GLuint depthRenderbuffer;
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16,
                      (int)CVPixelBufferGetWidthOfPlane(renderData.destinationPixelBuffer, 0),
                      (int)CVPixelBufferGetHeightOfPlane(renderData.destinationPixelBuffer, 0));
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                          GL_RENDERBUFFER, depthRenderbuffer);
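A standard completeness check after these attachments would look roughly like this (a sketch; this check is not part of the snippet above):

// Sketch: verify that the color + depth attachments form a complete framebuffer.
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE)
{
    NSLog(@"Framebuffer incomplete: 0x%x", status);
}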
Writing to the depth buffer in my fragment shader:
gl_FragDepth = texture2D(SamplerY, textureRead).r;
And then writing the result to the pixel buffer as:
glReadPixels(0, 0,
             (int)CVPixelBufferGetWidthOfPlane(renderData.destinationPixelBuffer, 0),
             (int)CVPixelBufferGetHeightOfPlane(renderData.destinationPixelBuffer, 0),
             GL_LUMINANCE, GL_UNSIGNED_BYTE,
             CVPixelBufferGetBaseAddressOfPlane(renderData.destinationPixelBuffer, 0));
But again, I read in the spec that OpenGL ES 2.0 does not support writing to the depth buffer from the fragment shader.
So I am left with no obvious way to create a single-channel color attachment, and I am not sure how I could render to an RGB color attachment and copy only one channel into my pixel buffer (a rough sketch of what I mean is below).
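To illustrate that second option, the CPU-side copy would presumably look something like the sketch below (the readback buffer and the row-stride handling are my assumptions for illustration), which seems like a lot of wasted bandwidth for a single channel:

// Rough sketch of the CPU-side copy I would like to avoid: read the RGBA
// color attachment back, then copy the red channel into the Y plane.
// `readbackBuffer` and the row-stride handling are assumptions for illustration.
int width  = (int)CVPixelBufferGetWidthOfPlane(renderData.destinationPixelBuffer, 0);
int height = (int)CVPixelBufferGetHeightOfPlane(renderData.destinationPixelBuffer, 0);

GLubyte *readbackBuffer = (GLubyte *)malloc(width * height * 4);
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, readbackBuffer);

GLubyte *yPlane   = (GLubyte *)CVPixelBufferGetBaseAddressOfPlane(renderData.destinationPixelBuffer, 0);
size_t   rowBytes = CVPixelBufferGetBytesPerRowOfPlane(renderData.destinationPixelBuffer, 0);

for (int row = 0; row < height; row++)
{
    for (int col = 0; col < width; col++)
    {
        // The red channel of the readback holds the luma value.
        yPlane[row * rowBytes + col] = readbackBuffer[(row * width + col) * 4];
    }
}
free(readbackBuffer);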
Sorry for the long post, just wanted to give as much information as possible.
Any ideas?