
I need to write an OpenGL ES 2-D renderer on iOS. It should draw some primitives, such as lines and polygons, into a 2-D image (it will be rendering a vector map). Which way is best for getting an image out of the OpenGL context for that task? I mean, should I render these primitives into a texture and then get the image from it, or what? Also, it would be great if someone could point to an example or tutorial that looks like what I need (2-D GL rendering into an image). Thanks in advance!

1 Answer

If you need to render an OpenGL ES 2-D scene and then extract an image of that scene for use outside of OpenGL ES, you have two main options.

The first is simply to render your scene and then use glReadPixels() to grab the RGBA data for the scene and place it in a byte array, like the following:

GLubyte *rawImagePixels = (GLubyte *)malloc(totalBytesForImage); // totalBytesForImage = width * height * 4 bytes for RGBA
glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
// Do something with the image
free(rawImagePixels);
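
If what you ultimately need is a UIImage (for example, to save out your rendered vector map), something along the following lines can wrap those bytes in a CGImage. This is just a rough sketch of mine, not part of the original answer: it reuses the rawImagePixels and currentFBOSize names from above, runs in place of the "Do something with the image" placeholder (before the free()), copies the bytes so the buffer can be released afterwards, and ignores the fact that OpenGL ES returns rows bottom-to-top, so the result is vertically flipped unless you compensate:

// Sketch: wrap the RGBA bytes read back by glReadPixels() in a UIImage.
CFDataRef pixelData = CFDataCreate(NULL, rawImagePixels,
                                   (int)currentFBOSize.width * (int)currentFBOSize.height * 4); // copies the bytes
CGDataProviderRef provider = CGDataProviderCreateWithCFData(pixelData);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGImageRef cgImage = CGImageCreate((int)currentFBOSize.width, (int)currentFBOSize.height,
                                   8,                              // bits per component
                                   32,                             // bits per pixel
                                   (int)currentFBOSize.width * 4,  // bytes per row
                                   colorSpace,
                                   kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                   provider, NULL, NO, kCGRenderingIntentDefault);
UIImage *image = [UIImage imageWithCGImage:cgImage]; // flipped vertically relative to UIKit coordinates
CGImageRelease(cgImage);
CGColorSpaceRelease(colorSpace);
CGDataProviderRelease(provider);
CFRelease(pixelData);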

The second, and much faster, way of doing this is to render your scene to a texture-backed framebuffer object (FBO), where the texture is provided by iOS 5.0's texture caches. I describe this approach in this answer, although I don't show the code for raw data access there.

You would do something like the following to set up the texture cache and bind the FBO texture:

    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &rawDataTextureCache);
    if (err) 
    {
        NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d");
    }

    // Code originally sourced from http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/

    CFDictionaryRef empty; // empty value for attr value.
    CFMutableDictionaryRef attrs;
    empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                               NULL,
                               NULL,
                               0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                      1,
                                      &kCFTypeDictionaryKeyCallBacks,
                                      &kCFTypeDictionaryValueCallBacks);

    CFDictionarySetValue(attrs,
                         kCVPixelBufferIOSurfacePropertiesKey,
                         empty);

    //CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);

    CVPixelBufferCreate(kCFAllocatorDefault, 
                        (int)imageSize.width, 
                        (int)imageSize.height,
                        kCVPixelFormatType_32BGRA,
                        attrs,
                        &renderTarget);

    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault,
                                                  rawDataTextureCache, renderTarget,
                                                  NULL, // texture attributes
                                                  GL_TEXTURE_2D,
                                                  GL_RGBA, // opengl format
                                                  (int)imageSize.width, 
                                                  (int)imageSize.height,
                                                  GL_BGRA, // native iOS format
                                                  GL_UNSIGNED_BYTE,
                                                  0,
                                                  &renderTexture);
    CFRelease(attrs);
    CFRelease(empty);
    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

You can then just read directly from the bytes that back this texture (in BGRA format, rather than the RGBA returned by glReadPixels()) using something like:

    CVPixelBufferLockBaseAddress(renderTarget, 0);
    _rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
    // Do something with the bytes
    CVPixelBufferUnlockBaseAddress(renderTarget, 0);
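
One caveat I would add (not covered in the snippet above): IOSurface-backed pixel buffers may pad each row, so when walking the pixels it is safer to use CVPixelBufferGetBytesPerRow() rather than assuming width * 4:

    // Sketch: iterate the BGRA rows while respecting any row padding.
    CVPixelBufferLockBaseAddress(renderTarget, 0);
    GLubyte *baseAddress = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(renderTarget); // can be larger than width * 4
    for (size_t row = 0; row < (size_t)imageSize.height; row++)
    {
        GLubyte *rowStart = baseAddress + row * bytesPerRow;
        // Each row holds (size_t)imageSize.width BGRA pixels starting at rowStart
    }
    CVPixelBufferUnlockBaseAddress(renderTarget, 0);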

However, if you just want to reuse your image within OpenGL ES, you only need to render your scene to a texture-backed FBO and then use that texture in your second level of rendering.
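
For that simpler case you don't need the texture-cache machinery at all; a plain texture-attached FBO is enough. Here is a minimal sketch (the offscreenFramebuffer / offscreenTexture names are mine, not from the answer):

    GLuint offscreenFramebuffer, offscreenTexture;
    glGenFramebuffers(1, &offscreenFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFramebuffer);

    glGenTextures(1, &offscreenTexture);
    glBindTexture(GL_TEXTURE_2D, offscreenTexture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)imageSize.width, (int)imageSize.height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL); // allocate storage without uploading data
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, offscreenTexture, 0);

    // ... render the 2-D scene into this FBO ...

    // Second pass: bind the default (or another) framebuffer, bind offscreenTexture
    // as an input texture, and draw with it.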

I show an example of rendering to a texture, and then performing some processing on it, within the CubeExample sample application within my open source GPUImage framework, if you want to see this in action.

Answered 2012-05-04T20:20:53.853