
I downloaded Apple's GLEssentials sample code. I want to perform some experiments with the depth buffer, so I first decided to check the depth buffer bit count.

I added the following code to the -initWithDefaultFBO method in OpenGLRenderer.m:

// code from sample
NSLog(@"%s %s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));

// buffer bits check
GLint depthBits;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
printf("depthBits: %d\n", depthBits);

I got the following output:

 GLEssentials[3630:112826] Apple Software Renderer OpenGL ES 2.0 APPLE-12.4.2 
 depthBits: 24

But in ES2Renderer.m I see the following line:

// using 16-bit depth buffer
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);

Why does this happen? Is it a bug?

PS: I have only tested in the iOS Simulator, because I don't have an iOS device.


1 Answer


The spec says:

An OpenGL ES implementation may vary its allocation of internal component resolution based on any RenderbufferStorage parameter (except target), but the allocation and chosen internal format must not be a function of any other state and cannot be changed once they are established. The actual resolution in bits of each component of the allocated image can be queried with GetRenderbufferParameteriv.

So basically, OpenGL ES may choose a different bit depth from the one you requested.

I suspect that on an actual device, a real 16-bit depth buffer would be used.
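
You can check what the implementation actually allocated by querying the renderbuffer itself with glGetRenderbufferParameteriv, as the spec passage above mentions. A minimal sketch, assuming it is placed right after the glRenderbufferStorage call in ES2Renderer.m and that depthRenderbuffer is the sample's depth renderbuffer handle (the variable name here is illustrative):

// Request a 16-bit depth buffer, as the sample does.
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);

// Ask for the actual per-component resolution of the allocated image.
GLint actualDepthBits = 0;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_DEPTH_SIZE, &actualDepthBits);
printf("requested 16 depth bits, got: %d\n", actualDepthBits);

On the simulator's software renderer I would expect this to print 24, matching your GL_DEPTH_BITS result; on real hardware it may well print 16.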
