I downloaded Apple's GLEssentials sample code. I want to run some experiments with the depth buffer, so first I decided to check the depth buffer's bit depth (GL_DEPTH_BITS).
I added the following code to the -initWithDefaultFBO method in OpenGLRenderer.m:
// code from sample
NSLog(@"%s %s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));
// buffer bits check
GLint depthBits;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
printf("depthBits: %d\n", depthBits);
I got the following output:
GLEssentials[3630:112826] Apple Software Renderer OpenGL ES 2.0 APPLE-12.4.2
depthBits: 24
But in ES2Renderer.m I see this line:
// using 16-bit depth buffer
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
Why does this happen? Is it a bug?
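I also thought about cross-checking the size actually allocated for the depth renderbuffer itself, rather than what glGetIntegerv reports for the context. This is only a sketch; it assumes the depth renderbuffer is still bound to GL_RENDERBUFFER, i.e. it would go right after the glRenderbufferStorage call in ES2Renderer.m:

// Query the depth size of the currently bound renderbuffer
// (assumes the depth renderbuffer is still bound to GL_RENDERBUFFER)
GLint rbDepthSize = 0;
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_DEPTH_SIZE, &rbDepthSize);
printf("renderbuffer depth size: %d\n", rbDepthSize);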
PS: I tested only in the iOS Simulator, because I don't have an iOS device.