
I use the Core Video texture cache for my OpenGL textures. I have an issue with rendering such textures in the case of minification. The GL_TEXTURE_MIN_FILTER parameter has no effect: interpolation for minification is always the same as GL_TEXTURE_MAG_FILTER. The interesting fact is that everything works fine when I create the pixel buffer with the CVPixelBufferCreateWithBytes function. The problem appears when I use CVPixelBufferCreate.

Environment:

  • iOS 7
  • OpenGL ES 2.0
  • iPad mini, iPad 3, iPad 4.

I've developed a simple application which demonstrates this issue: https://github.com/Gubarev/iOS-CVTextureCache. The demo application can render a checkerboard texture (cell size 1x1) in three modes:

  • Regular OpenGL texture (ok).
  • Core Video texture, pixel buffer created with CVPixelBufferCreate (problem).
  • Core Video texture, pixel buffer created with CVPixelBufferCreateWithBytes (ok).
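For context, the problematic path looks roughly like the sketch below. This is not code from the linked repo, just a minimal illustration of creating a pixel buffer with CVPixelBufferCreate and wrapping it in a GL ES texture via the texture cache; `CreateCacheTexture` is a name I made up, and error handling is minimal.

```c
#include <CoreVideo/CoreVideo.h>
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

static CVOpenGLESTextureRef CreateCacheTexture(CVOpenGLESTextureCacheRef cache,
                                               size_t width, size_t height)
{
    /* 1. Allocate an empty, driver-owned pixel buffer (the problematic path). */
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn err = CVPixelBufferCreate(kCFAllocatorDefault,
                                       width, height,
                                       kCVPixelFormatType_32BGRA,
                                       NULL, /* no extra attributes */
                                       &pixelBuffer);
    if (err != kCVReturnSuccess) return NULL;

    /* 2. Map the pixel buffer into a GL ES texture through the cache. */
    CVOpenGLESTextureRef texture = NULL;
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       cache, pixelBuffer, NULL,
                                                       GL_TEXTURE_2D, GL_RGBA,
                                                       (GLsizei)width, (GLsizei)height,
                                                       GL_BGRA, GL_UNSIGNED_BYTE,
                                                       0, &texture);
    CVPixelBufferRelease(pixelBuffer);
    if (err != kCVReturnSuccess) return NULL;

    /* 3. Filters are set on the cache texture's target; on this path the
       GL_TEXTURE_MIN_FILTER setting appears to be ignored during minification. */
    glBindTexture(CVOpenGLESTextureGetTarget(texture),
                  CVOpenGLESTextureGetName(texture));
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    return texture;
}
```

The working path differs only in step 1, where CVPixelBufferCreateWithBytes is used with an app-owned backing buffer instead.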

The texture is rendered two times with slight minification (achieved by using an OpenGL viewport smaller than the texture):

  • Left image rendered with minification filter GL_NEAREST, magnification filter GL_NEAREST.
  • Right image rendered with minification filter GL_LINEAR, magnification filter GL_NEAREST.
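The two renders above can be sketched as follows. This is an illustrative fragment, not code from the repo; `drawQuad` is a hypothetical helper that draws a full-viewport textured quad, and `viewW`/`viewH` are assumed viewport dimensions.

```c
glBindTexture(GL_TEXTURE_2D, textureName);

/* Left image: GL_NEAREST minification, GL_NEAREST magnification. */
glViewport(0, 0, viewW / 2, viewH);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
drawQuad(); /* hypothetical helper */

/* Right image: GL_LINEAR minification, GL_NEAREST magnification.
   Only the minification filter changes between the two draws. */
glViewport(viewW / 2, 0, viewW / 2, viewH);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
drawQuad();
```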

The image below demonstrates proper minification in the case of a regular OpenGL texture. It's clearly visible that the minification filter setting takes effect. The same results are obtained when the "CVPixelBufferCreateWithBytes" approach is used. The problem appears with the "CVPixelBufferCreate" approach: both images are minified with the magnification filter setting (GL_NEAREST in particular).

[Screenshot: checkerboard rendered with a regular OpenGL texture; left with GL_NEAREST, right with GL_LINEAR minification filter]
