I don't believe you can sample directly from a Pixel Buffer Object.
One obvious option is to use a regular texture instead of a Texture Buffer Object. The maximum texture size of ES 3.0 compatible iOS devices is 4096 (source: https://developer.apple.com/library/iOS/documentation/DeviceInformation/Reference/iOSDeviceCompatibility/OpenGLESPlatforms/OpenGLESPlatforms.html). There are a few sub-cases depending on how big your data is. With `n` being the number of texels:
- If `n` is at most 4096, you can store it in a 2D texture of size `n x 1`.
- If `n` is more than 4096, you can store it in a 2D texture of size `4096 x ((n + 4095) / 4096)`.
In both cases, you can still use `texelFetch()` to get the data from the texture. In the first case, you sample `(i, 0)` to get value `i`. In the second case, you sample `(i % 4096, i / 4096)`.
If you already have the data in a buffer, you can store it in the texture by binding the buffer as `GL_PIXEL_UNPACK_BUFFER` before calling `glTexImage2D()`, which will then source the data from the buffer.
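As a rough sketch of that upload path (assumes a current GL context, and a buffer `buf` already filled with `width * height` RGBA32F texels; the variable names are illustrative):

```c
/* Bind the buffer as the pixel unpack source. */
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, buf);
glBindTexture(GL_TEXTURE_2D, tex);

/* With GL_PIXEL_UNPACK_BUFFER bound, the last argument is a byte
   offset into the buffer rather than a client memory pointer. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0,
             GL_RGBA, GL_FLOAT, (const void *)0);

/* Unbind so later glTexImage2D() calls read client memory again. */
glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
```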
Another option to consider is Uniform Buffer Objects. They allow you to bind the content of a buffer to a uniform block, which then gives you access to the values in the shader. Look up `glBindBuffer()`, `glBindBufferBase()`, and `glBindBufferRange()` with the `GL_UNIFORM_BUFFER` target for details. The maximum size of a uniform block in bytes is given by `GL_MAX_UNIFORM_BLOCK_SIZE`, and is 16,384 on iOS/A7 devices.
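A minimal sketch of wiring that up (assumes a linked `program` and a buffer `ubo` that already holds the data; `"Data"` and the array size are illustrative, with 16,384 bytes corresponding to 1024 `vec4`s under `std140`):

```c
/* Shader side (ES 3.0 GLSL):
       layout(std140) uniform Data { vec4 values[1024]; };
*/

/* Associate the shader's uniform block with binding point 0. */
GLuint blockIndex = glGetUniformBlockIndex(program, "Data");
glUniformBlockBinding(program, blockIndex, 0);

/* Attach the buffer to that binding point. */
glBindBufferBase(GL_UNIFORM_BUFFER, 0, ubo);
```

`glBindBufferRange()` works the same way but exposes only a sub-range of the buffer, which is useful if one large buffer holds data for several draws.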