I've injected a DLL into a game process to create an overlay interface, but alpha values are being "cropped" (fragments with low alpha don't render at all).
I've tested several alpha values, and it seems to fail whenever the alpha is below roughly 0.3.
To illustrate what happens, here is the image I'm trying to render:
and here is the game rendering that image:
What exactly is happening here? Is it the current state of OpenGL? I'm new to the API, and I have no idea why this happens.
More information:
The texture is being created from a buffer with:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, this->width, this->height, GL_BGRA_EXT, GL_UNSIGNED_BYTE, this->buffer);
I receive this buffer from Awesomium, and the values are correct; I've checked the alpha values.
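(For reference, since glTexSubImage2D only updates an already-allocated texture, the texture is allocated beforehand, roughly like this. This is a minimal sketch; the GL_RGBA8 storage and linear filtering are assumptions rather than my exact code.)

GLuint texture = 0;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// Allocate storage with an alpha-capable internal format so the
// BGRA upload keeps its alpha channel.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
             GL_BGRA_EXT, GL_UNSIGNED_BYTE, nullptr);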
The rendering is done with this function (I've also tried calling the game's own texture-rendering function, but the same problem happens):
void DrawTextureExt(int texture, float x, float y, float width, float height)
{
    glPushAttrib(GL_ALL_ATTRIB_BITS);
    {
        glPushMatrix();
        {
            glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
            // glBlendFunc(GL_ONE, GL_ONE); // tried this too, but it gives ugly results
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, texture);

            // Rotate around (x, y); the angle is currently 0, so this is a no-op.
            glTranslatef(x, y, 0.0f);
            glRotatef(0.0f, 0.0f, 0.0f, 1.0f);
            glTranslatef(-x, -y, 0.0f);

            glBegin(GL_QUADS);
            {
                glTexCoord2f(0, 0); glVertex2f(x, y);
                glTexCoord2f(0, 1); glVertex2f(x, y + height);
                glTexCoord2f(1, 1); glVertex2f(x + width, y + height);
                glTexCoord2f(1, 0); glVertex2f(x + width, y);
            }
            glEnd();

            glBindTexture(GL_TEXTURE_2D, 0);
        }
        glPopMatrix();
    }
    glPopAttrib();
}
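Could this be per-fragment state left over from the game? For example, if the fixed-function alpha test is still enabled with a reference value around 0.3, any fragment with lower alpha would be discarded before blending, which would match what I'm seeing. A sketch of the check I have in mind (the 0.3 is just my guess):

GLfloat alphaRef = 0.0f;
glGetFloatv(GL_ALPHA_TEST_REF, &alphaRef);
if (glIsEnabled(GL_ALPHA_TEST))
{
    // If alphaRef is ~0.3, this state would explain the cutoff;
    // disabling it before drawing the overlay should rule it out.
    glDisable(GL_ALPHA_TEST);
}

Since DrawTextureExt already wraps everything in glPushAttrib(GL_ALL_ATTRIB_BITS)/glPopAttrib, disabling it there shouldn't leak into the game's own rendering.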