
In my OpenGL application on Ubuntu (12.10), I've set the following GLFW window hints when creating a window:

glfwWindowHint(GLFW_CLIENT_API, GLFW_OPENGL_API);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
glfwWindowHint(GLFW_VISIBLE, GL_TRUE);
glfwWindowHint(GLFW_SAMPLES, 0);
glfwWindowHint(GLFW_RED_BITS, 24);
glfwWindowHint(GLFW_GREEN_BITS, 24);
glfwWindowHint(GLFW_BLUE_BITS, 24);
glfwWindowHint(GLFW_ALPHA_BITS, 8);

// Create the OpenGL window
window = glfwCreateWindow(width, height, windowTitle.c_str(), NULL, NULL);

centerWindow();
glfwMakeContextCurrent(window);

But this results in an OpenGL context being created with an associated window with pixel color bit depth of R-G-B-A = 8-8-8-8. To check this I used the following code after creating my GLFW window:

GLFWmonitor * monitor = glfwGetPrimaryMonitor();
int count;
const GLFWvidmode * mode = glfwGetVideoMode(monitor);

cout << "Current video mode: " <<
        mode->redBits << "-" <<
        mode->greenBits << "-" <<
        mode->blueBits << endl;

cout << "All possible video modes: " << endl;
const GLFWvidmode * modes = glfwGetVideoModes(monitor, &count);
for(int i = 0; i < count; i++) {
    cout << modes[i].redBits << "-" <<
            modes[i].greenBits << "-" <<
            modes[i].blueBits << endl;
}
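
Note that glfwGetVideoMode reports the monitor's current video mode, not the format of the context's default framebuffer. A minimal sketch of querying the actual framebuffer bit depths in a core profile context (assuming the context is current and the GL function pointers are loaded) could look like this:

GLint red = 0, green = 0, blue = 0, alpha = 0;
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_RED_SIZE, &red);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_GREEN_SIZE, &green);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_BLUE_SIZE, &blue);
glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_BACK_LEFT,
        GL_FRAMEBUFFER_ATTACHMENT_ALPHA_SIZE, &alpha);
cout << "Default framebuffer: " << red << "-" << green << "-"
        << blue << "-" << alpha << endl;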

Surprisingly, I get 8-8-8 for my current video mode, and for all possible video modes. I'm sure this cannot be the case, as my monitor (Samsung S23B550) can display mono-colored gradients without any Mach banding issues, which suggests it supports at least 16-24 bits of depth per color channel. I'm also using a modern graphics card (Nvidia GT650M), which should have a framebuffer that supports 24-32 bits per pixel.

The only odd thing to take into account is that my laptop uses Nvidia Optimus technology, which means that if I want to use the dedicated graphics card on Linux, I have to launch my applications through optirun (Bumblebee) to switch from the integrated GPU to the dedicated one, which I do when running my OpenGL applications.


1 Answer


Nvidia says you have to shell out for a Quadro to use 10-bit color in OpenGL.

Not that AMD handles this any better.
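
For reference, explicitly requesting a 10-bit-per-channel (30-bit color) framebuffer from GLFW would look like the sketch below; on consumer GeForce drivers such a request is typically matched down to 8-8-8-8, since the framebuffer hints are closest-match preferences rather than hard constraints:

// Request a 10-10-10-2 framebuffer; the driver may fall back to 8-8-8-8.
glfwWindowHint(GLFW_RED_BITS, 10);
glfwWindowHint(GLFW_GREEN_BITS, 10);
glfwWindowHint(GLFW_BLUE_BITS, 10);
glfwWindowHint(GLFW_ALPHA_BITS, 2);
GLFWwindow* win = glfwCreateWindow(width, height, "10-bit test", NULL, NULL);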

Answered on 2013-10-25T19:24:07.537