I'm using OpenGL to plot an RGB image histogram. Since the image is 8 bits per channel, the dataset contains 256 data points per channel, indexed 0 to 255.
If I plot the histogram without using glScaled() then the graph plots as expected but, of course, does not fill the allocated area (the width of which is variable, the height constant). However, when I use glScaled() to stretch the graph horizontally to fill that area, it shows strange artefacts.
Please see the following images for an example of the problem:
The image above shows the histogram plotted from 256 data points, without glScaled() scaling.
The two images above show the histogram plotted from the same 256 data points AND scaled with glScaled(). The strange artefacts are evident (missing data?). Please note that the third histogram is a slightly different shape because the light levels changed between captures.
Here is the relevant part of my OpenGL initialisation code:
glViewport(0, 0, width, height);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0.0, width, height, 0.0, 0.0, 1.0);   // pixel coordinates, y flipped so the origin is top-left
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
// m_scale_factor = width / 256
// If the following line is removed then the graph plots correctly
glScaled(m_scale_factor, 1.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
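For completeness, the commented line above is shorthand; m_scale_factor is intended to be the floating-point ratio of the plot width to the 256 bins, computed along these lines (a sketch, assuming m_scale_factor is a double member and width is an int, hence the cast to avoid integer truncation):

m_scale_factor = static_cast<double>(width) / 256.0;   // e.g. 512 / 256 -> 2.0; the cast avoids integer division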
And here is the relevant part of my plot code:
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);   // additive blending where the channels overlap
glBegin(GL_LINE_STRIP);
for (int n = 0; n < m_histogram_X; n++)
{
    // For each bin: red count then baseline, green count then baseline,
    // blue count then baseline (GRAPH_HEIGHT is the bottom of the plot)
    glColor4ub(255, 0, 0, 255);
    glVertex2i(n, m_Hist_Channel_R[n]);
    glVertex2i(n, GRAPH_HEIGHT);
    glColor4ub(0, 255, 0, 255);
    glVertex2i(n, m_Hist_Channel_G[n]);
    glVertex2i(n, GRAPH_HEIGHT);
    glColor4ub(0, 0, 255, 255);
    glVertex2i(n, m_Hist_Channel_B[n]);
    glVertex2i(n, GRAPH_HEIGHT);
}
glEnd();
...
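In case it helps, here is a minimal self-contained program that reproduces the problem for me. This is a cut-down sketch rather than my actual code: it uses GLUT, synthetic data in place of a real histogram, plots the red channel only, and assumes a fixed 512-pixel-wide window (so the scale factor is exactly 2.0):

#include <GL/glut.h>

static const int WIDTH = 512;
static const int GRAPH_HEIGHT = 200;
static int hist[256];

static void display(void)
{
    // Same initialisation as above, with the horizontal glScaled() in place
    glViewport(0, 0, WIDTH, GRAPH_HEIGHT);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, WIDTH, GRAPH_HEIGHT, 0.0, 0.0, 1.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glScaled(WIDTH / 256.0, 1.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);

    // Same plot loop as above, reduced to one channel
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);
    glBegin(GL_LINE_STRIP);
    for (int n = 0; n < 256; n++)
    {
        glColor4ub(255, 0, 0, 255);
        glVertex2i(n, hist[n]);           // bin count
        glVertex2i(n, GRAPH_HEIGHT);      // baseline
    }
    glEnd();
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    for (int n = 0; n < 256; n++)         // synthetic sawtooth data, kept within the plot height
        hist[n] = GRAPH_HEIGHT - (n % 64) * 3;
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(WIDTH, GRAPH_HEIGHT);
    glutCreateWindow("histogram repro");
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}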
I should state at this point that I am new to OpenGL, so it's entirely possible that I have misunderstood something fundamental...
My question is: is it possible to fix this problem within OpenGL, or will I have to increase the number of data points by some kind of interpolation and then plot without scaling?
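By interpolation I mean something along these lines: plot one vertical line per pixel column and sample the 256-bin histogram by nearest neighbour, so no glScaled() call is needed at all. A sketch for the red channel only, reusing the variable names from above (note I've also used GL_LINES here rather than GL_LINE_STRIP, so each column is an independent line):

glBegin(GL_LINES);                        // each pair of vertices is a separate vertical line
for (int x = 0; x < width; x++)
{
    int n = (x * 256) / width;            // nearest-neighbour map from pixel column to histogram bin
    glColor4ub(255, 0, 0, 255);
    glVertex2i(x, m_Hist_Channel_R[n]);   // bin count
    glVertex2i(x, GRAPH_HEIGHT);          // baseline
}
glEnd();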
I appreciate any help offered.