This seems like it should be a straightforward problem (or as close to one as you can get). When I pass GL_RGB8, GL_RGBA8, or almost any other multi-channel internal format, the line below produces no error:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, fontImgSize, fontImgSize, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
However, when I pass GL_R8 or any other 8-bit single-channel variant, glGetError returns 0x0501 (GL_INVALID_VALUE):
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, fontImgSize, fontImgSize, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
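For completeness, here's roughly how I'm checking this (a boiled-down sketch, not my exact code; it assumes a valid GL context, a loader like GLEW, and <stdio.h>):

/* Create and bind a texture object, then check glGetError after each upload. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, fontImgSize, fontImgSize, 0,
             GL_RED, GL_UNSIGNED_BYTE, NULL);
printf("after GL_RGB8: 0x%04X\n", glGetError()); /* 0x0000 (GL_NO_ERROR) */

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, fontImgSize, fontImgSize, 0,
             GL_RED, GL_UNSIGNED_BYTE, NULL);
printf("after GL_R8:   0x%04X\n", glGetError()); /* 0x0501 (GL_INVALID_VALUE) */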
Any idea what's going on here? The computer I'm presently using is rather outdated and low-powered, but if that were the problem, I doubt GL_RGB8 or GL_RGBA8 would work either.
For those curious, fontImgSize == 2048, the maximum texture size on my computer.
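(That value comes from the usual limit query; sketch below, variable names illustrative:)

/* Query the implementation's maximum 2D texture dimension. */
GLint maxTexSize = 0;
glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize); /* reports 2048 on this machine */
GLint fontImgSize = maxTexSize;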
Edit: It appears that GL_RG8 and the other two-channel and 16-bit formats also produce 0x0501.
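In case it helps diagnose: GL_R8, GL_RG8, and the 16-bit R/RG variants were introduced by GL_ARB_texture_rg (core in OpenGL 3.0), so a quick dump like the sketch below (assumes GLEW is initialized) should show whether the context advertises them at all:

/* Print the context version/renderer and whether the RG-formats extension is present. */
printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
printf("GL_ARB_texture_rg: %s\n",
       glewIsSupported("GL_ARB_texture_rg") ? "yes" : "no");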