I only have so many variables exposed to my shaders. One of those is meant for color, but I'm bastardizing it and using it in the shader for something else. However, I need more. Right now I'm using the x and y values as index values to look up a subtexture on one big texture, which the shader offsets to. The z value of the color is how many subtextures fit across the width of the big main texture. Because they are index values they only really need to be ints; even a single byte (256 values) would be fine.
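To show what I mean, here's a rough sketch of the lookup my shader currently does (the names are mine and the real shader code differs, but the arithmetic is the same):

```cpp
// Sketch of the current shader-side lookup. x/y carry the subtexture
// index and z carries how many subtextures fit across the atlas width.
struct UV { float u, v; };

UV subtextureOffset(float indexX, float indexY, float perRow)
{
    // Each subtexture occupies 1/perRow of the atlas in each direction,
    // so the offset into the big texture is just index / perRow.
    return { indexX / perRow, indexY / perRow };
}
```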
Now I want to be able to provide a scale value and an offset within the subtexture, to give me full control. Since I'll always need the z value of the color and need it to stay a float, that variable is out. That leaves me with the x, y, and a (alpha) float values. Is there any way I could encode byte values inside these so that when I decode them in the shader, x actually stores 2 values, y stores 2 values, and alpha stores 2 values?
I hope that made sense. I want to be able to encode values into a floating-point variable in my game code, then decode them in the shader and use them there. Is this possible, and if so, what kind of value restrictions would I be looking at, and how would you do it?
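Something like this is what I'm imagining, if it would work (just a sketch with made-up names, assuming each packed value fits in 0..255): pack two bytes into one float as high * 256 + low on the CPU side, then recover them with floor/mod on the other end. The decode arithmetic here is plain C++, but the same floor()/fmod() calls exist in HLSL/GLSL.

```cpp
#include <cmath>
#include <cstdint>
#include <cassert>

// Pack two bytes (0..255) into one float. A 32-bit float has a 24-bit
// mantissa, so integers up to 65535 (two bytes) are represented exactly.
float packTwoBytes(uint8_t high, uint8_t low)
{
    return static_cast<float>(high) * 256.0f + static_cast<float>(low);
}

// Decode; the same floor/mod arithmetic would run in the shader.
void unpackTwoBytes(float packed, float& high, float& low)
{
    high = std::floor(packed / 256.0f);
    low  = packed - high * 256.0f;   // equivalent to fmod(packed, 256.0f)
}

int main()
{
    float encoded = packTwoBytes(3, 17);   // e.g. a scale index and an offset
    float a, b;
    unpackTwoBytes(encoded, a, b);
    assert(a == 3.0f && b == 17.0f);
}
```

Is that roughly the right approach, or are there precision problems once the value goes through the vertex color channel?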