I need to convert a (potentially very long) string like char * s = "2f0a3f"
into the actual bytes it represents, when decoded from the hex representation. Currently I'm doing this, but it feels clunky and wrong.
size_t hexlength = strlen(s);
size_t binlength = hexlength / 2;
unsigned char * buffer = malloc(binlength);
size_t i;
char a, b;
for (i = 0; i < hexlength; i += 2) {
    a = s[i + 0]; b = s[i + 1];
    /* <= '9', not < '9': the digit '9' itself must take the decimal branch */
    buffer[i / 2] =
        ((a <= '9' ? a - '0' : a - 'a' + 10) << 4) + (b <= '9' ? b - '0' : b - 'a' + 10);
}
Two things strike me as ugly about this:
- The way I'm dividing by two each time I push into the buffer
- The conditional logic to figure out the decimal value of the hex digits
Is there a better way? Preferably without pulling in a dependency, since I want to ship this code with minimal cross-platform issues. My bitwise math is awful ;)
NOTE: The data has been pre-validated to all be lowercase and to be a correct string of hex pairs.