If it uses 0x00 to 0x7F for the "special" characters, how does it encode the regular ASCII characters?
In most of the charsets that support the character Á, its codepoint is 193 (0xC1). If you subtract 128 from that, you get 65 (0x41). Maybe your "codepage" is just the upper half of one of the standard charsets like ISO-8859-1 or windows-1252, with the high-order bit set to zero instead of one (that is, with 128 subtracted from each codepoint).
If that's the case, I would expect to find a flag you can set to tell it whether the next bunch of codepoints should be converted using the "upper" or "lower" encoding. I don't know of any system that uses that scheme, but it's the most sensible explanation I can come up with for the situation you describe.
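If such a flag exists, the decoding logic might look something like the sketch below. This is purely hypothetical: the 0x0E/0x0F shift bytes and the windows-1252 upper half are assumptions I'm making for illustration, since I have no idea what your system actually uses.

    # Hypothetical shift-state decoder: a control byte switches between
    # interpreting bytes as plain ASCII ("lower") or as the upper half of
    # windows-1252 with the high bit stripped ("upper").
    SHIFT_UPPER = 0x0E   # assumed escape byte -- not from any real spec
    SHIFT_LOWER = 0x0F   # assumed escape byte

    def decode_with_shift_states(data: bytes) -> str:
        out = []
        upper = False                 # start in plain-ASCII mode
        for b in data:
            if b == SHIFT_UPPER:
                upper = True
            elif b == SHIFT_LOWER:
                upper = False
            elif upper:
                out.append(bytes([b | 0x80]).decode("windows-1252"))
            else:
                out.append(chr(b))
        return "".join(out)

    # 'A', shift to upper (0x41 now means 0xC1, i.e. 'Á'), shift back, 'A'
    print(decode_with_shift_states(bytes([0x41, 0x0E, 0x41, 0x0F, 0x41])))  # AÁA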