I have a special problem. I am using Xamarin.iOS and trying to call a function from a DLL. The original function is defined as
LUALIB_API int luaL_loadbuffer (lua_State *L,
                                const char *buff,
                                size_t size,
                                const char *name);
and imported in C# as
[CLSCompliant (false)]
[DllImport(LIBNAME, CallingConvention = CallingConvention.Cdecl)]
public static extern int luaL_loadbuffer(IntPtr luaState,
                                         string buff,
                                         uint size,
                                         string name);
As you can see, the library is Lua 5.1 and the C# side is KeraLua, a wrapper for it. You can call this function with a Lua source string in buff, such as "a=2", which is then compiled to Lua bytecode. But you can also call it with a binary chunk (a precompiled chunk) in buff; in that case buff is a byte array.
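For the text case the call works as expected. A minimal sketch (luaState is assumed to be a valid lua_State pointer obtained elsewhere, e.g. from luaL_newstate; the chunk name is arbitrary):

// Text chunk: the string-to-UTF-8 marshaling is harmless for plain ASCII source,
// so the byte length equals the character count here
string code = "a=2";
int status = luaL_loadbuffer(luaState, code, (uint)code.Length, "text-chunk");
// status == 0 means the chunk was loaded successfully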
Now my problem: strings work perfectly, but binary chunks get corrupted. I have to convert the byte array into a string, which in C# is encoded as UTF-16. When I then call the function, this "string" is converted from UTF-16 to UTF-8 during marshaling. That is no problem when buff contains text, but with binary data the result is wrong.
An example of the binary case: an integer in the binary data that appears in the byte array as D6 21 00 00, converted to a string with
StringBuilder s = new StringBuilder();
foreach (byte b in byteArray)
    s.Append((char)b);        // each byte becomes one UTF-16 char
string buff = s.ToString();
results in a string whose characters are D6 21 00 00. When I then call the function, the library receives C3 96 21 00 00, which is obviously wrong. This happens because the UTF-16 character 00D6 is encoded in UTF-8 as C3 96, and that conversion takes place while the call is marshaled.
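The same expansion can be reproduced without the native call by pushing the byte-per-char string through Encoding.UTF8, which is effectively what happens during marshaling. A self-contained sketch (byteArray stands in for the fragment of the precompiled chunk shown above):

using System;
using System.Text;

byte[] byteArray = { 0xD6, 0x21, 0x00, 0x00 };   // fragment of the precompiled chunk

StringBuilder sb = new StringBuilder();
foreach (byte b in byteArray)
    sb.Append((char)b);                           // each byte becomes one UTF-16 char

byte[] utf8Bytes = Encoding.UTF8.GetBytes(sb.ToString());
Console.WriteLine(BitConverter.ToString(utf8Bytes));   // prints C3-96-21-00-00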
Coming to the point: how should I define/call the function so that this conversion does not happen? Using an ANSI encoding or Encoding.ASCII didn't work.
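For what it's worth, the direction I imagine might avoid the problem is a second overload that passes the chunk as raw bytes, so no string is involved at all, roughly like this (I don't know whether this is the right approach):

[CLSCompliant (false)]
[DllImport(LIBNAME, CallingConvention = CallingConvention.Cdecl)]
public static extern int luaL_loadbuffer(IntPtr luaState,
                                         byte[] buff,   // raw bytes, no encoding applied
                                         uint size,
                                         string name);

// called with the precompiled chunk directly:
int status = luaL_loadbuffer(luaState, byteArray, (uint)byteArray.Length, "binary-chunk");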