Let's say I have a function where two of the arguments are pointers into a global array. The code looks something like this:

/*global variable*/
int    Input_Messages          [Num_Messages ][Num_Elements];

static void Set_Conditions()
{
     PACK(250.0, (unsigned short*)&Input_Messages[0][15], (unsigned short*)&Input_Messages[0][16], 18, 19, -1, 1.0);
}

void PACK(float value, unsigned short *Hi, unsigned short *Lo, long bit, long word, long lsb, float scale)
{
     unsigned long Value32;
     .
     .
     .
     /*last bit of code where *Hi and *Lo are set*/
     *Lo = (unsigned short)(Value32);
     *Hi = ((unsigned short)(((Value32) >> 16) & 0xFFFF));
}

The problem I am facing is as follows:

After PACK() completes, the values written through *Hi and *Lo should end up in Input_Messages[0][15] and Input_Messages[0][16] respectively. The issue I am having on a big-endian machine is that once all this is done, if you look at those Input_Messages values, each 16-bit result has ended up in the high half of the 32-bit int element, whereas I need it stored as a short. For example, if at the end of PACK() *Lo is set to 62 (0x003E), then back in Set_Conditions(), Input_Messages[0][16] holds 4063232 (0x003E0000) instead of 62.
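
To illustrate, here is a stripped-down sketch of the same cast (not my real code, just a minimal example of the behaviour I am describing):

#include <stdio.h>

int main(void)
{
     int slot = 0;

     /* Same kind of cast as in the PACK() call: aim an unsigned short*
        at the start of a 4-byte int and write 16 bits through it. */
     *(unsigned short *)&slot = 62;           /* 0x003E */

     /* On my big-endian machine the short lands in the two high bytes,
        so the int reads back as 0x003E0000 = 4063232, not 62. */
     printf("%d (0x%08X)\n", slot, (unsigned)slot);

     return 0;
}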

I managed to fix this by changing the global Input_Messages[][] from int to unsigned short, but doing that forces a lot of other changes in function calls, etc. I am trying to see whether there is an easier way, some sort of cast or similar, so that the value is written as a short and stays a short.
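
One idea I have been considering (just a rough sketch, not something I have committed to) is to keep Input_Messages as int, have PACK() write into local unsigned short temporaries, and then assign those back to the array elements so the normal int conversion handles the storage:

static void Set_Conditions(void)
{
     unsigned short hi = 0, lo = 0;

     /* PACK() still writes 16-bit values, but into locals. */
     PACK(250.0, &hi, &lo, 18, 19, -1, 1.0);

     /* Plain assignments, so each value is stored as an ordinary int
        regardless of endianness. */
     Input_Messages[0][15] = hi;
     Input_Messages[0][16] = lo;
}

That avoids the pointer casts entirely, but it means touching every call site, which is what I was hoping to avoid.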

Let me know if more explanation is needed. I tried to explain it as best I could and to post the relevant code, but I will answer any questions I can.
