I have some code that combines two hex values with bitwise operations and stores the result in a new unsigned char. The code looks like the following:
unsigned char OldSw = 0x1D;
unsigned char NewSw = 0xF0;
unsigned char ChangedSw;
ChangedSw = (OldSw ^ ~NewSw) & ~OldSw;
So what I know is:
0x1D = 0001 1101
0xF0 = 1111 0000
I'm confused about what the ChangedSw line is doing. I know it will give the output 0x02, but I can't figure out how it's doing it.
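In case it helps, here is a minimal sketch I put together to print each intermediate value (the names notNew, xorStep, and notOld are just ones I made up for illustration; the & 0xFF masks are there because ~ promotes the operands to int). It does print 0x02 at the end, but I still don't follow what each step is supposed to mean:

#include <stdio.h>

int main(void) {
    unsigned char OldSw = 0x1D;   /* 0001 1101 */
    unsigned char NewSw = 0xF0;   /* 1111 0000 */

    /* intermediate steps of (OldSw ^ ~NewSw) & ~OldSw, masked back to one byte */
    unsigned char notNew  = ~NewSw & 0xFF;            /* 0000 1111 = 0x0F */
    unsigned char xorStep = (OldSw ^ notNew) & 0xFF;  /* 0001 0010 = 0x12 */
    unsigned char notOld  = ~OldSw & 0xFF;            /* 1110 0010 = 0xE2 */
    unsigned char ChangedSw = xorStep & notOld;       /* 0000 0010 = 0x02 */

    printf("~NewSw         = 0x%02X\n", notNew);
    printf("OldSw ^ ~NewSw = 0x%02X\n", xorStep);
    printf("~OldSw         = 0x%02X\n", notOld);
    printf("ChangedSw      = 0x%02X\n", ChangedSw);
    return 0;
}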