While reading K&R, I was confused by this code:
#include "syscalls.h"
int getchar(void)
{
    char c;
    return (read(0, &c, 1) == 1) ? (unsigned char) c : EOF;
}
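For context, getchar is meant to be called in a loop that compares its result against EOF, which is why it returns int rather than char. This is essentially the copy program from earlier in K&R:

#include <stdio.h>

/* Classic K&R copy loop: c is declared int so it can hold every
   possible byte value (0..255) plus the out-of-band value EOF. */
int main(void)
{
    int c;
    while ((c = getchar()) != EOF)
        putchar(c);
    return 0;
}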
Supposedly the unsigned char is there to avoid bugs caused by sign extension. That is the only case I can think of, and here is my example code:
#include <stdio.h>
int main(void) {
    char c = 0xf0;  // 11110000, just set the highest bit to 1
    printf("%i\n", (int)(unsigned char)c);
    printf("%i\n", (int)c);
}
Output: 240   // 0...011110000
        -16   // 1...111110000
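Pushing the experiment one step further, the sign-extended value can even collide with EOF itself. A minimal sketch, assuming a platform where plain char is signed (char signedness is implementation-defined) and EOF is -1 (its usual value):

#include <stdio.h>

int main(void)
{
    char c = 0xff;  /* e.g. a 0xFF byte read from a binary file;
                       storing 0xff in a signed char is implementation-defined */
    int without_cast = (int)c;               /* sign-extends to -1 here */
    int with_cast = (int)(unsigned char)c;   /* zero-extends to 255 */

    printf("%d %d\n", without_cast, with_cast);  /* prints: -1 255 */
    printf("%d\n", without_cast == EOF);         /* prints: 1 -- looks like end of file! */
    return 0;
}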
But in fact ASCII only covers 0~127, so the highest bit of an ASCII character is never 1. Why, then, does K&R cast the char to unsigned char?