I've noticed that PHP and JavaScript both have some trouble with octal and hexadecimal numbers:
PHP:
echo 16 == '0x10' ? 'true' : 'false'; //true, as expected
echo 8 == '010' ? 'true' : 'false'; //false, o_O
echo (int)'0x10'; //0, o_O
echo intval('0x10'); //0, o_O
echo (int)'010'; //10, o_O
echo intval('010'); //10, o_O
JavaScript:
console.log(16 == '0x10' ? 'true' : 'false'); //true, as expected
console.log(8 == '010' ? 'true' : 'false'); //false, o_O
console.log(parseInt('0x10')); //16, as expected
console.log(parseInt('010')); //8, as expected
console.log(Number('0x10')); //16, as expected
console.log(Number('010')); //10, o_O
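To illustrate what is going on in the JavaScript comparisons above: `==` coerces the string with `Number()`, not `parseInt()`, and `Number()` treats a leading zero as plain decimal, while `parseInt()` can be given an explicit radix. A minimal sketch:

```javascript
// == coerces the string via Number(), which ignores a leading 0,
// but does recognize the 0x prefix as hexadecimal.
console.log(Number('010'));       // 10 - decimal, leading zero ignored
console.log(Number('0x10'));      // 16 - 0x prefix recognized as hex
console.log(parseInt('010', 8));  // 8  - explicit octal radix
console.log(parseInt('010', 10)); // 10 - explicit decimal radix
console.log(8 == '010');          // false, because Number('010') is 10
console.log(16 == '0x10');        // true, because Number('0x10') is 16
```

This is why `8 == '010'` is false even in engines where a bare `parseInt('010')` would have returned 8.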
I know PHP has the octdec() and hexdec() functions to work around this octal/hex behavior, but I wish intval() handled octal and hexadecimal numbers the way JavaScript's parseInt() does.
Anyway, what is the rationale behind this odd behavior?