
As the title states, lldb reports the value of UInt.max to be a UInt of -1, which seems highly illogical. Considering that let uint: UInt = -1 doesn't even compile, how is this even possible? I don't see any way to have a negative value of UInt at runtime because the initializer will crash if given a negative value. I want to know the actual maximum value of UInt.
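
For example, this compiles but crashes at runtime (a minimal sketch):

let n = -1       // inferred as Int
let u = UInt(n)  // traps at runtime because n is negative
print(u)         // never reached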


2 Answers


The Int value of -1 and the UInt value UInt.max have the same bit representation in memory.

You can see that if you do:

let i = Int(bitPattern: UInt.max)  // i == -1

and in the opposite direction:

if UInt(bitPattern: Int(-1)) == UInt.max {
    print("same")
}

Output:

same

The debugger is incorrectly displaying UInt.max as a signed Int: the two values share the same bit pattern in memory (0xffffffffffffffff on a 64-bit system such as the iPhone 6, 0xffffffff on a 32-bit system such as the iPhone 5), and the debugger apparently chooses to show that pattern as an Int.
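
You can also inspect the bit patterns directly; a quick sketch (output shown for a 64-bit platform):

print(String(UInt.max, radix: 16))              // "ffffffffffffffff"
print(String(UInt(bitPattern: -1), radix: 16))  // "ffffffffffffffff" as well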

You can see the same issue if you do:

print(String(format: "%d", UInt.max))  // prints "-1"

It doesn't mean UInt.max is -1, just that both have the same representation in memory.
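
Conversely, an unsigned format specifier prints the unsigned interpretation of those same bits; a sketch assuming a 64-bit platform, where Swift's UInt corresponds to C's unsigned long:

print(String(format: "%lu", UInt.max))  // prints "18446744073709551615"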


To see the maximum value of UInt, do the following in an app or on a Swift Playground:

print(UInt.max)

This will print 18446744073709551615 on a 64-bit system (such as a Macintosh or iPhone 6) and 4294967295 on a 32-bit system (such as an iPhone 5).
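
That value is 2 raised to the bit width, minus 1, i.e. every bit set. A quick check (a sketch; UInt.bitWidth assumes a reasonably recent Swift toolchain):

print(UInt.bitWidth)         // 64 on a 64-bit system, 32 on a 32-bit one
print(UInt.max == ~UInt(0))  // true: complementing zero sets every bit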

In lldb:

(lldb) p String(UInt.max)
(String) $R0 = "18446744073709551615"
(lldb) 
answered 2016-05-22T03:47:06.427

This sounds like an instance of the same bit pattern being interpreted under two different schemes: two's complement (signed) and unsigned representation.

In the unsigned world, a binary number is just a binary number: the more bits that are 1, the bigger the number, since no bit needs to be spent encoding a sign. To represent both positive and negative values in the same number of bits, the two's-complement scheme encodes non-negative values as usual, provided the most significant bit is 0. If the most significant bit is 1, the bits are reinterpreted as a negative number, as described at https://en.wikipedia.org/wiki/Two%27s_complement.

As shown on Wikipedia, the two's-complement representation of -1 has all bits set to 1, which is exactly the bit pattern of the maximal unsigned value.
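
A small Swift sketch makes this concrete (Int8 and UInt8 are used only to keep the bit strings short):

let x: Int8 = -1
let bits = UInt8(bitPattern: x)  // reinterpret the same byte as unsigned
print(String(bits, radix: 2))    // "11111111": every bit is set
print(bits == UInt8.max)         // true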

answered 2016-05-22T01:49:18.317