According to all ISO C standards, all sizes are measured in multiples of the size of a `char`. This means that, by definition, `sizeof(char) == 1`. The size of a `char` in bits is defined by the macro `CHAR_BIT` in `<limits.h>` (or `<climits>` in C++). The minimum size of a `char` is 8 bits.
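A minimal sketch to illustrate this; the first line always prints 1, while the second depends on the platform (8 on virtually every modern system):

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* sizeof(char) is 1 by definition; CHAR_BIT gives its width in bits. */
    printf("sizeof(char) = %zu\n", sizeof(char)); /* always 1 */
    printf("CHAR_BIT     = %d\n", CHAR_BIT);      /* at least 8 */
    return 0;
}
```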
Additional type restrictions are:

- `sizeof(char) <= sizeof(short int) <= sizeof(int) <= sizeof(long int)`
- `int` must be able to represent -32767 to +32767, i.e. it must be at least 16 bits wide.
- C99 added `long long int`, whose size is at least that of `long int`.
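These guarantees can be written down as compile-time checks. A sketch using C11's `_Static_assert` (the assertions hold on any conforming implementation):

```c
#include <limits.h>

/* The ordering guarantees from the list above. */
_Static_assert(sizeof(char) <= sizeof(short int), "char <= short");
_Static_assert(sizeof(short int) <= sizeof(int), "short <= int");
_Static_assert(sizeof(int) <= sizeof(long int), "int <= long");
_Static_assert(sizeof(long int) <= sizeof(long long int), "long <= long long");

/* int must cover at least -32767 to +32767. */
_Static_assert(INT_MIN <= -32767 && INT_MAX >= 32767,
               "int must have at least 16 bits");

int main(void) { return 0; }
```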
The rest is implementation-defined. This means that the C implementation in question gets to choose exactly how large these types are.
How do C compilers choose these sizes?
There are some common conventions most compilers follow. `long` is often chosen to be as large as a machine word. On 64-bit machines where `CHAR_BIT == 8` (this is nearly always the case, so I'll assume it for the rest of this answer) this means `sizeof(long) == 8`; on 32-bit machines it means `sizeof(long) == 4`.
`int` is almost always 32 bits wide. `long long int` is often 64 bits wide.
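To see what your own compiler chose, a quick check like the following works (the values in the comment are typical, not guaranteed):

```c
#include <stdio.h>

int main(void)
{
    /* Prints what this particular implementation chose. A typical 64-bit
       Linux/macOS compiler prints 4, 8, 8; 64-bit Windows prints 4, 4, 8. */
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long)      = %zu\n", sizeof(long));
    printf("sizeof(long long) = %zu\n", sizeof(long long));
    return 0;
}
```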