For some reason, programming languages seem to have a hard time with numbers. This seems extraordinarily odd, given that so much of programming is about numbers, but there we are. Consider C's integral types: the various widths of signed and unsigned integer that it supports…
Because C was designed in a more primitive time, it was important to have an efficient representation for integers on a wide variety of machines, including wacky machines with 9-bit bytes and ones'-complement arithmetic. Thus, C's long and unsigned long, int and unsigned int, and short and unsigned short are pretty much constrained only by the requirements that the larger representation sizes be at least as large as the smaller ones (plus some minimum ranges: short and int must cover at least 16 bits' worth of values, long at least 32), and that the signed and unsigned versions of the same type have the same rep size; char and unsigned char are required to be one byte. For complicated historical reasons, it long ago became pretty standard for int and long to be the same rep size, typically 32 bits, annoyingly leading to the need for a long long rep size in some cases.
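To make that looseness concrete, here's a minimal sketch (my example, not part of the original argument) that prints the sizes these types actually get on whatever machine compiles it. Only the relative ordering is guaranteed; the concrete numbers are implementation-defined.

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* Only the ordering of these sizes is guaranteed by the standard;
           the concrete values vary from platform to platform. */
        printf("char:      %zu byte(s), CHAR_BIT = %d\n", sizeof(char), CHAR_BIT);
        printf("short:     %zu byte(s)\n", sizeof(short));
        printf("int:       %zu byte(s)\n", sizeof(int));
        printf("long:      %zu byte(s)\n", sizeof(long));
        printf("long long: %zu byte(s)\n", sizeof(long long));
        return 0;
    }

On a typical 64-bit Unix (LP64) this prints 1, 2, 4, 8, 8; 64-bit Windows (LLP64) keeps long at 4 bytes, which is exactly the sort of variation that bites you.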
More recently, <inttypes.h> became standardized. If you're not using it for integral types any time you care about rep size, you are doing it wrong in 2014. The rest of this post will use the types defined there.
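As a quick sketch of what that looks like in practice (again, my example rather than the post's), the fixed-width types come with PRI* format macros so they can be printed portably:

    #include <inttypes.h>
    #include <stdio.h>

    int main(void)
    {
        int32_t  small = -123456789;              /* exactly 32 bits, signed   */
        uint64_t big   = 12345678901234567890ULL; /* exactly 64 bits, unsigned */

        /* The PRI* macros expand to the correct printf length modifiers
           for the fixed-width types on the current platform. */
        printf("small = %" PRId32 "\n", small);
        printf("big   = %" PRIu64 "\n", big);
        return 0;
    }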