Why does the base of a literal affect its type?

0

The decimal number 4294967295 is equal to hexadecimal 0xFFFFFFFF, so I would expect a literal to have the same type regardless of what base it is expressed in, yet

std::is_same<decltype(0xFFFFFFFF), decltype(4294967295)>::value; // evaluates to false

It appears that on my compiler decltype(0xFFFFFFFF) is unsigned int, while decltype(4294967295) is signed long.

c++
integer
literals
asked on Stack Overflow Nov 11, 2018 by Chris_F • edited Nov 12, 2018 by Shafik Yaghmour

1 Answer

5

Hex literals and decimal literals have their types determined differently, per [lex.icon] Table 7:

The type of an integer literal is the first of the corresponding list in Table 7 in which its value can be represented.

When there is no suffix, for a decimal literal the types listed are, in order:

int
long int
long long int

For a hexadecimal literal the list, in order, is:

int
unsigned int
long int
unsigned long int
long long int
unsigned long long int

Why does this difference exist? Since C has the same rule, we can look at the C99 rationale document, which says:

Unlike decimal constants, octal and hexadecimal constants too large to be ints are typed as unsigned int if within range of that type, since it is more likely that they represent bit patterns or masks, which are generally best treated as unsigned, rather than “real” numbers.

answered on Stack Overflow Nov 11, 2018 by Shafik Yaghmour • edited Nov 12, 2018 by Shafik Yaghmour

User contributions licensed under CC BY-SA 3.0