When Dennis Ritchie designed the C programming language, he suggested that a short int would normally occupy 2 bytes and a long int 4 bytes, and that no matter what the hardware, a long should always be at least as long as a short. That makes sense. However, he was less precise about the plain int: he simply stated that it should reflect the 'natural' word size of the hardware, so it might match a short on one machine and a long on another.

Am I 'out of date' and 'out of touch'? Machines and compilers have grown since Ritchie's day. When I check sizeof(short), sizeof(int) and sizeof(long) on my machine I get 2, 2 and 4. Yet my machine is a 64-bit one. Is that really its 'natural' size? Shouldn't my int be 8 bytes? Should C and C++ compilers be redefining shorts, ints and longs?
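For anyone who wants to reproduce the check, something like this minimal program should do. It assumes a C99-capable compiler, since %zu is the C99 printf format for size_t; on older compilers one would cast to unsigned long and print with %lu instead:

    #include <stdio.h>

    int main(void)
    {
        /* sizeof yields a size_t; %zu is the matching format (C99). */
        printf("short:     %zu bytes\n", sizeof(short));
        printf("int:       %zu bytes\n", sizeof(int));
        printf("long:      %zu bytes\n", sizeof(long));
        printf("long long: %zu bytes\n", sizeof(long long));
        printf("void*:     %zu bytes\n", sizeof(void *));
        return 0;
    }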