> JimJoyce wrote:
> > When Dennis Ritchie invented the C programming language, he suggested
> > that a short int would normally occupy 2 bytes and a long int would take
> > 4 bytes, and that no matter what the hardware, a long should always be
> > longer than a short. That makes sense.
> > However, he was less precise about the simple int. He simply stated that
> > it should reflect the 'natural' size of the hardware. So it might be
> > like a short on one machine, while like a long on another.

http://pubs.opengroup.org/onlinepubs/009695399/basedefs/stdint.h.html

I think that covers all the issues.