I'm confused about the parsing of the "L" suffix, which should denote a long
integer literal. I'm running GCC 4.5.3 on a 32-bit machine, i.e. long long
is 64 bits and long is 32 bits. Am I misunderstanding the precedence of "L",
or how the literal is parsed and converted?
====================================================================
Sample code:
#include <stdio.h>
int main(void)
{
    long long x = 10;
    long long y = (0xffffffffL);
    long long z = (long)(0xffffffffL);

    printf("long long x == %lld\n", x);
    printf("long long y == %lld\n", y);
    printf("long long z == %lld\n", z);
    printf("0xffffffffL == %ld\n", 0xffffffffL);

    if (x > (long)(0xffffffffL))
        printf("x > (long)(0xffffffffL)\n");
    else
        printf("x <= (long)(0xffffffffL)\n");

    if (x > (0xffffffffL))
        printf("x > (0xffffffffL)\n");
    else
        printf("x <= (0xffffffffL)\n");

    return 0;
}
====================================================================
Output:
long long x == 10
long long y == 4294967295
long long z == -1
0xffffffffL == -1
x > (long)(0xffffffffL)
x <= (0xffffffffL)
====================================================================
System version (checked on both Cygwin and Debian Linux):
$ gcc --version; uname -a
gcc (GCC) 4.5.3
Copyright (C) 2010 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
CYGWIN_NT-6.1 FOOBAR 1.7.17(0.262/5/3) 2012-10-19 14:39 i686 Cygwin