Help working out where GCC decides the size of 0xFFFFFFFF

Hi,

I am running GCC 5 on an SH-2 (7058) simulator that I wrote myself.
The simulator works well enough to (mostly) run GCC; however, I have
encountered what is probably a bug in my simulator that GCC triggers,
and I am looking for help working out where in GCC the triggering
code is.

Running GCC as cross compiler targeting my platform seems to produce
the correct results.

The bug is that when I run GCC on my target, it reports
sizeof(0xFFFFFFFF) as 8 bytes, whereas GCC run as a cross compiler
targeting my simulator correctly reports 4 bytes. sizeof(0x7FFFFFFF)
is reported correctly as 4 bytes in both cases.
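
For concreteness, a minimal program along these lines can be used to
check the sizes (just a sketch, not my exact test case):

#include <stdio.h>

int main(void)
{
    /* 0x7FFFFFFF fits in int, so it should be 4 bytes on SH-2.      */
    /* 0xFFFFFFFF does not fit in int but does fit in unsigned int,  */
    /* so it should also be 4 bytes; a result of 8 would mean GCC    */
    /* promoted it to (unsigned) long long.                          */
    printf("sizeof(0x7FFFFFFF) = %u\n", (unsigned)sizeof(0x7FFFFFFF));
    printf("sizeof(0xFFFFFFFF) = %u\n", (unsigned)sizeof(0xFFFFFFFF));
    return 0;
}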

I am looking for help on narrowing down where GCC makes the decision
on the size of the 0xFFFFFFFF literal so I can investigate further.

So any pointers as to where to start looking for the code where GCC
determines the type (and therefore the size) of an integer literal
would help me greatly.

Thanks in advance!

Alex



