On Wed, Oct 9, 2013 at 3:24 PM, Jonathan Wakely <jwakely.gcc@xxxxxxxxx> wrote:
> On 9 October 2013 10:36, vijay nag wrote:
>> Hello GCC,
>>
>> I'm facing a weird compiler optimization problem. Consider the code
>> snippet below
>>
>> #include <stdio.h>
>>
>> int printChar(unsigned long cur_col, unsigned char c)
>> {
>>   char buf[256];
>>   char* bufp = buf;
>>   char cnt = sizeof(buf) - 2; /* overflow in implicit type conversion */
>>   unsigned long terminal_width = 500;
>>
>>   while ((cur_col++ < terminal_width) && cnt) {
>>     *bufp++ = c;
>>     cnt--;
>>   }
>
>> Basically the crash here is because of the elimination of the "&& cnt"
>> check in the loop condition, which causes a stack overrun and thereby
>> SIGSEGV. While standards may say that the behaviour is undefined when
>> an unsigned value is stored in a signed value,
>
> Standards do not say that. 254 cannot be represented in a char if char
> is a signed type, so it's an overflow, which is undefined behaviour.
> Storing an unsigned value that doesn't overflow is OK.
>
>> can a language lawyer explain to me why GCC chose to eliminate the
>> code pertaining to cnt, treating it as dead code?
>
> cnt is initialized to -2 (after an overflow) and then you decrement it,
> so it gets more negative. The "&& cnt" condition will never be false,
> because cnt starts non-zero and gets further from zero, so it will
> never reach zero.

Alright, that is perfectly valid behaviour. Why does the compiler consider
it to be an unsigned type at optimization level zero? I.e., I see a
wrap-around from -128 to 127.
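
To illustrate the values being discussed, here is a minimal sketch. It
assumes plain char is a signed, 8-bit, two's-complement type, as on typical
GCC targets; the out-of-range conversion back to char is implementation-
defined, so the exact results are a property of the implementation, not a
guarantee of the standard:

  #include <stdio.h>

  int main(void)
  {
    char cnt = 254;        /* 254 does not fit in a signed 8-bit char:
                              stored as -2 on the assumed target        */
    printf("%d\n", cnt);   /* prints -2                                 */

    char c = -128;
    c--;                   /* promoted to int, -128 - 1 = -129, then
                              converted back to char: wraps to 127 here */
    printf("%d\n", c);     /* prints 127: the wrap-around asked about   */

    return 0;
  }

The wrap-around seen at -O0 comes from this implementation-defined
conversion back to char (GCC documents it as reduction modulo 2^N), rather
than from cnt actually being given an unsigned type.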
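
And a hedged sketch of one possible fix: give cnt a type wide enough to
hold sizeof(buf) - 2 (i.e. 254), so no overflowing conversion occurs and
the "&& cnt" guard keeps its intended meaning. Only the part of the
function quoted in the mail above is reproduced; the rest is marked as
elided, and the return value is a placeholder:

  #include <stdio.h>

  int printChar(unsigned long cur_col, unsigned char c)
  {
    char buf[256];
    char* bufp = buf;
    size_t cnt = sizeof(buf) - 2;   /* 254 fits; no overflowing conversion */
    unsigned long terminal_width = 500;

    while ((cur_col++ < terminal_width) && cnt) {
      *bufp++ = c;
      cnt--;                        /* counts down to 0, so "&& cnt" can
                                       actually terminate the loop        */
    }
    /* ... rest of the original function not quoted in the mail ... */
    return 0;                       /* placeholder; original return value
                                       not shown                          */
  }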