On 11/09/2015 02:16 PM, Florian Weimer wrote:
> On 11/09/2015 03:08 PM, Andrew Haley wrote:
>> On 11/09/2015 12:09 PM, Florian Weimer wrote:
>>> On 11/09/2015 11:11 AM, Andrew Haley wrote:
>>>> On 08/11/15 19:34, Segher Boessenkool wrote:
>>>>> The compiler is free to transform it to
>>>>>
>>>>> int foo(int x) {
>>>>>     int t = x*x*x;
>>>>>     if (x > 1290) {
>>>>>         printf("X is wrong here %d, but we don't care\n", x);
>>>>>     }
>>>>>     return t;
>>>>> }
>>>>>
>>>>> because x*x*x does not have any observable behaviour, and then it is
>>>>> obvious it _can_ remove the printf and conditional.
>>>
>>> I'm not sure if this is a valid transformation for printf, even if it
>>> targets stdout and does not use any custom format specifiers.  Isn't it
>>> a cancellation point?  But let's assume it's not.
>>>
>>>> Yes, that is correct.  And, indeed, the hardware is free to do that
>>>> too.  With speculative execution, the "as if" rule is not limited to
>>>> the compiler.
>>>
>>> Can we disallow that optimization as a quality-of-implementation matter?
>>> What would be the benefit of such optimizations, other than
>>> discouraging programmers from using C or C++?
>>
>> There isn't really any way to distinguish between wanted optimizations
>> and unwanted ones.
>
> Of course there is—you define the semantics you want, and then any
> optimization which breaks them is a bug.
>
>> If GCC determines that a statement is unreachable it can be deleted,
>> and this depends on its knowledge of UB.  Like this:
>>
>> void foo(int b) {
>>     if (b > 0) {
>>         int m = b * 3 / 6;
>>         if (m < 0)
>>             die();
>>     }
>> }
>>
>> Deleting such unreachable code happens all the time.  IMO we should
>> not disable this optimization.
>
> This is very different from the printf example.  The call to die is
> unreachable according to the standard semantics.  The original printf
> call is reachable, and according to my interpretation, the
> transformation shown above is invalid because the abstract machine
> performs the side effect from the printf before undefined behavior is
> reached.

Here it is again:

int foo(int x) {
    if (x > 1290) {
        printf("X is wrong here %d, but we don't care\n", x);
    }
    return x*x*x;
}

Here, the printf writes to a stream, and then the UB happens.  But the
stream is buffered, and the UB kills the process before the stream is
flushed.  There is nothing in the C specification to prevent this, and
neither should there be.  I don't think it's even possible.  (A small
sketch of the buffering point is in the postscript below.)

>>> I'm worried that this particular line of argument would also allow the
>>> movement of undefined behavior which occurs after an infinite loop in
>>> front of it, even if this loop performs I/O.
>>
>> Sure.  But it can already do that even if the compiler does not move
>> anything.  The I/O writes to a stream, the UB causes a segfault which
>> kills the process, and the stream never gets written.
>
> Based on my C semantics, the UB is never reached because the loop never
> exits.

a. You have your own C semantics?

b. Which loop?  You need to show it to us.  I don't think that there is
a program which will exhibit such behaviour.

> I just think that C semantics which only deal with terminating programs,
> Turing-machine-style, are not very useful for the programs we generally
> write.

This is an incomprehensible statement.  At least, I don't know what it
means.

Andrew.
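
P.S.  A minimal sketch of the buffering point above, in case it is
useful.  This is only an illustration, not anything GCC itself does:
the raise(SIGSEGV) stands in for a crash caused by undefined behaviour,
and redirecting stdout to a file is what makes the stream fully
buffered.

  #include <signal.h>
  #include <stdio.h>

  int main(void) {
      /* This side effect happens in the abstract machine...  */
      printf("X is wrong here, but we don't care\n");
      /* ...but the process dies before stdio flushes its buffer.  */
      raise(SIGSEGV);
      return 0;
  }

Run it as "./a.out > out.txt" and out.txt stays empty, even though the
printf precedes the crash in program order.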