I have some code using setjmp() that behaves differently from what I
expect, and I would like to know whether this is a bug in GCC or
whether I am somehow abusing setjmp()/longjmp().
Call the example program below jmp.c and compile it with
gcc -Og jmp.c -o jmp
using GCC 4.8.0. Run the program with ./jmp and the output is
Returning 1
x = 0, n = 1
Returning 0
x = 42, n = 1
Aborted
Strangely, g() returns 0 the second time through (after the longjmp()),
yet n, which is assigned that return value, equals 1. With other
optimization levels, or with earlier versions of GCC, the output is what
I expect (the control flow I have in mind is traced after the output below):
Returning 1
x = 0, n = 1
Returning 0
x = 42, n = 0
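
Spelling out that expected control flow (my own trace of the code below,
not compiler output):

/* First pass:  setjmp(env) returns 0  -> x = 0
 *              g(0) prints "Returning 1" and returns 1 -> n = 1
 *              prints "x = 0, n = 1"; n != 0 and x == 0, so longjmp(env, 42)
 * Second pass: setjmp(env) returns 42 -> x = 42
 *              g(42) prints "Returning 0" and returns 0 -> n = 0
 *              prints "x = 42, n = 0"; n == 0, so exit(0)
 */

Under -Og with GCC 4.8.0 the second pass instead keeps n == 1, falls
through to the x check, and calls abort(), which explains the "Aborted"
line above.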
/* jmp.c */
#include <stdio.h>
#include <stdlib.h>
#include <setjmp.h>

static jmp_buf env;  /* plain setjmp()/longjmp() are used, so jmp_buf rather than sigjmp_buf */
static inline int g(int x)
{
    if (x)
    {
        fprintf(stderr, "Returning 0\n");
        return 0;
    }
    else
    {
        fprintf(stderr, "Returning 1\n");
        return 1;
    }
}
int f(int *e)
{
    if (*e) return 1;
    int x = setjmp(env);  /* 0 on the direct call, 42 after the longjmp() */
    int n = g(x);
    fprintf(stderr, "x = %i, n = %i\n", x, n);
    if (n == 0) exit(0);  /* expected exit path on the second pass */
    if (x) abort();       /* reached only if n != 0 after the longjmp() */
    longjmp(env, 42);
}
int main(int argc, char** argv)
{
    int v = 0;
    return f(&v);
}
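
For comparison, here is a variant of f() with its locals
volatile-qualified. This is only a sketch under the assumption that the
anomaly comes from the compiler caching non-volatile locals across the
longjmp(): C11 7.13.2.1 leaves non-volatile automatic variables that are
modified between setjmp() and longjmp() indeterminate after the jump, so
if this variant prints the expected output under -Og, that would point
at value caching rather than a broken g(). The name f_volatile is mine,
not part of the program above.

/* Sketch: same logic as f(), but with volatile locals, which C11
 * 7.13.2.1 requires for values to be reliably preserved across a
 * longjmp(). Note that n is freshly reassigned after setjmp() returns
 * the second time, which is why I suspect a compiler bug here at all. */
int f_volatile(int *e)
{
    if (*e) return 1;
    volatile int x = setjmp(env);
    volatile int n = g(x);
    fprintf(stderr, "x = %i, n = %i\n", x, n);
    if (n == 0) exit(0);
    if (x) abort();
    longjmp(env, 42);
}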