Under gcc2, the following program:
#include <stdio.h>
#define A 1
#define A_Size 4
#define B 2
#define B_Size 8
#define FOO(num,bar) FOO##num##bar;\
printf("\n");
#define FOOX(num,bar) FOO##num##bar;
#define FOO1(bar1) printf("%d", bar1##_Size)
#define FOO2(bar1, bar2) printf("%d %d", bar1##_Size, bar2##_Size)
int main () {
FOO1(A)
FOO(1,(A))
FOOX(1,(A))
FOO2(A,B)
FOO(2,(A,B))
FOOX(2,(A,B))
return 0;
}
is preprocessed into:
int main () {
printf("%d", 4 )
printf("%d", 4 ) ; printf("\n");
printf("%d", 4 ) ;
printf("%d %d", 4 , 8 )
printf("%d %d", 4 , 8 ) ; printf("\n");
printf("%d %d", 4 , 8 ) ;
return 0;
}
I know that #define FOO(num,bar) FOO##num##bar;\ is incorrect: pasting
FOO##num against the leading ( of bar does not form a valid preprocessing
token. It should be FOO##num bar; under gcc3, FOO##num and bar are intended
to remain separate tokens.
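In other words, the changed definitions are (the only difference from the
originals is dropping the second ##):
#define FOO(num,bar) FOO##num bar;\
printf("\n");
#define FOOX(num,bar) FOO##num bar;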
The problem is that if I make that change to FOO and FOOX, the preprocessor
generates:
int main () {
printf("%d", 4)
printf("%d", 1_Size); printf("\n");
printf("%d", 1_Size);
printf("%d %d", 4, 8)
printf("%d %d", 1_Size, 2_Size); printf("\n");
printf("%d %d", 1_Size, 2_Size);
return 0;
}
Which is to say, the A argument gets macro-expanded at the FOO and FOOX
level and never makes it down to the FOO1 and FOO2 level unexpanded.
Apparently, the extra token paste that was permitted in gcc2 deferred
expansion of the macro arguments by one level: an argument that is an
operand of ## is not macro-expanded before substitution, so (A) was passed
through untouched.
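Here is a minimal fragment showing that rule when run through cpp by itself
(INNER, BY_PASTE and BY_CALL are names made up just for this illustration):
#define A 1
#define A_Size 4
#define INNER(x) x##_Size
#define BY_PASTE(x) x##_Size   /* x is a ## operand: A is not pre-expanded */
#define BY_CALL(x)  INNER(x)   /* x is not a # or ## operand: A expands to 1 first */
BY_PASTE(A)   /* expands to A_Size, then to 4 */
BY_CALL(A)    /* expands to INNER(1), then to 1_Size */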
So my question is: how can I get this to work? By this I mean having a
series of macros FOO1, FOO2, FOO3, etc., and then having another series of
macros that call that series and possibly do other things.
My alternative (which I'm hoping to avoid) is to write FOO1, FOOX1, FOO2,
FOOX2, etc.
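For concreteness, a sketch of what that duplication would look like (this
is hypothetical, just spelling out the alternative above):
/* Each FOOn gets a hand-written FOOXn twin.  The bodies have to be
   repeated, because FOOX1 cannot simply call FOO1(bar1): bar1 would then
   be macro-expanded before FOO1 ever saw it. */
#define FOO1(bar1)        printf("%d", bar1##_Size)
#define FOOX1(bar1)       printf("%d", bar1##_Size);
#define FOO2(bar1, bar2)  printf("%d %d", bar1##_Size, bar2##_Size)
#define FOOX2(bar1, bar2) printf("%d %d", bar1##_Size, bar2##_Size);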
Is there any way to force gcc3 to behave like gcc2?
Is there any way to tell the preprocessor to defer expansion of macro
arguments until a specific level, or just until the next level?
Thanks. Please CC my email directly in addition to the ML.