token.h contains this:

> /* Combination tokens */
> #define COMBINATION_STRINGS { \
> 	"+=", "++", \
> 	"-=", "--", "->", \
> 	"*=", \
> 	"/=", \
> 	"%=", \
> 	"<=", ">=", \
> 	"==", "!=", \
> 	"&&", "&=", \
> 	"||", "|=", \
> 	"^=", "##", \
> 	"<<", ">>", "..", \
> 	"<<=", ">>=", "...", \
> 	"", \
> 	"<", ">", "<=", ">=" \
> }
>
> extern unsigned char combinations[][3];

tokenize.c contains this:

> const char *show_special(int val)
> {
> 	static const char *combinations[] = COMBINATION_STRINGS;
> 	static char buffer[4];
>
> 	buffer[0] = val;
> 	buffer[1] = 0;
> 	if (val >= SPECIAL_BASE)
> 		strcpy(buffer, combinations[val - SPECIAL_BASE]);
> 	return buffer;
> }

[...]

> unsigned char combinations[][3] = COMBINATION_STRINGS;

Apart from triggering a -Wshadow warning, this seems somewhat wasteful. The reason appears to relate to the presence or absence of a '\0' terminator at the end of each item. Does that matter? Could show_special change somehow to avoid the duplication? Alternatively, could the global version just include '\0' terminators?

- Josh Triplett
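
One possible shape for the second alternative is sketched below. This is a rough sketch only, not the actual sparse code; it assumes SPECIAL_BASE and the COMBINATION_STRINGS macro quoted above from token.h. The idea is to widen the shared table to four bytes per entry so that every item, including the three-character tokens "<<=", ">>=" and "...", keeps its '\0' terminator, and to let show_special() hand back a pointer into that table instead of keeping its own duplicate string array and calling strcpy().

/* token.h (sketch): four bytes per entry leaves room for the '\0'
 * terminator even on the longest, three-character combinations. */
extern unsigned char combinations[][4];

/* tokenize.c (sketch) */
unsigned char combinations[][4] = COMBINATION_STRINGS;

const char *show_special(int val)
{
	static char buffer[2];

	/* Combination tokens: every entry in the shared table is now
	 * NUL-terminated, so return it directly - no local copy needed. */
	if (val >= SPECIAL_BASE)
		return (const char *)combinations[val - SPECIAL_BASE];

	/* Single-character tokens still go through a tiny static buffer;
	 * two bytes suffice once the strcpy() is gone. */
	buffer[0] = val;
	buffer[1] = 0;
	return buffer;
}

If something along these lines works, it trades one extra byte per table entry for dropping the per-call strcpy(), the duplicated static pointer array (and its backing string literals) inside show_special(), and the -Wshadow warning.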