Dan McGee <dpmcgee@xxxxxxxxx> writes:

> diff --git a/color.h b/color.h
> index c0528cf..a7da793 100644
> --- a/color.h
> +++ b/color.h
> @@ -53,6 +53,10 @@ struct strbuf;
>   */
>  extern int git_use_color_default;
>
> +extern const char *column_colors_ansi[13];
> +
> +/* Ignore the RESET at the end when giving the size */
> +#define COLUMN_COLORS_ANSI_MAX (ARRAY_SIZE(column_colors_ansi) - 1)

Sneaky.  I first went "Huh? -- this array-size macro cannot work",
expecting that the array is not declared with a fixed size in the
header.

It may make sense to unify these two palettes whose slot assignment does
not have any meaning, but the above change goes against the spirit of
the ARRAY_SIZE() macro, the point of which is to liberate programmers
from having to count and adjust the size when adding entries to the
array.

Wouldn't it make more sense to do something like this, to avoid
mistakes?

>>> in the header <<<

	extern const char *custom_colors_ansi[];
	extern const int CUSTOM_COLORS_ANSI_MAX;

>>> in the code <<<

	const char *custom_colors_ansi[] = {
		... as before ...
	};

	/* Does not count the last element "RESET" */
	const int CUSTOM_COLORS_ANSI_MAX = ARRAY_SIZE(custom_colors_ansi) - 1;

--
To unsubscribe from this list: send the line "unsubscribe git" in
the body of a message to majordomo@xxxxxxxxxxxxxxx
More majordomo info at  http://vger.kernel.org/majordomo-info.html