On 31. 10. 22, 18:57, Tejun Heo wrote:
> On Mon, Oct 31, 2022 at 05:24:28AM -0700, Christoph Hellwig wrote:
>> On Mon, Oct 31, 2022 at 12:45:20PM +0100, Jiri Slaby (SUSE) wrote:
>>> Cast the enum members to int when printing them.
>>>
>>> Alternatively, we can cast them to ulong (to silence gcc < 12) and use %lu.
>>> Alternatively, we can move VTIME_PER_SEC away from the enum.
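A minimal standalone sketch of the situation being patched -- the identifiers mirror the blk-iocost enum, but the values and the printf call are illustrative, not the kernel code:

#include <stdio.h>

enum {
	VTIME_PER_SEC_SHIFT = 37,
	VTIME_PER_SEC  = 1ULL << VTIME_PER_SEC_SHIFT,	/* exceeds INT_MAX */
	VTIME_PER_USEC = VTIME_PER_SEC / 1000000,	/* fits in int */
};

int main(void)
{
	/* gcc <= 12 gives VTIME_PER_USEC type unsigned long on a 64-bit
	 * target (the oversized sibling widens the whole enum); gcc 13
	 * gives it plain int, so an uncast printf trips -Wformat on one
	 * version or the other. The cast pins the type down on both. */
	printf("vtime per usec: %d\n", (int)VTIME_PER_USEC);
	return 0;
}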
>> Yes, either split the enum or just use a define. But casts are a big
>> code smell and should be avoided if there is a reasonable alternative.
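Both routes look roughly like this (a hypothetical layout, not an actual patch; VTIME_PER_SEC_ALT is just a placeholder name so the two variants can coexist in one sketch):

/* (a) splitting: isolate the oversized member, so only its own enum's
 * typing varies across compiler versions */
enum { VTIME_PER_SEC_SHIFT = 37 };
enum { VTIME_PER_SEC = 1ULL << VTIME_PER_SEC_SHIFT };
enum { VTIME_PER_USEC = VTIME_PER_SEC / 1000000 };	/* fits in int */

/* (b) a define: the constant is not an enumerator at all, and its type
 * is fixed by the ULL suffix on every compiler */
#define VTIME_PER_SEC_ALT	(1ULL << VTIME_PER_SEC_SHIFT)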
> enums are so much better for debugging and other instrumentation stuff. The
> only requirement for the enum types is that they're big enough to express
> all the members, and we can use whichever printf format letter matches
> the type in use. The problem here is that the compiler behavior is different
> depending on the compiler version, which kinda sucks.
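The version skew is easy to observe directly with a standalone probe; the expected outputs below are for a 64-bit target, as I understand the behavior discussed in this thread:

#include <stdio.h>

enum {
	BIG   = 1ULL << 37,	/* does not fit in int */
	SMALL = 42,		/* does */
};

int main(void)
{
	/* gcc 12 prints "8 8": the oversized member widens every
	 * enumerator to unsigned long. gcc 13 prints "8 4": SMALL is
	 * back to plain int -- which is exactly what invalidates the
	 * printf format letters the kernel code was using. */
	printf("%zu %zu\n", sizeof(BIG), sizeof(SMALL));
	return 0;
}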
Yes. The real problem is that using anything other than an INT_MIN <= x
<= INT_MAX _constant_ in an enum is undefined in ANSI C before C2x (in
particular, 1 << x is undefined too once the result leaves the int range).
The gcc manual additionally documents unsigned int constants as valid on
top of that (so this holds for our -std=gnu* too).
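Concretely (a sketch; the comments paraphrase the constraint in C11 6.7.2.2 and the gcc extension mentioned above):

enum constraint_demo {
	OK_STD   = 0x7fffffff,	/* <= INT_MAX: valid in every C standard */
	OK_GNU   = 0xffffffffu,	/* unsigned int: the documented gcc extension */
	TOO_BIG  = 1ULL << 32,	/* no valid type before C2x */
	SHIFT_UB = 1 << 31,	/* the shift itself overflows signed int */
};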
> I suppose the most reasonable thing to do here is just splitting them into
> separate enum definitions. Does anyone know how this behavior change came to
> be?
C2x, which introduces un/signed long enums. See the bug I linked in the
commit log:
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=36113

The change is also turned on in < C2x modes on purpose and, AIUI, will stay
that way unless it causes too much breakage. So we'd need to sort this out
in the (rather distant) future anyway (when we move up to -std=gnu2x).
> Do we know whether clang is gonna be changed the same way?
In C2x, likely. In < C2x, dunno what the default would be.
thanks,
--
js
suse labs