On Wed, Jan 27, 2021 at 04:28:38PM +0000, David Laight wrote:
> I'd definitely leave the type as a bitmap.

What the hell for?  Micro-optimizations in places where we have much
heavier stuff to be done are bloody pointless.  It's already
overcomplicated.  And the compiler is _not_ going to be able to prove
that we'll only ever have one bit set, so you would be making it harder
to optimize, not to mention harder to reason about.
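
To spell out the distinction (hypothetical names, not the code under
discussion): with an enum there is exactly one value at a time, so a
switch over it is trivial to read and to compile; with a bitmap every
test has to be a mask check, and nothing tells the compiler the bits
are mutually exclusive.

/* Sketch only, illustrative names. */

enum obj_type { OBJ_NONE, OBJ_REG, OBJ_DIR, OBJ_LINK };

static int handle_enum(enum obj_type t)
{
	switch (t) {		/* one value, dense switch, easy to reason about */
	case OBJ_REG:  return 1;
	case OBJ_DIR:  return 2;
	case OBJ_LINK: return 3;
	default:       return 0;
	}
}

#define OBJ_F_REG	0x01
#define OBJ_F_DIR	0x02
#define OBJ_F_LINK	0x04

static int handle_bitmap(unsigned int flags)
{
	/* caller could pass OBJ_F_REG | OBJ_F_DIR; every test must mask */
	if (flags & OBJ_F_REG)
		return 1;
	if (flags & OBJ_F_DIR)
		return 2;
	if (flags & OBJ_F_LINK)
		return 3;
	return 0;
}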