On Thu, Feb 29, 2024 at 12:49:57PM +0200, Andy Shevchenko wrote:
> On Wed, Feb 28, 2024 at 05:39:21PM -0600, Lucas De Marchi wrote:
> > On Thu, Feb 22, 2024 at 06:49:59AM -0800, Yury Norov wrote:
> > > On Wed, Feb 21, 2024 at 03:59:06PM -0600, Lucas De Marchi wrote:
> > > > On Wed, Feb 21, 2024 at 11:04:22PM +0200, Andy Shevchenko wrote:
> > > > > On Wed, Feb 21, 2024 at 10:30:02PM +0200, Dmitry Baryshkov wrote:
> > > > > > On Thu, 8 Feb 2024 at 09:45, Lucas De Marchi <lucas.demarchi@xxxxxxxxx> wrote:
...
I build-tested this on x86-64, x86-32 and arm64. I didn't much like
the need to fork the __GENMASK() implementation on the 2 sides of the
ifdef, since I think GENMASK_INPUT_CHECK() should be the one covering
the input checks. However, to make it common we'd need to solve 2
problems: the casts and the sizeof. The sizeof can be passed as an arg
to __GENMASK(), but the casts would I think need a __CAST_U8(x) or the
like, sprinkled everywhere, which would hurt readability. Not pretty.
Or go back to the original submission and make it less horrible :-/
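To illustrate the two problems (a sketch only, not from any posted
patch): a common C-side implementation could take the type and derive
the width from it via sizeof (BITS_PER_TYPE() in kernel terms),
something like

	#define __GENMASK(t, h, l) \
		(((t)~0 - ((t)1 << (l)) + 1) & \
		 ((t)~0 >> (sizeof(t) * 8 - 1 - (h))))

but the (t) casts and the sizeof() are exactly the constructs the
assembler can't parse, hence the fork on the two sides of the ifdef.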
I'm wondering if we can use a _Generic() approach here.
in assembly?
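For what it's worth, a _Generic()-based dispatch could only ever live
on the C side; a sketch (GENMASK_t is my name for it, not something
from the thread):

	#define GENMASK_t(t, h, l) _Generic((t)0,	\
		u32: __GENMASK(u32, h, l),		\
		u64: __GENMASK(u64, h, l))

_Generic() is a C11 expression, so it never survives into .S files,
which is exactly the objection here.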
...
> #define GENMASK_INPUT_CHECK(h, l) 0
> +#define __GENMASK(t, h, l) \
> +	((~0 - (1 << (l)) + 1) & (~0 >> (BITS_PER_LONG - 1 - (h))))
hmm... this builds, but does it work if GENMASK_ULL() is used in
assembly? That BITS_PER_LONG does not match the type width.
UL()/ULL() macros are not just for fun.
they are not for fun, but in assembly they expand to a no-op: the
suffix is dropped, and it's up to the instruction used to operate at
the right width. Since this branch is for assembly only, having them
wouldn't really change the current behavior.
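For reference, this is why they vanish (quoting the relevant part of
include/uapi/linux/const.h from memory, so the gist rather than the
exact text):

	#ifdef __ASSEMBLY__
	#define _AC(X,Y)	X
	#else
	#define __AC(X,Y)	(X##Y)
	#define _AC(X,Y)	__AC(X,Y)
	#endif

	#define _UL(x)		(_AC(x, UL))
	#define _ULL(x)		(_AC(x, ULL))

so under __ASSEMBLY__ the UL/ULL suffix is simply dropped and the bare
literal is emitted.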
I'm talking about BITS_PER_LONG vs BITS_PER_LONG_LONG. That introduces
a bug here.
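To spell that bug out (my expansion, not from the thread): with the
assembly-side __GENMASK() above, on a 32-bit build where BITS_PER_LONG
is 32,

	GENMASK_ULL(63, 32)
	  -> ((~0 - (1 << 32) + 1) & (~0 >> (32 - 1 - 63)))

i.e. a left shift by the full word width and a right shift by -32,
neither of which can yield the intended 0xffffffff00000000 mask. The
ULL flavour needs BITS_PER_LONG_LONG here instead.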
--
With Best Regards,
Andy Shevchenko