On Tue, Oct 12, 2021 at 4:25 PM Peter Oskolkov <posk@xxxxxxx> wrote:

[...]

> +static inline int __try_xchg_user_32(u32 *oval, u32 __user *uaddr, u32 newval)
> +{
> +	u32 oldval = 0;
> +	int ret = 0;
> +
> +	asm volatile("\n"
> +		"1:\txchgl %0, %2\n"
> +		"2:\n"
> +		"\t.section .fixup, \"ax\"\n"
> +		"3:\tmov %3, %0\n"

I believe the line above should be "mov %3, %1", not "mov %3, %0".
I'll fix this in the next patchset mailing.

> +		"\tjmp 2b\n"
> +		"\t.previous\n"
> +		_ASM_EXTABLE_UA(1b, 3b)
> +		: "=r" (oldval), "=r" (ret), "+m" (*uaddr)
> +		: "i" (-EFAULT), "0" (newval), "1" (0)
> +		);
> +
> +	if (ret)
> +		return ret;
> +
> +	*oval = oldval;
> +	return 0;
> +}
> +
> +static inline int __try_xchg_user_64(u64 *oval, u64 __user *uaddr, u64 newval)
> +{
> +	u64 oldval = 0;
> +	int ret = 0;
> +
> +	asm volatile("\n"
> +		"1:\txchgq %0, %2\n"
> +		"2:\n"
> +		"\t.section .fixup, \"ax\"\n"
> +		"3:\tmov %3, %0\n"

Same here.

[...]
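For reference, a sketch of what the corrected 32-bit helper should end up
looking like (the 64-bit variant needs the same change in its fixup); this
assumes the operand ordering stays exactly as in the quoted patch:

static inline int __try_xchg_user_32(u32 *oval, u32 __user *uaddr, u32 newval)
{
	u32 oldval = 0;
	int ret = 0;

	asm volatile("\n"
		/* %0 starts out holding newval ("0" constraint) and ends up
		 * holding the old value after the exchange. */
		"1:\txchgl %0, %2\n"
		"2:\n"
		"\t.section .fixup, \"ax\"\n"
		/* On a fault, write -EFAULT into ret (%1), not into the
		 * old-value register (%0). */
		"3:\tmov %3, %1\n"
		"\tjmp 2b\n"
		"\t.previous\n"
		_ASM_EXTABLE_UA(1b, 3b)
		: "=r" (oldval), "=r" (ret), "+m" (*uaddr)
		: "i" (-EFAULT), "0" (newval), "1" (0)
		);

	if (ret)
		return ret;

	*oval = oldval;
	return 0;
}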