On Sat, Oct 12, 2024 at 05:05:19PM -0400, Sasha Levin wrote:
> On Thu, Oct 10, 2024 at 06:29:21PM +0000, Carlos Llamas wrote:
> > On Wed, Oct 02, 2024 at 12:07:26PM +0200, gregkh@xxxxxxxxxxxxxxxxxxx wrote:
> > >
> > > The patch below does not apply to the 5.4-stable tree.
> > > If someone wants it applied there, or to any other stable or longterm
> > > tree, then please email the backport, including the original git commit
> > > id to <stable@xxxxxxxxxxxxxxx>.
> > >
> > > To reproduce the conflict and resubmit, you may use the following commands:
> > >
> > > git fetch https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/ linux-5.4.y
> > > git checkout FETCH_HEAD
> > > git cherry-pick -x a6f88ac32c6e63e69c595bfae220d8641704c9b7
> > > # <resolve conflicts, build, test, etc.>
> > > git commit -s
> > > git send-email --to '<stable@xxxxxxxxxxxxxxx>' --in-reply-to '2024100226-unselfish-triangle-e5eb@gregkh' --subject-prefix 'PATCH 5.4.y' HEAD^..
> > >
> > > Possible dependencies:
> > >
> > > a6f88ac32c6e ("lockdep: fix deadlock issue between lockdep and rcu")
> > > 61cc4534b655 ("locking/lockdep: Avoid potential access of invalid memory in lock_class")
> > > 248efb2158f1 ("locking/lockdep: Rework lockdep_lock")
> > > 10476e630422 ("locking/lockdep: Fix bad recursion pattern")
> > > 25016bd7f4ca ("locking/lockdep: Avoid recursion in lockdep_count_{for,back}ward_deps()")
> > >
> > > thanks,
> > >
> > > greg k-h
> >
> > These 3 commits are the actual dependencies:
> >
> > [1] 61cc4534b655 ("locking/lockdep: Avoid potential access of invalid memory in lock_class")
> > [2] 248efb2158f1 ("locking/lockdep: Rework lockdep_lock")
> > [3] 10476e630422 ("locking/lockdep: Fix bad recursion pattern")
> >
> > It seems to me that [1] and [3] are fixes we would also want in 5.4.
> > Possibly also [2] just to make the cherry-picks cleaner. If there are no
> > objections I can send a patchset for linux-5.4.y with all 4?
>
> SGTM!

OK, I've sent the patches here:

https://lore.kernel.org/all/20241012232244.2768048-1-cmllamas@xxxxxxxxxx/

Cheers,
Carlos Llamas
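
[For reference, a minimal sketch of the backport sequence discussed above, using the commit ids quoted in the thread. The oldest-first cherry-pick order shown here is an assumption based on mainline history; the actual selection and order is whatever Carlos posted in the linked patchset, and conflicts may still need manual resolution.]

  git fetch https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux.git/ linux-5.4.y
  git checkout FETCH_HEAD
  # dependencies first (assumed oldest-first order), then the fix itself
  git cherry-pick -x 10476e630422   # locking/lockdep: Fix bad recursion pattern
  git cherry-pick -x 248efb2158f1   # locking/lockdep: Rework lockdep_lock
  git cherry-pick -x 61cc4534b655   # locking/lockdep: Avoid potential access of invalid memory in lock_class
  git cherry-pick -x a6f88ac32c6e   # lockdep: fix deadlock issue between lockdep and rcu
  # <resolve conflicts, build, test, etc.>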