Re: [PATCH] drm: lcdif: Set and enable FIFO Panic threshold

On 10/27/22 10:32, Marco Felsch wrote:

Hi,

diff --git a/drivers/gpu/drm/mxsfb/lcdif_kms.c b/drivers/gpu/drm/mxsfb/lcdif_kms.c
index a5302006c02cd..aee7babb5fa5c 100644
--- a/drivers/gpu/drm/mxsfb/lcdif_kms.c
+++ b/drivers/gpu/drm/mxsfb/lcdif_kms.c
@@ -341,6 +341,18 @@ static void lcdif_enable_controller(struct lcdif_drm_private *lcdif)
   	reg = readl(lcdif->base + LCDC_V8_CTRLDESCL0_5);
   	reg |= CTRLDESCL0_5_EN;
   	writel(reg, lcdif->base + LCDC_V8_CTRLDESCL0_5);
+
+	/* Set FIFO Panic watermarks, low 1/3, high 2/3. */
+	writel(FIELD_PREP(PANIC0_THRES_LOW_MASK, 1 * PANIC0_THRES_RANGE / 3) |
+	       FIELD_PREP(PANIC0_THRES_HIGH_MASK, 2 * PANIC0_THRES_RANGE / 3),
+	       lcdif->base + LCDC_V8_PANIC0_THRES);
+
+	/*
+	 * Enable FIFO Panic, this does not generate interrupt, but
+	 * boosts NoC priority based on FIFO Panic watermarks.
+	 */
+	writel(INT_ENABLE_D1_PLANE_PANIC_EN,
+	       lcdif->base + LCDC_V8_INT_ENABLE_D1);

Out of curiosity, since we have a patch doing the exact same thing but
didn't see any improvements: is there a reason why you enabled it here?

It seems like the right thing to do here, when enabling the controller.

We did this during lcdif_rpm_resume(). But as I said, with a 1080p
display we still saw the flickering; it only disappeared after raising
the burst size.

That's what the NXP downstream driver does too, isn't it? That seems like
the wrong place to me.

Yes, I think so. It's not about the place (whether it's wrong or correct),
it's more about the PANIC mode itself. I'm curious about:
  1) Do you still see the flickering with this patch and without the
     "burst-size increase" patch?

No

  2) Did you still see flickering with the "burst-size increase" patch
     applied and without this patch?

I did not try

I have no 4K display, which is why I'm asking, but with 1080p we didn't
see any improvements without increasing the burst size. My assumption
was: if the panic mode works, then I shouldn't have to increase the
burst size.

I believe the burst size optimizes the DRAM access pattern: longer bursts mean the DRAM controller can do longer sustained transfers from the DRAM, which means fewer gaps between transfers for the same amount of data, and the newly freed spare bandwidth becomes available for other transfers.
