Re: imx8mm lcdif->dsi->adv7535 no video, no errors

Hi Dave,

On 22-08-04, Dave Stevenson wrote:
> Hi Marco
> 
> On Thu, 4 Aug 2022 at 11:28, Marco Felsch <m.felsch@xxxxxxxxxxxxxx> wrote:
> >
> > On 22-08-03, Dave Stevenson wrote:
> > > On Wed, 3 Aug 2022 at 13:31, Adam Ford <aford173@xxxxxxxxx> wrote:
> >
> > ...
> >
> > > > Mine also states the DSI source needs to provide correct video timing
> > > > with start and stop sync packets.
> > > >
> > > > If I remember correctly, it seemed like Marek V wanted the hard coded
> > > > samsung,burst-clock-frequency to go away so the clock frequency could
> > > > be set dynamically.
> > >
> > > I've never worked with Exynos or imx8, but my view would be that
> > > samsung,burst-clock-frequency should only be used if
> > > MIPI_DSI_MODE_VIDEO_BURST is set in the mode_flags (it isn't for
> > > adv7533/5).
> >
> > Some notes on that. The samsung,burst-clock-frequency is the
> > hs-bit-clock-rate which is twice the dsi-clock-rate. This has nothing to
> > do with the MIPI_DSI_MODE_VIDEO_BURST.
> >
> > > Without that flag the DSI link frequency should be running at the rate
> > > defined by the mode clock, number of lanes, bpp, etc.
> >
> > IMHO the DSI link have only to guarantee the bandwidth is sufficient for
> > the mode.
> 
> DSI spec 8.11.3 Non-Burst Mode with Sync Events
> "This mode is a simplification of the format described in Section
> 8.11.2 (Non-Burst Mode with Sync Pulses)
> ...
> Pixels are transmitted at the same rate as they would in a
> corresponding parallel display interface such as DPI-2."
> 
> If you are running the DSI clock at anything other than that rate,
> then AIUI you are in a burst mode (although you may choose not to drop
> into LP mode).

Yes, that makes sense to me. The bandwidth on the DSI side should match
the one required on the other side (HDMI), apart from the fact that the
ADV is working in mode 8.11.2 (Non-Burst Mode with Sync Pulses).
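
Just to spell that math out, a minimal sketch (RGB888 and 4 lanes are
assumptions of mine for the example, they are not stated anywhere in
this thread):

8<----------------------------------------------
/*
 * Minimal sketch of the non-burst bandwidth math. RGB888 (24 bpp) and
 * 4 lanes are assumptions for the example, not taken from this thread.
 *
 * Per-lane HS bit rate = pixel clock * bpp / lanes
 * 1080p60: 148500 kHz * 24 / 4 = 891000 kHz -> 891 Mbit/s per lane.
 * The HS link is DDR, so the DSI (DDR) clock is half of that, 445.5 MHz,
 * which matches "the hs-bit-clock-rate is twice the dsi-clock-rate".
 */
static unsigned long dsi_hs_bit_rate_khz(unsigned long pixel_clock_khz,
					 unsigned int bpp,
					 unsigned int lanes)
{
	return pixel_clock_khz * bpp / lanes;
}
8<----------------------------------------------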

> (One of my pet peeves that there is no documentation as to exactly
> what MIPI_DSI_MODE_VIDEO_BURST is meant to mean. Seeing as in the DSI
> spec all modes of 8.11 say that the host can drop to LP during
> blanking if time allows, it surely has to be the time compression
> element of 8.11.4 Burst Mode).

Hm... I don't have the DSI spec either, but I thought that burst mode
allows the host to send the data as fast as possible and to enter LP
mode afterwards.
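
If that is right, the "time compression" aspect can be pictured like
this (sketch only, the 1 Gbit/s link rate below is a made-up number):

8<----------------------------------------------
#include <linux/math64.h>

/*
 * Sketch of the time-compression idea behind burst mode: if the per-lane
 * link rate is higher than what the mode requires, the pixels of a line
 * are transmitted in a fraction of the line time and the remainder can
 * be spent in LP (or on other packets).
 */
static unsigned int burst_active_permille(u64 needed_bps_per_lane,
					  u32 link_bps_per_lane)
{
	/*
	 * Made-up example: 891 Mbit/s needed on a 1 Gbit/s link
	 * -> ~891 permille active, ~11% of every line left for LP.
	 */
	return div_u64(needed_bps_per_lane * 1000, link_bps_per_lane);
}
8<----------------------------------------------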

> > > From the DSI spec (v 1.1 section 8.11.1):
> > > "Non-Burst Mode with Sync Pulses – enables the peripheral to
> > > accurately reconstruct original video timing, including sync pulse
> > > widths."
> > > "RGB pixel packets are time-compressed, leaving more time during a
> > > scan line for LP mode (saving power) or for multiplexing other
> > > transmissions onto the DSI link."
> > > How can the peripheral reconstruct the video timing off a quirky link frequency?
> >
> > If the ADV couldn't reconstruct the sync signals, then we should not get
> > any mode working but we get the 1080P mode working.
> >
> > > Unless the Exynos DSIM_CONFIG_REG register bit DSIM_BURST_MODE [1]
> > > reconfigures the clock setup of the DSI block, then I don't see how
> > > the Exynos driver can follow the DSI spec in that regard.
> >
> > Why do you think that the Exynos driver isn't following the spec? We
> > configure the host into video mode with sync signals which is working
> > for the 1080P mode.
> 
> 1080p is working with samsung,burst-clock-frequency setting?

Yes.

> As I say, I've not worked with this IP, I'm only looking at it from
> the outside having spent far too much time recently on the Pi DSI
> interface.

Good to know :)

> exynos_drm_dsi.c seems to be doing a lot of PLL computation around
> burst-clock-frequency, and nothing with the pixel clock rate.

Yes, currently this is the only setting used to configure the PLL
frequency. But as you said, for "Non-Burst Mode with Sync Pulses" we
need to reconfigure it according to the required bandwidth, or the DSI
device has to tell us which DSI link settings should be applied.
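
Something along those lines, as a sketch (the function name is made up,
this is not meant as a patch against exynos_drm_dsi.c):

8<----------------------------------------------
#include <linux/math64.h>
#include <drm/drm_mipi_dsi.h>
#include <drm/drm_modes.h>

/*
 * Sketch only: derive the required per-lane HS bit rate from the current
 * mode and the peripheral's format/lane count instead of the fixed
 * samsung,burst-clock-frequency property.
 */
static u64 dsi_required_hs_bit_rate(const struct drm_display_mode *mode,
				    const struct mipi_dsi_device *device)
{
	int bpp = mipi_dsi_pixel_format_to_bpp(device->format);

	if (bpp < 0)
		return 0;

	/*
	 * mode->clock is in kHz; the result is the per-lane HS bit rate
	 * in Hz, the PLL/DDR clock would be half of it.
	 */
	return div_u64((u64)mode->clock * 1000 * bpp, device->lanes);
}
8<----------------------------------------------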

> Without knowledge of what that DSIM_BURST_MODE bit in DSIM_CONFIG_REG
> actually does in the hardware, I can only make guesses.

8<----------------------------------------------
   Selects Burst mode in Video mode

   In Non-burst mode, RGB data area is filled with RGB data and Null
   packets, according to input bandwidth of RGB interface.
   
   In Burst mode, RGB data area is filled with RGB data only.

   0 = Non-burst mode
   1 = Burst mode
8<----------------------------------------------

According to the current implementation we are in non-burst mode.
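
If the peripheral's request were wired through, I'd expect it to end up
as something like this (sketch only, DSIM_CFG_BURST_MODE and the bit
position are illustrative, not the real register definition):

8<----------------------------------------------
#include <linux/bits.h>
#include <drm/drm_mipi_dsi.h>

/* Illustrative name and bit position, not the real register layout. */
#define DSIM_CFG_BURST_MODE	BIT(26)

/*
 * Sketch only: let the peripheral's mode_flags decide the burst bit.
 * adv7533/5 does not request MIPI_DSI_MODE_VIDEO_BURST, so this would
 * stay in non-burst mode and the RGB area gets padded with null packets.
 */
static u32 dsim_video_cfg(const struct mipi_dsi_device *device, u32 cfg)
{
	if (device->mode_flags & MIPI_DSI_MODE_VIDEO_BURST)
		cfg |= DSIM_CFG_BURST_MODE;
	else
		cfg &= ~DSIM_CFG_BURST_MODE;

	return cfg;
}
8<----------------------------------------------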

Regards,
  Marco

> Perhaps it does ditch the burst clock and switch the bit clock to be
> derived from the pixel clock of the upstream block, but that seems
> unlikely.
> 
>   Dave
> 


