Questions over DSI within DRM.

Hi All

I'm trying to get DSI devices working reliably on the Raspberry Pi,
but I'm hitting a number of places where the expected behaviour
within DRM isn't clear.

Power on state. Many devices want the DSI clock and/or data lanes in
LP-11 state when they are powered up. With the normal calling sequence
of:
- panel/bridge pre_enable calls from connector towards the encoder.
- encoder enable which also enables video.
- panel/bridge enable calls from encoder to connector.
there is no point at which the DSI tx is initialised but not
transmitting video. What DSI states are expected to be adopted at each
point?
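
For reference, the ordering issue looks something like the sketch
below - a simplified bridge using the pre-atomic drm_bridge_funcs
hooks, with the mydsi_* names made up purely for illustration:

#include <drm/drm_bridge.h>

/*
 * Sketch of the ordering problem: by the time .pre_enable is called on
 * the panel/bridge chain the DSI host may not have been programmed yet,
 * and by the time .enable is called video is already streaming.  Where
 * is the window for "lanes driven to LP-11, but no video"?
 */
static void mydsi_panel_bridge_pre_enable(struct drm_bridge *bridge)
{
	/*
	 * Panel power rails come up here.  Many panels require the DSI
	 * clock/data lanes to already be in LP-11 at this point, but
	 * nothing guarantees the host has been initialised yet.
	 */
}

static void mydsi_panel_bridge_enable(struct drm_bridge *bridge)
{
	/*
	 * Called after the encoder enable, i.e. after video has started,
	 * which is too late for any LP-11-dependent panel init.
	 */
}

static const struct drm_bridge_funcs mydsi_panel_bridge_funcs = {
	.pre_enable = mydsi_panel_bridge_pre_enable,
	.enable = mydsi_panel_bridge_enable,
};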

On a similar theme, some devices want the clock lane in HS mode early
so they can use it in place of an external oscillator, while the data
lanes stay in LP-11. There appears to be no way for the display/bridge
to signal this requirement, nor for it to be achieved.
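
As far as I can see the only knob a panel/bridge has today is the
mipi_dsi_device mode_flags, along the lines of the sketch below
(mypanel_setup_flags is a made-up name); none of the existing flags
express "clock lane in HS early, data lanes still in LP-11":

#include <drm/drm_mipi_dsi.h>

static void mypanel_setup_flags(struct mipi_dsi_device *dsi)
{
	dsi->lanes = 4;
	dsi->format = MIPI_DSI_FMT_RGB888;
	/*
	 * Leaving MIPI_DSI_CLOCK_NON_CONTINUOUS clear only asks for a
	 * continuous clock while video is running; there is no flag for
	 * "HS clock before video, data lanes kept in LP-11".
	 */
	dsi->mode_flags = MIPI_DSI_MODE_VIDEO;
}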

host_transfer calls can supposedly be made at any time; however,
unless MIPI_DSI_MSG_USE_LPM is set in the message, we're meant to send
it in high-speed mode. If this happens before a mode has been set,
what defines the link frequency parameters at that point? Adopting a
random default sounds like a good way to get undefined behaviour.
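
To make the problem concrete, here is a rough sketch of a host's
.transfer hook (mydsi_* is hypothetical, error handling omitted):

#include <drm/drm_mipi_dsi.h>

static ssize_t mydsi_host_transfer(struct mipi_dsi_host *host,
				   const struct mipi_dsi_msg *msg)
{
	if (msg->flags & MIPI_DSI_MSG_USE_LPM) {
		/* LP transmission: the timing is well defined. */
	} else {
		/*
		 * HS transmission: needs a byte/link clock, which may
		 * not have been chosen yet if no mode has been set.
		 */
	}

	return 0; /* would return the number of bytes transferred */
}

static const struct mipi_dsi_host_ops mydsi_host_ops = {
	.transfer = mydsi_host_transfer,
};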

DSI burst mode needs the DSI link frequency to be set independently of
the display mode. How is that meant to be configured? I would have
expected it to come from DT, since the link frequency is often chosen
based on EMC restrictions, but I don't see such a property in any
binding.
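
What I was imagining is something along the lines of the below, but
the "link-frequency" property is entirely hypothetical - I'm not aware
of any DSI binding that defines it:

#include <linux/of.h>

static int mydsi_get_link_freq(struct device_node *np, u32 *hs_clk_hz)
{
	/* Hypothetical property, not in any current binding. */
	return of_property_read_u32(np, "link-frequency", hs_clk_hz);
}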

As a follow-on, bridge devices can support burst mode (eg TI's
SN65DSI83 that has just been merged), so the bridge needs to know both
the desired panel timings for its output side and the DSI link timings
to set up its PLL. What's the correct way of signalling that?
drm_crtc_state->adjusted_mode vs drm_crtc_state->mode? Except mode is
userspace's request, not what has been validated/updated by the
panel/bridge.
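
The callback in question (in its pre-atomic form) already hands the
bridge both modes, so the ambiguity is which one it should trust for
what - a sketch, with mybridge_* purely illustrative:

#include <drm/drm_bridge.h>
#include <drm/drm_modes.h>

static void mybridge_mode_set(struct drm_bridge *bridge,
			      const struct drm_display_mode *mode,
			      const struct drm_display_mode *adjusted_mode)
{
	/*
	 * For a burst-mode DSI-to-LVDS bridge:
	 *  - the output (panel) side wants the validated panel timings,
	 *  - the input (DSI) side needs the actual link timings to set
	 *    up its PLL.
	 * Is adjusted_mode guaranteed to carry the link timings, and is
	 * mode (userspace's request) the right source for the former?
	 */
}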

vc4 has the constraint that the DSI host interface is fed from an
integer divider off a (typically 3GHz) clock, so the host interface
needs to signal that burst mode is in use even if the panel/bridge
doesn't need to run in burst mode. (This does mean that displays
requiring a very precise link frequency cannot be supported.)
It currently updates the adjusted_mode via drm_encoder_helper_funcs
mode_fixup, but is that the correct thing to do, or is there a better
solution?
I'd have expected the DSI tx to be responsible for configuring the
burst mode parameters anyway, so the mechanism required would seem to
be just whatever the normal approach for adopting burst mode is, once
that is defined.
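
For clarity, this is roughly what I mean by updating adjusted_mode
from mode_fixup - a generic sketch rather than vc4's actual code, with
the 3GHz parent and the divider maths as placeholders:

#include <drm/drm_modes.h>
#include <drm/drm_modeset_helper_vtables.h>

#define PARENT_CLK_HZ	3000000000UL	/* placeholder parent clock */

static bool mydsi_encoder_mode_fixup(struct drm_encoder *encoder,
				     const struct drm_display_mode *mode,
				     struct drm_display_mode *adjusted_mode)
{
	unsigned long pixel_hz = mode->clock * 1000UL;
	unsigned long divider = PARENT_CLK_HZ / pixel_hz;

	/*
	 * Round the pixel clock up to what the integer divider can
	 * actually produce; the panel/bridge then has to absorb the
	 * difference, i.e. run in burst mode.
	 */
	adjusted_mode->clock = (PARENT_CLK_HZ / divider) / 1000;

	return true;
}

static const struct drm_encoder_helper_funcs mydsi_encoder_helper_funcs = {
	.mode_fixup = mydsi_encoder_mode_fixup,
};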

Some DSI host interfaces are implemented as bridges, others as
encoders. Pros and cons of each? I suspect I'm just missing the
history here.

When it comes to the MIPI_DSI_MODE_* flags, which ones are mutually
exclusive, and which are implied by others? Does a burst mode DSI sink
set both MIPI_DSI_MODE_VIDEO and MIPI_DSI_MODE_VIDEO_BURST, or just
the latter?
Presumably !MIPI_DSI_MODE_VIDEO signals the use of command mode for
conveying video. So panel-ilitek-ili9881c, which sets just
MIPI_DSI_MODE_VIDEO_SYNC_PULSE, would mean command mode video with
sync pulses? That sounds unlikely.
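
In other words, is the burst case meant to look like this? (A guess on
my part, not something I've found documented; mypanel_set_flags is
made up.)

#include <linux/types.h>
#include <drm/drm_mipi_dsi.h>

static void mypanel_set_flags(struct mipi_dsi_device *dsi, bool burst)
{
	if (burst)
		/* Both flags, or is MIPI_DSI_MODE_VIDEO implied? */
		dsi->mode_flags = MIPI_DSI_MODE_VIDEO |
				  MIPI_DSI_MODE_VIDEO_BURST;
	else
		/*
		 * panel-ilitek-ili9881c sets only this, which read
		 * literally would be command mode with sync pulses.
		 */
		dsi->mode_flags = MIPI_DSI_MODE_VIDEO_SYNC_PULSE;
}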

I have looked for documentation that covers this, but failed to find
any, hence calling on all your expertise.

Many thanks for your time,
  Dave


