Re: [RFC] drm: add overlays as first class KMS objects

Hi Jesse,

Discussion here in Budapest with v4l and embedded graphics folks was
extremely fruitful. A few quick things to take away - I'll try to dig
through all the stuff I've learned in more depth later (probably in a blog
post or two):

- embedded graphics is insane. The output routing/blending/whatever
  currently shipping hw can do is crazy, and kms as-is is nowhere near up
  to snuff to support it. We've discussed omap4 and a ti chip targeted at
  video surveillance as use cases. I'll post block diagrams and explanations
  sometime later.
- we should immediately stop calling anything an overlay. It's a confusing
  concept that has a different meaning in every subsystem and for every hw
  manufacturer. More sensible names are: dma fifo engines for blocks that
  slurp in planes and make them available to the display subsystem; blend
  engines for blocks that take multiple input pipes and
  overlay/underlay/blend them together; display subsystem/controller for
  the aggregate thing including encoders/resizers/outputs and especially
  the crazy routing network that connects everything. A rough sketch of
  this naming follows right after this list.
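
To make that naming a bit more concrete, here's a purely made-up sketch of
how the three concepts could map onto kms objects. None of these structs
exist anywhere, the names are invented just to match the terminology above:

#include <linux/types.h>
#include <drm/drm_crtc.h>	/* for struct drm_framebuffer */

/* dma fifo engine: slurps in a plane and makes it available to the
 * display subsystem */
struct kms_dma_engine {
	u32 id;
	struct drm_framebuffer *fb;		/* buffer being scanned out */
	u32 src_x, src_y, src_w, src_h;		/* source rect within the fb */
};

/* blend engine: takes multiple input pipes and overlays/underlays/blends
 * them together */
struct kms_blend_engine {
	u32 id;
	struct kms_dma_engine **inputs;
	unsigned int num_inputs;
};

/* display subsystem/controller: the aggregate thing, i.e. blend engines,
 * encoders/resizers/outputs and the routing network connecting it all */
struct kms_display_controller {
	struct kms_blend_engine *blenders;
	unsigned int num_blenders;
};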

Most of the discussion centered around clearing up the confusion and
reaching a mutual understanding between desktop graphics, embedded
graphics and v4l people. Two rough ideas emerged though:

1) Splitting the crtc object into two kinds of objects: a crtc with its
associated output mode (pixel clock, encoders/connectors), and the dma
engines (possibly multiple) that feed it. omap4 essentially just has 4 dma
engines that can be freely assigned to the available outputs, so a
distinction between normal crtcs and overlay engines simply does not make
sense there. There's the major open question of where to put the various
attributes that set up the output pipeline. Also, some of these attributes
might need to be changed atomically, together with pageflips on a bunch of
dma engines all associated with the same crtc, on the next vsync - e.g. the
output position of an overlaid video buffer.
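
To illustrate the atomic update part, here's a completely hypothetical uapi
sketch. The ioctl number and all the struct/field names are made up, only
DRM_IOWR is the existing drm ioctl helper; it's just meant to show "one
ioctl, many dma engines, one vsync":

#include <linux/types.h>
#include <drm/drm.h>

/* one entry per dma engine feeding the crtc; all updates passed in a
 * single ioctl are meant to hit on the same vsync */
struct drm_dma_engine_update {
	__u32 engine_id;	/* dma engine object id */
	__u32 fb_id;		/* framebuffer to flip to */
	__s32 dst_x, dst_y;	/* output position on the crtc, e.g. for an
				 * overlaid video buffer */
	__u32 dst_w, dst_h;
};

struct drm_crtc_atomic_flip {
	__u32 crtc_id;		/* crtc = output mode + encoders/connectors */
	__u32 num_engines;
	__u64 engines_ptr;	/* userspace pointer to an array of
				 * struct drm_dma_engine_update */
	__u32 flags;		/* e.g. request a flip completion event */
	__u32 pad;
	__u64 user_data;	/* cookie returned in the completion event */
};

/* made-up ioctl number, purely for illustration */
#define DRM_IOCTL_CRTC_ATOMIC_FLIP \
	DRM_IOWR(0xb0, struct drm_crtc_atomic_flip)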

2) The above should be good enough to support halfway sane chips like
omap4. But for hw with more insane routing capabilities, which can also use
v4l devices as sources (even video input connectors), the media controller
might be a good fit. The media controller is designed to expose multimedia
pipe routing across different subsystems. But the first version, still
marked experimental, only got merged in .39. We discussed a few ideas for
how to splice the media controller into kms, but nothing clear emerged. So
a possible kms integration with the media controller is rather far away.
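
Still, for reference, here's a rough sketch of the media controller
entity/pad/link model applied to a display pipe. The entity/pad/link calls
are the existing in-kernel media controller api (names from memory, so they
might not match the just-merged version exactly); registering dma engines
and blend engines as media entities is pure speculation on my part:

#include <media/media-device.h>
#include <media/media-entity.h>

static struct media_device mdev;

/* speculative: one dma engine with a single source pad, one blend engine
 * with a sink pad and a source pad */
static struct media_pad dma_pads[1];
static struct media_pad blend_pads[2];
static struct media_entity dma_engine;
static struct media_entity blend_engine;

static int display_graph_sketch(void)
{
	int ret;

	dma_pads[0].flags = MEDIA_PAD_FL_SOURCE;
	blend_pads[0].flags = MEDIA_PAD_FL_SINK;
	blend_pads[1].flags = MEDIA_PAD_FL_SOURCE;

	dma_engine.name = "dma-engine-0";
	blend_engine.name = "blend-engine-0";

	ret = media_entity_init(&dma_engine, 1, dma_pads, 0);
	if (ret)
		return ret;
	ret = media_entity_init(&blend_engine, 2, blend_pads, 0);
	if (ret)
		return ret;

	ret = media_device_register_entity(&mdev, &dma_engine);
	if (ret)
		return ret;
	ret = media_device_register_entity(&mdev, &blend_engine);
	if (ret)
		return ret;

	/* route the dma engine into the blend engine; userspace could
	 * rewire this through the media controller link ioctls */
	return media_entity_create_link(&dma_engine, 0, &blend_engine, 0,
					MEDIA_LNK_FL_ENABLED);
}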

Yours, Daniel
-- 
Daniel Vetter
daniel.vetter@xxxxxxxx - +41 (0) 79 364 57 48 - http://blog.ffwll.ch