On Tue, 26 Apr 2011 11:01:30 +0100 Alan Cox <alan@xxxxxxxxxxxxxxxxxxx> wrote:

> > A lot of older hardware had one overlay that could be sourced to any
> > crtc, just not simultaneously. The tricky part is the formats and
> > capabilities: alpha blending, color/chromakey, gamma correction, etc.
> > Even the current crtc gamma stuff is somewhat lacking in terms of
> > what hardware is capable of (PWL vs. LUT, user-defined conversion
> > matrices, gamut remapping, etc.).
>
> Rather than re-inventing enough wheels to run a large truck, would it not
> make sense to make hardware-sourced overlays Video4Linux objects in their
> entirety, so you can just say "attach V4L object A as overlay B"?
>
> That would provide format definitions, provide control interfaces for
> the surface (e.g. for overlays of cameras such as on some of the Intel
> embedded and non-PC devices), and give you an existing, well-understood
> API.
>
> For some hardware you are going to need this integration anyway, so that
> you can do things like move a GEM object which is currently a DMA target
> of a capture device (as well as fence it).
>
> For a software surface you could either expose it as a V4L object that
> is GEM- or fb-memory backed, or at least use the same descriptions, so
> that the kernel has a consistent set of descriptions for formats and we
> don't have user libraries doing ad-hoc format translation crap.
>
> A lot of capture hardware would map very nicely onto GEM objects, I
> suspect, and if you want to merge live video into Wayland it seems a
> logical path?

Thanks Alan, that's a good idea of course; I'll see about integrating the two.

-- 
Jesse Barnes, Intel Open Source Technology Center
_______________________________________________
dri-devel mailing list
dri-devel@xxxxxxxxxxxxxxxxxxxxx
http://lists.freedesktop.org/mailman/listinfo/dri-devel