Hi Philipp,

On 09/09/2014 10:40 AM, Philipp Zabel wrote:
>>>> I've also worked out what I think is a workable video pipeline graph
>>>> for i.MX, suitable for defining the entities, pads, and links.
>>>> Unfortunately I haven't been able to spend as much time as I'd like
>>>> on it.
>>> This is very interesting, have you written this down somewhere?
>> Yes, I'll try to find some time to create a pdf image.
> I'd be very interested in this, too.

I should have something to show tomorrow.

> I have in the meantime started to implement everything that has a
> source or destination selector in the Frame Synchronization Unit (FSU)
> as a media entity. I wonder which of these parts should reasonably be
> unified into a single entity:
>
> CSI0
> CSI1

Yes, we need a CSI subdev/entity, and it can be instantiated twice for
the two CSI ports.

> SMFC0
> SMFC1
> SMFC2
> SMFC3

I don't really see the need for an SMFC entity. The SMFC control can be
integrated into the CSI subdev.

> IC preprocessor (input to VF and ENC, if I understood correctly)
> IC viewfinder task (scaling, csc)
> IC encoding task
> IC post processing task

I see either three separate IC subdev entities (IC prpenc, IC prpvf,
IC pp), or a single IC entity with three sink pads, one for each IC
task.

> IRT viewfinder task (rotation)
> IRT encoding task
> IRT post processing task

Well, the IRT is really just a submodule enable bit; I see no need for
an IRT subdev. In fact, the IRT has already been folded into ipu-ic.c
as a simple submodule enable/disable. Rotation support can be
implemented as part of the IC entities.

> VDIC (deinterlacing, combining)

I am thinking VDIC support can be part of the IC prpvf entity
(combining is not really on my radar; I haven't given that much
thought).

> (and probably some entry for DP/DC/DMFC for the direct
> viewfinder path)

Ugh, I've been ignoring that path as well. Freescale's BSP releases and
the sample code from their SDKs have no examples for the
direct-to-DP/DC/DMFC camera viewfinder path, so given the quality of
the i.MX TRM, this could be a challenge to implement. Have you gotten
this path to work?

> I suppose the SMFC channels need to be separate because they can belong
> to different pipelines (and each entity can only belong to one).

I see the chosen SMFC channel as an internal decision made by the CSI
subdev.

> The three IC task entities could probably be combined with their
> corresponding IRT task entity somehow, but that would be at the cost of
> not being able to tell the kernel whether to rotate before or after
> scaling, which might be useful when handling chroma subsampled formats.

I'm fairly sure IC rotation must always occur _after_ scaling. That is,
raw frames are first passed through IC prpenc/prpvf/pp for scaling/CSC,
and EOF completion of that task is then hardware-linked to the IRT.

> I have put my current state up here:
>
>     git://git.pengutronix.de/git/pza/linux.git test/nitrogen6x-ipu-media
>
> So far I've captured video through the SMFC on a Nitrogen6X board with
> an OV5652 parallel camera with this.

Thanks Philipp, I'll take a look! Sounds like a good place to start. I
assume this is with the video mux entity and CSI driver? I.e., no IC
entity support yet for scaling, CSC, or rotation.

Steve
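
Purely as an illustration of the CSI subdev/entity discussed in the
reply above, here is a minimal sketch of how such an entity could
register its pads with the media controller API. The pad layout, the
entity name, and the imx_csi struct are assumptions made for this
sketch, not actual driver code.

/*
 * Sketch only: one CSI entity per CSI port, with a sink pad fed by the
 * parallel/MIPI input (video mux) and a source pad feeding the SMFC/IC.
 * The SMFC channel choice stays internal to this subdev.
 */
#include <media/media-entity.h>
#include <media/v4l2-subdev.h>

enum {
        CSI_PAD_SINK,           /* from the camera / video mux */
        CSI_PAD_SOURCE,         /* to an SMFC channel or IC preprocessor */
        CSI_NUM_PADS,
};

struct imx_csi {                /* hypothetical per-CSI-port state */
        struct v4l2_subdev sd;
        struct media_pad pad[CSI_NUM_PADS];
        int id;                 /* 0 or 1: instantiated once per CSI port */
};

static const struct v4l2_subdev_ops csi_subdev_ops; /* pad/video ops elided */

static int imx_csi_init_entity(struct imx_csi *csi)
{
        struct v4l2_subdev *sd = &csi->sd;

        v4l2_subdev_init(sd, &csi_subdev_ops);
        snprintf(sd->name, sizeof(sd->name), "ipu-csi%d", csi->id);
        sd->entity.function = MEDIA_ENT_F_VID_IF_BRIDGE;

        csi->pad[CSI_PAD_SINK].flags = MEDIA_PAD_FL_SINK;
        csi->pad[CSI_PAD_SOURCE].flags = MEDIA_PAD_FL_SOURCE;

        return media_entity_pads_init(&sd->entity, CSI_NUM_PADS, csi->pad);
}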
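
Continuing the same sketch (it reuses CSI_PAD_SOURCE from above), here
is one way the "single IC entity with three sink pads" option could
look, with an enabled link wired from the CSI source pad into the IC
prpvf sink pad. Again, the pad indexes, names, and the choice of
MEDIA_ENT_F_PROC_VIDEO_SCALER are assumptions for illustration only.

enum {
        IC_PAD_SINK_PRPENC,     /* one sink pad per IC task */
        IC_PAD_SINK_PRPVF,
        IC_PAD_SINK_PP,
        IC_PAD_SRC_PRPENC,      /* matching source pads */
        IC_PAD_SRC_PRPVF,
        IC_PAD_SRC_PP,
        IC_NUM_PADS,
};

static struct media_pad ic_pads[IC_NUM_PADS];

static int imx_ic_init_entity(struct v4l2_subdev *ic_sd)
{
        unsigned int i;

        for (i = 0; i < IC_NUM_PADS; i++)
                ic_pads[i].flags = (i < IC_PAD_SRC_PRPENC) ?
                        MEDIA_PAD_FL_SINK : MEDIA_PAD_FL_SOURCE;

        ic_sd->entity.function = MEDIA_ENT_F_PROC_VIDEO_SCALER;

        return media_entity_pads_init(&ic_sd->entity, IC_NUM_PADS, ic_pads);
}

/*
 * CSI source -> IC prpvf sink. Rotation (and possibly VDIC
 * deinterlacing) would then be handled inside the IC entity after the
 * scaler task, rather than by a separate IRT subdev.
 */
static int imx_link_csi_to_prpvf(struct v4l2_subdev *csi_sd,
                                 struct v4l2_subdev *ic_sd)
{
        return media_create_pad_link(&csi_sd->entity, CSI_PAD_SOURCE,
                                     &ic_sd->entity, IC_PAD_SINK_PRPVF,
                                     MEDIA_LNK_FL_ENABLED);
}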