On Tue, Feb 03, 2015 at 08:31:30PM +0100, Jean-Francois Moine wrote:
> Mark Brown <broonie@xxxxxxxxxx> wrote:

> > So how does the simple controller interact with a more complex one given
> > that it's somehow picking some controller node to start from?

> A way to solve this problem could be to create only one card builder.
> This creation could be explicit (created by the first active audio
> controller) or implicit, done by the audio subsystem on the first
> controller or CODEC creation.

> Then, the card builder could scan the whole DT looking for the audio
> ports and create one or more cards according to the graph
> connectivity.

How is this going to work with dynamically instantiated hardware like DT
overlays?

> > If you have a device with any sort of speaker or microphone, or any sort
> > of external connector for interfacing with an external device like a
> > headphone jack, then you have something that could be a widget.

> I know what widgets and routes are, I was just wondering why they
> (especially the widgets) need to appear at the card level instead of
> just being declared in the DAIs (from the platform or the DT).

As previously and repeatedly discussed, DAIs have no special place in a
general audio system and we can't base the entire system on them. Which
DAI should have the headphone jack connected to the analogue-only
headphone driver in my system (there may not even be a way to route
digital audio to it)? How does this work for off-SoC audio hubs, where
a device with multiple DAIs is connected both to one or more other
digital devices and to the analogue domain?

Please go and research this if you're intending to work on generic
bindings; it gets extremely repetitive to have to go over this again
and again. We already have simple-card to provide a binding for trivial
systems and don't want to end up with a never-ending series of slightly
more complicated bindings which each cover slightly different sets of
systems in ways that users struggle to differentiate between.

> And the same question may also be raised about the audio formats, clocks,
> TDMs...

Similar things apply here: which of the two or more devices on a digital
audio link (yes, they're buses, not point-to-point links) holds the
configuration, and how do we stitch them together? How do we figure out
when and how to do runtime reconfiguration of the clock tree (which is
needed by some systems)?

Again, please do some research on this. If you are trying to define
generic device tree bindings, it is really important that you understand
what you are trying to model with those bindings.
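
For reference, the existing simple-audio-card binding already illustrates
both points above: widgets and routing are card-level properties, and
the link format and clock mastering are properties of the link rather
than of either device on it. A minimal sketch, assuming hypothetical
phandles (&ssi0, &codec) and illustrative CODEC pin names (HP_OUT,
MIC_IN):

	sound {
		compatible = "simple-audio-card";
		simple-audio-card,name = "Example Card";
		simple-audio-card,format = "i2s";
		/* Clock/frame mastering is a property of the link,
		 * resolved by pointing at one of the sub-nodes. */
		simple-audio-card,bitclock-master = <&link0_codec>;
		simple-audio-card,frame-master = <&link0_codec>;

		/* Widgets and routes belong to the card, not to any
		 * DAI: the jacks are wired to the CODEC's analogue
		 * pins and no digital interface is involved. */
		simple-audio-card,widgets =
			"Headphone", "Headphone Jack",
			"Microphone", "Mic Jack";
		simple-audio-card,routing =
			"Headphone Jack", "HP_OUT",
			"MIC_IN", "Mic Jack";

		simple-audio-card,cpu {
			sound-dai = <&ssi0>;
		};

		link0_codec: simple-audio-card,codec {
			sound-dai = <&codec>;
		};
	};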