On 1/25/23 21:14, Wesley Cheng wrote:
> The QC ADSP is able to support USB playback endpoints, so that the main
> application processor can be placed into lower CPU power modes. This adds
> the required AFE port configurations and port start command to start an
> audio session.
>
> Specifically, the QC ADSP can support all potential endpoints that are
> exposed by the audio data interface. This includes, feedback endpoints
> (both implicit and explicit) as well as the isochronous (data) endpoints.
> The size of audio samples sent per USB frame (microframe) will be adjusted
> based on information received on the feedback endpoint.

I think you meant "support all potential endpoint types"? It's likely that
some USB devices have more endpoints than what the DSP can handle, no?

And that brings me back to the question: what is a port, and what is the
relationship between port, backend and endpoints?

Sorry for being picky on terminology, but if I learned anything in my days
in standardization, it's that there should be no ambiguity on concepts,
otherwise everyone is lost at some point.

> static struct afe_port_map port_maps[AFE_PORT_MAX] = {
> +	[USB_RX] = { AFE_PORT_ID_USB_RX, USB_RX, 1, 1},
>  	[HDMI_RX] = { AFE_PORT_ID_MULTICHAN_HDMI_RX, HDMI_RX, 1, 1},
>  	[SLIMBUS_0_RX] = { AFE_PORT_ID_SLIMBUS_MULTI_CHAN_0_RX,
>  					SLIMBUS_0_RX, 1, 1},

And if I look here, a port seems to be a very specific AFE concept related
to the interface type? Do we even need to refer to a port in the USB parts?