Re: [RFC PATCH 02/14] ASoC: qcom: qdsp6: Introduce USB AFE port to q6dsp

Hi Pierre,

On 1/4/2023 3:33 PM, Pierre-Louis Bossart wrote:


On 12/23/22 17:31, Wesley Cheng wrote:
The QC ADSP is able to support USB playback and capture, so that the
main application processor can be placed into lower CPU power modes.  This
adds the required AFE port configurations and port start command to start
an audio session.

It would be good to clarify what sort of endpoints can be supported. I
presume the SOF-synchronous case is handled, but how would you deal with
async endpoints with feedback (be it explicit or implicit)?


Sure, both explicit and implicit feedback endpoints are expected to be supported by the audio DSP, as well as synchronous endpoints. We have the logic there to modify the audio sample size accordingly.

Note that it would be very hard to justify not supporting async endpoints: quite a few devices are exposed as async to work around an obscure legacy issue on Windows, but are really SOF-synchronous endpoints that never ask for any change of pace.

Signed-off-by: Wesley Cheng <quic_wcheng@xxxxxxxxxxx>
---
  .../sound/qcom,q6dsp-lpass-ports.h            |   1 +
  sound/soc/qcom/qdsp6/q6afe-dai.c              |  47 +++++
  sound/soc/qcom/qdsp6/q6afe.c                  | 183 ++++++++++++++++++
  sound/soc/qcom/qdsp6/q6afe.h                  |  46 ++++-
  sound/soc/qcom/qdsp6/q6dsp-lpass-ports.c      |  23 +++
  sound/soc/qcom/qdsp6/q6dsp-lpass-ports.h      |   1 +
  sound/soc/qcom/qdsp6/q6routing.c              |   8 +
  7 files changed, 308 insertions(+), 1 deletion(-)

diff --git a/include/dt-bindings/sound/qcom,q6dsp-lpass-ports.h b/include/dt-bindings/sound/qcom,q6dsp-lpass-ports.h
index 9f7c5103bc82..746bc462bb2e 100644
--- a/include/dt-bindings/sound/qcom,q6dsp-lpass-ports.h
+++ b/include/dt-bindings/sound/qcom,q6dsp-lpass-ports.h
@@ -131,6 +131,7 @@
  #define RX_CODEC_DMA_RX_7	126
  #define QUINARY_MI2S_RX		127
  #define QUINARY_MI2S_TX		128
+#define USB_RX				129

the commit message says the DSP can support playback and capture, but
here there's capture only ...



Sorry, I will update the commit message accordingly. At the moment we have only verified audio playback.

static const struct snd_soc_dapm_route q6afe_dapm_routes[] = {
+	{"USB Playback", NULL, "USB_RX"},

... but here RX means playback?

I am not sure I get the convention on directions and what is actually
supported?


The naming is based on the direction in which audio data flows relative to the DSP. For playback, the DSP receives (RX) the audio data it will send out; for capture, it transmits (TX) the audio data it has received. (Although we do not support capture yet.)

+struct afe_param_id_usb_cfg {
+/* Minor version used for tracking USB audio device configuration.
+ * Supported values: AFE_API_MINOR_VERSION_USB_AUDIO_CONFIG
+ */
+	u32                  cfg_minor_version;
+/* Sampling rate of the port.
+ * Supported values:
+ * - AFE_PORT_SAMPLE_RATE_8K
+ * - AFE_PORT_SAMPLE_RATE_11025
+ * - AFE_PORT_SAMPLE_RATE_12K
+ * - AFE_PORT_SAMPLE_RATE_16K
+ * - AFE_PORT_SAMPLE_RATE_22050
+ * - AFE_PORT_SAMPLE_RATE_24K
+ * - AFE_PORT_SAMPLE_RATE_32K
+ * - AFE_PORT_SAMPLE_RATE_44P1K
+ * - AFE_PORT_SAMPLE_RATE_48K
+ * - AFE_PORT_SAMPLE_RATE_96K
+ * - AFE_PORT_SAMPLE_RATE_192K
+ */
+	u32                  sample_rate;
+/* Bit width of the sample.
+ * Supported values: 16, 24
+ */
+	u16                  bit_width;
+/* Number of channels.
+ * Supported values: 1 and 2

that aligns with my feedback on the cover letter: if you connect a
device that can support more than 2 channels, should the DSP even expose
this DSP-optimized path?


My assumption is that, since the DAIs are programmed with the PCM formats supported by the DSP, the ASoC core should not allow userspace to choose that path if the hw_params don't match.

Oh and I forgot, what happens if there are multiple audio devices
connected, can the DSP deal with all of them? If not, how is this handled?


This is one topic we left fairly open ended. At least in our implementation, only one audio device can be supported at a time; we choose the most recent device plugged in or discovered by the USB SND class driver.

+ */
+	u16                  num_channels;
+/* Data format supported by the USB. The supported value is
+ * 0 (#AFE_USB_AUDIO_DATA_FORMAT_LINEAR_PCM).
+ */
+	u16                  data_format;
+/* this field must be 0 */
+	u16                  reserved;
+/* device token of actual end USB aduio device */

typo: audio


Thanks

+	u32                  dev_token;
+/* endianness of this interface */
+	u32                   endian;

Is this a USB concept? I can't recall any part of the USB audio class
spec saying the data can be big or little endian.


No, this is probably just something our audio DSP uses on the AFE commands that it receives.

+/* service interval */
+	u32                  service_interval;
+} __packed;

+int afe_port_send_usb_dev_param(struct q6afe_port *port, struct q6afe_usb_cfg *cfg)
+{
+	union afe_port_config *pcfg = &port->port_cfg;
+	struct afe_param_id_usb_audio_dev_params usb_dev;
+	struct afe_param_id_usb_audio_dev_lpcm_fmt lpcm_fmt;
+	struct afe_param_id_usb_audio_svc_interval svc_int;
+	int ret = 0;
+
+	if (!pcfg) {
+		pr_err("%s: Error, no configuration data\n", __func__);

can you use a dev_err() here and the rest of the code?


Sure.

Thanks
Wesley Cheng


