Re: [PATCH 6/6] OMAPDSS: HDMI: Create platform device to support audio

On 2012-10-23 03:48, Ricardo Neri wrote:

>>> +#if defined(CONFIG_OMAP4_DSS_HDMI_AUDIO)
>>> +#define HDMI_AUDIO_MEM_RESOURCE 0
>>> +#define HDMI_AUDIO_DMA_RESOURCE 1
>>
>> I don't see much point with these definitions. They are hdmi.c internal,
>> so the audio driver can't use them, and so they aren't really fixed.
> 
> I just thought it could make the code more readable; but if the
> resources array is going to be local, then they are not helpful.

My point was that if the defines are hdmi.c internal, you'd need to add the
same defines to the audio code as well in order to use them. And then
we'd have the same defines in two places.

Or, if the audio code doesn't need them to parse the resources, then they
aren't really relevant here either: you are just adding two resources
to the array, and their order is not important.
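For what it's worth, something like this is roughly what I'd picture on
the hdmi.c side (just an untested sketch, the device name and parameters
are made up):

#include <linux/err.h>
#include <linux/ioport.h>
#include <linux/kernel.h>
#include <linux/platform_device.h>

static int hdmi_create_audio_pdev(struct platform_device *pdev,
				  resource_size_t phys_base,
				  resource_size_t size,
				  unsigned int dma_req)
{
	/* One MEM resource for the HDMI audio registers/port and one
	 * DMA resource for the sDMA request line. */
	struct resource res[2] = {
		DEFINE_RES_MEM(phys_base, size),
		DEFINE_RES_DMA(dma_req),
	};
	struct platform_device *aud_pdev;

	/* The resources are copied by the platform core, and the audio
	 * driver looks them up by type, so the array indices don't
	 * need named defines. */
	aud_pdev = platform_device_register_resndata(&pdev->dev,
			"omap-hdmi-audio", -1, res, ARRAY_SIZE(res),
			NULL, 0);

	return IS_ERR(aud_pdev) ? PTR_ERR(aud_pdev) : 0;
}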

>> So, how will this work? All the audio related functions will be removed
>> from the (video) hdmi driver, and the audio driver will access the
>> registers independently? The audio driver will still need to access the
>> video parts, right?
> That could be a new approach, but the idea here is to continue having an
> omapdss audio interface for audio drivers to use.

Ok. Do you have a git tree with the audio code working with this
approach? Or could you just copy-paste a few lines showing how the audio
driver uses this? It'd be easier to understand by seeing that side of
the code as well.

Does the audio use sDMA for the transfer?
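
To show what I'm after, I'd expect the probe on the audio side to be
roughly like this (purely hypothetical, untested, names made up):

#include <linux/ioport.h>
#include <linux/platform_device.h>

static int omap_hdmi_audio_probe(struct platform_device *pdev)
{
	struct resource *mem, *dma;

	/* Look up the resources registered by hdmi.c by type and
	 * index, so no shared index defines are needed. */
	mem = platform_get_resource(pdev, IORESOURCE_MEM, 0);
	dma = platform_get_resource(pdev, IORESOURCE_DMA, 0);
	if (!mem || !dma)
		return -ENODEV;

	/* mem->start is the physical address of the HDMI audio port,
	 * dma->start the sDMA request number for the transfer. */
	dev_dbg(&pdev->dev, "audio port at 0x%llx, sDMA request %lu\n",
		(unsigned long long)mem->start, (unsigned long)dma->start);

	return 0;
}

Is that about what your audio driver does?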

> The root problem that I am trying to address is that the omapdss audio
> interface does not have functionality for DMA transfer of audio samples
> to the HDMI IP. Also, I am not sure how that could be done without
> duplicating the functionality that ASoC already provides.

Ok. But the audio driver still needs access to the HDMI registers? I'm
not worried about passing the DMA resource; the video side doesn't use
that. But the video side does use the registers, and if both drivers have
the same ioremapped area we could end up with both writing to the same
register. Or perhaps not the same register, but still doing conflicting
things at the hardware level at the same time.

>> I feel a bit uneasy about giving the same ioremapped register space to
>> two independent drivers... If we could split the registers to video and
>> audio parts, each driver only ioremapping their respective registers,
>> it'd be much better.
> 
> Fwiw, the audio drivers (at least my audio drivers) will not ioremap.
> They will just take the DMA request number and port. Maybe splitting the
> register space into audio and video is not practical, as we would end up
> having many tiny address spaces.

Yes, if there's no clear division of the HDMI block into video and audio
parts, then splitting it up doesn't sound good either, if it would leave
us with lots of small address spaces.

What registers does the audio side need to access? Why are some of the
registers accessed via the hdmi driver API, and some directly? I imagine
it'd be better to do one or the other, but not both.

 Tomi

