Re: ehci-sched.c uses wMaxPacketSize but should use actual isoc urb packetsize

Hi,

On 11/29/2009 11:11 PM, Alan Stern wrote:
> On Sun, 29 Nov 2009, Hans de Goede wrote:

>>> Normal perhaps, but it's very common for the packet sizes to vary.
>>> For example, consider an audio stream running at the usual CD data
>>> rate: 44100 samples per second.  That's 44.1 samples per frame, so one
>>> out of every ten packets will have to be larger than the others.
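
(Side note: the arithmetic here is just a fractional accumulator; a quick
userspace illustration, not driver code:

#include <stdio.h>

int main(void)
{
	unsigned int rate = 44100, acc = 0;	/* samples/sec, remainder */
	int frame;

	/* 1000 frames per second at full speed; spread the 0.1-sample
	 * remainder so every 10th packet grows by one sample */
	for (frame = 0; frame < 10; frame++) {
		unsigned int samples = rate / 1000;

		acc += rate % 1000;
		if (acc >= 1000) {
			acc -= 1000;
			samples++;	/* the larger packet: 45 samples */
		}
		printf("frame %d: %u samples\n", frame, samples);
	}
	return 0;
}

which prints 44 samples for nine frames and 45 for the tenth.)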


>> Note I'm not talking about the size of the packets actually sent by
>> the hardware, those can fluctuate pretty wildly (at least with webcams);
>> I'm talking about the buffer size allocated inside the urbs as submitted
>> from the driver to the core. And my proposal is to use that buffer size
>> (at least the one of the initial urb) to calculate the (maximum)
>> bandwidth usage.
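
To make concrete which buffer size I mean, this is roughly how a webcam
driver fills in its isoc urbs (function and variable names made up for
illustration; completion handler etc. omitted):

static int cam_alloc_isoc_urb(struct usb_device *udev, unsigned int ep_addr,
			      void *buf, int npackets, int psize,
			      struct urb **out)
{
	struct urb *urb = usb_alloc_urb(npackets, GFP_KERNEL);
	int i;

	if (!urb)
		return -ENOMEM;

	urb->dev = udev;
	urb->pipe = usb_rcvisocpipe(udev, ep_addr);
	urb->interval = 1;
	urb->transfer_flags = URB_ISO_ASAP;
	urb->transfer_buffer = buf;
	urb->transfer_buffer_length = npackets * psize;
	urb->number_of_packets = npackets;
	for (i = 0; i < npackets; i++) {
		/* psize is the driver's choice here, and may well be
		 * smaller than the endpoint's wMaxPacketSize */
		urb->iso_frame_desc[i].offset = i * psize;
		urb->iso_frame_desc[i].length = psize;
	}
	*out = urb;
	return 0;
}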

> There's nothing that says the buffer size has to remain constant
> either.  Besides, we can't use the buffer size to determine the
> bandwidth allocation if the allocation is done at the time the
> altsetting is installed.

>> For a driver to submit multiple urbs for the same isoc ep with different
>> buffer sizes would be a really strange thing to do IMHO, and as said
>> we can add a check for drivers which do, and simply report an error then.

> No, I still think this is a bad idea.  For example, a driver might
> reduce the buffer size because it wants to send fewer packets.


Ok.

>>> On the other hand, it has been suggested that new programming
>>> interfaces be added to the core, allowing drivers to change the
>>> interval and maxpacket values.  This would affect both the host's
>>> copies of the descriptors and the bandwidth allocation.
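
For concreteness, I picture the suggested interface as something like this
(entirely made up, nothing like it exists in the core today):

/* Hypothetical: re-negotiate the descriptor values and the bandwidth
 * reserved for an isoc endpoint of the currently installed altsetting. */
int usb_adjust_isoc_endpoint(struct usb_device *udev,
			     struct usb_host_endpoint *ep,
			     u16 new_maxpacket, int new_interval);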


>> Hmm, I don't really like this, but it could work. In the case of an
>> OHCI host, that would mean the webcam driver would need to do something
>> to force ed_get() to re-calculate the bandwidth after it has changed
>> the maxpacketsize.

> The calculations would no longer be done in ed_get().  They would be
> done in the USB core.

>> I could make the driver try to reserve bandwidth first, and then set
>> the maxpacketsize based on what it managed to get.

>> Note that what it currently does for webcams which do have alt settings
>> is simply to try to start the stream at the highest-bandwidth alt
>> setting; if that does not work (fails with -ENOSPC), try again at a
>> lower alt setting, rinse, repeat.
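
In rough C that retry loop looks like this (start_isoc_stream() being a
stand-in for the driver's actual urb submission path):

	for (alt = highest_alt; alt > 0; alt--) {
		if (usb_set_interface(udev, ifnum, alt))
			continue;
		ret = start_isoc_stream(cam);	/* submits the isoc urbs */
		if (ret != -ENOSPC)
			break;	/* started, or failed for a real reason */
	}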

>> But I could make it use bandwidth reservation for this special case,
>> assuming that using bandwidth reservation does not cause a call to
>> ed_get(). Or I could reset the alt-setting each time I change the
>> maxpacketsize, so:
>>   calculate new maxpacketsize
>>   set alt 0
>>   set alt 1
>>   override maxpacketsize

>> This should also cause ed_get() to recalculate the load using the new
>> maxpacketsize.
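
The override step in that sequence would just poke the host's copy of the
endpoint descriptor before the first urb is submitted, along these lines
(ep_index and new_maxp made up; this is the hack under discussion, not a
blessed interface):

	usb_set_interface(udev, ifnum, 0);	/* tears down the old ed */
	usb_set_interface(udev, ifnum, 1);	/* re-install streaming alt */
	intf->cur_altsetting->endpoint[ep_index].desc.wMaxPacketSize =
		cpu_to_le16(new_maxp);
	/* ed_get() sees the patched value when the first urb goes out */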

> You could do this now, and it might work with ohci-hcd.  But it
> wouldn't affect uhci-hcd, and it would have to change later when the
> bandwidth decisions are moved into the core.


Ok, may I place one feature request then, which could even be implemented
before this new scheduling code gets written: can we have a
usb_set_interface_ex()?

It would take an extra parameter: a callback function which can modify the
interface description for the chosen interface + alt-setting before it gets
used / stored.

In this case that could be used to change the reported wMaxPacketSize for
the isoc ep, and in general it could be used to work around bugs in devices
with broken descriptors.
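
Something along these lines is what I have in mind (signature invented,
names obviously subject to bikeshedding):

typedef void (*usb_altsetting_fixup_t)(struct usb_host_interface *alt);

int usb_set_interface_ex(struct usb_device *dev, int ifnum, int alternate,
			 usb_altsetting_fixup_t fixup);

/* example fixup for the webcam case: clamp the advertised packet size */
static u16 new_maxp;	/* computed from the bandwidth we managed to get */

static void cam_clamp_maxpacket(struct usb_host_interface *alt)
{
	alt->endpoint[0].desc.wMaxPacketSize = cpu_to_le16(new_maxp);
}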

If this sounds like an acceptable solution, I would be happy to take a shot
at writing a patch for this.

Regards,

Hans
