Re: Forcepad interface design proposal

Hey!

I just have a few notes on top of what Benjamin said.

On Wed, Apr 10, 2019 at 06:51:45PM +0200, Benjamin Tissoires wrote:
> Hi Sean,
> 
> On Wed, Apr 10, 2019 at 1:29 AM Sean O'Brien <seobrien@xxxxxxxxxxxx> wrote:
> >
> > Hello,
> >
> > I'm currently working on designing an interface for controlling "forcepads";
> > that is, touchpads with force sensors and haptic actuators. Below is my
> > proposal for the protocol at both the userspace and HID interfaces. I would
> > appreciate any feedback you might have.
> 
> Thanks a lot for starting this discussion. This is indeed an
> interesting topic, and I had a lengthy chat with Peter this morning :)
> 
> First, I'd like to get a small clarification.
> In the proposal, you are making reference to both the HID side (so the
> hardware protocol) and the kernel side (the client protocol).
> As I read it, you are trying to build two things:
> - the HID design proposal, aimed at hardware makers
> - the kernel behavior on how to use it
> 
> If that is the case, I am glad that you can put some pressure on
> hardware makers to force a sensible protocol :)
> 
> Though, I think we should make a clear separation between the two. We
> need to think of it as a whole, but the document should clearly draw
> the boundaries, IMO.
> 
> >
> > Thank you,
> >
> > Sean O'Brien
> > Chromium OS Touch/Input Team
> >
> > Background
> > ==========
> >
> > There are multiple independent projects to develop a touchpad with force sensors
> > and haptic actuators, instead of a traditional button. These “forcepads” have
> > several advantages and potential uses; they allow clicking across the entire
> > touchpad surface, adjusting the force requirement for clicks, haptic feedback
> > initiated by UI, etc.
> >
> > Objective
> > =========
> >
> > Develop a standard protocol to allow userspace applications to communicate with
> > forcepads, and minimize duplicated code and effort.
> >
> > Requirements:
> > 1. Support UI-initiated haptic feedback.
> > 2. Allow userspace to control when button press and button release haptic
> >    effects are triggered. (Useful when detecting a false click, changing force
> >    thresholds, or sending context-dependent effects)
> > 3. Allow a backward-compatible mode where the forcepad produces button press and
> >    button release effects autonomously.
> > 4. Reveal force sensor readings to userspace applications.
> >
> > Proposal
> > ========
> >
> > I propose standardized forcepad support in the Linux kernel at both the HID and
> > userspace interface.
> >
> > Userspace Interface
> > -------------------
> >
> > Multitouch
> > ..........
> >
> > The Linux kernel has a well-defined protocol [0] for multitouch devices,
> > including touchpads. The protocol defines a field called ABS_MT_PRESSURE as a
> > measure of the pressure applied by a contact. Unfortunately, it is not used
> > that way in practice; instead, it is generally used as an approximate measure of
> > contact area. I will distinguish these concepts by calling them “true force” and
> > “traditional pressure”.
> 
> Well, yes and no. It depends on the underlying technology.
> For capacitive sensors, the "traditional pressure" reported through
> ABS_MT_PRESSURE is the surface contact area. This is a rough
> equivalent of a pressure, assuming we all have somewhat flexible
> fingers of roughly the same shape. But for resistive touchscreens,
> IIRC we already have access to the "true force".
> 
> IIRC, when both ABS_MT_TOUCH_MAJOR and ABS_MT_TOUCH_MINOR are reported, libinput
> ignores ABS_MT_PRESSURE. So in a sense, there is no need to add a new
> ABS event, this one works quite well.

just as an extra note here: the kernel doc for ABS_MT_PRESSURE says "The
pressure, in arbitrary units, on the contact area", so that's what we do in
libinput (and the old synaptics driver) at least. We have arbitrary
thresholds, but they are per device or device series. So *how* the pressure
was generated by the hardware itself is already meaningless, because all it
gives us is some continuous data without useful reference points.
IOW, using ABS_MT_PRESSURE will work just fine.

And we can't just assume "if resolution is set, units are $foo" because
nothing written in the last decade or so will assume that. Some extra flag
is needed, like INPUT_PROP_FORCEPAD.
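
Just to make this concrete, the client-side check would be the usual
EVIOCGPROP bitmask test. A minimal sketch, assuming a new INPUT_PROP_FORCEPAD
bit gets allocated (the 0x07 value below is made up, nothing like it exists in
input.h yet):

    #include <string.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    #define INPUT_PROP_FORCEPAD 0x07        /* hypothetical, not upstream */

    static int is_forcepad(int fd)
    {
            unsigned char props[INPUT_PROP_MAX / 8 + 1];

            memset(props, 0, sizeof(props));
            if (ioctl(fd, EVIOCGPROP(sizeof(props)), props) < 0)
                    return 0;

            return props[INPUT_PROP_FORCEPAD / 8] &
                   (1 << (INPUT_PROP_FORCEPAD % 8));
    }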

> >
> > Instead of using ABS_MT_PRESSURE to report the “traditional pressure,” forcepads
> > would send “true force” values in this field. For each contact, this field would
> > report the estimated force applied by that contact.
> >
> > The resolution of ABS_MT_PRESSURE should also be defined and reported, so that
> 
> Please define here what units the protocol expects. And please make it
> international (g/m2, N/m2 or whatever). Though Newtons might be
> difficult to grasp for people.

> > userspace consumers can translate to force units. By defining the resolution, we
> > also differentiate it from how it is used to report “traditional pressure”,
> > where it has no resolution. Userspace consumers will be able to use this to
> > detect that this is a forcepad, and treat the pressure field accordingly.
> 
> I tend to disagree. It is best in that case to explicitly mark the
> forcepad as such by adding a new INPUT_PROP. INPUT_PROP_FORCEPAD would
> be nice.
> 
> The reason is you don't know for sure that the reported unit is
> correct for the existing devices. If you want this to actually matter,
> then you can enforce the unit to be correct for a certified device,
> and then only export INPUT_PROP_FORCEPAD for those devices (thinking of
> the Win8 certification blob for multitouch screens/touchpads).
> 
> Note that defining what an INPUT_PROP_FORCEPAD is would be nice. For
> example, is user-controlled haptic feedback a requirement?
> 
> I would suggest:
> - can differentiate between at least 5 fingers
> - correct resolution for the X/Y (units and value)
> - report correct force per touch, and correct units for them
> - follows the MT protocol type B
> 
> So for hardware vendors, we would require them to follow the MS spec for
> input devices in Win8 and Win8.1, and in addition to support the Simple
> Haptic Controller HID table and report correct units for the
> pressure.
> 
> >
> > ABS_PRESSURE may be optionally reported as the total force applied to the
> > forcepad.
> >
> > The device/driver shouldn’t detect button clicks; this is left to the userspace
> > gesture library. Accordingly, the driver should not send BTN_* events to
> > userspace in normal operating mode. However, it should still report the ability
> > to produce such events, for use in autonomous mode.
> 
> For backward compatibility, and to be able to debug it properly, you
> should keep the BTN_* events emulated in all cases.
> The userspace can ignore the events it doesn't want this way, but you
> will be able to debug the btn emulations on your current session
> without having to kill your compositor.
> There shouldn't be much of an overhead in forwarding those events, as they
> will never come alone, and will always be accompanied by at least one other
> event (pressure being 0 or less).
> 
> Also, not sending BTN_TOUCH and BTN_LEFT might give some headaches to
> legacy applications.
> 
> >
> > Haptic Control
> > ..............
> >
> > The force feedback protocol [1] should be used to request predefined effects.
> 
> s/request/control/
> 
> >
> > Typical use of the force feedback protocol requires loading effects to the
> > driver by describing the output waveform, and then requesting those effects
> > using an ID provided by the driver. We don’t want to describe the output
> > waveform explicitly, but instead use a set of predefined IDs for the desired
> > effect types. The device/driver would be responsible for having the effects
> > loaded and ready to respond to a request for the predefined IDs.
> 
> Re-reading through this made my head a little bit clearer.
> 
> I think I am starting to see what you are saying, but the proposal
> would need to be more precise here.
> 
> >
> > The force feedback protocol will need to be extended to allow requests for
> > predefined IDs. This requires a new feedback effect type:
> >
> >     /**
> >      * struct ff_predefined_effect
> >      * @level: strength of the effect
> >      * @vendor: ID of the vendor who defined the effect
> >      * @id: ID of the effect among those defined by the vendor
> >      */
> >     struct ff_predefined_effect {
> >             __s16 level;
> >             __u16 vendor;
> >             __u16 id;
> >     };
> >
> > Vendors can define specifications for the waveforms and assign them IDs. They
> > could then be requested using their vendor ID and the waveform ID, as defined in
> > the simple haptic HID protocol [2].
> >
> > To allow a standard way to trigger press and release effects, all forcepads
> > should support the WAVEFORM_PRESS and WAVEFORM_RELEASE through this interface.
> > Since the standard waveform id namespace doesn’t overlap with the vendor
> > waveform id namespace, the vendor id can be ignored for these waveforms.
> 
> First of all, I think you should split the HID (device/firmware)
> requirements from how they are used in the kernel.
> 
> IMO, it's fine to say that only touchpads exporting the "simple haptic
> HID protocol [2]" will be supported by this proposal.
> 
> And FWIW, the Microsoft Surface Dial already exposes this HID
> collection, and can be used to toy around with the proposal (just saying).
> 
> If you intend to be more generic, we should not refer to this HID
> collection but explain how things are going to take shape in the
> kernel.
> 
> That being said, there are a few additions/changes I'd like to see.
> 
> The way the input_ff protocol works is:
> - userspace uploads an effect through EVIOCSFF and gets an id for it
> - this effect is supposedly stored in the device itself
> - userspace can then play the effect whenever it wants by referencing the
> id in an input_event written to the evdev node
> - eventually, userspace can remove the effect with EVIOCRMFF
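
(For reference, the flow above in code, roughly, with one of the existing
effect types; an untested sketch, but EVIOCSFF, the EV_FF write and EVIOCRMFF
are the API as it exists today:)

    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/input.h>

    static void toy_with_ff(int fd)
    {
            struct ff_effect effect;
            struct input_event play;

            /* upload: the effect is stored and the kernel assigns it an id */
            memset(&effect, 0, sizeof(effect));
            effect.type = FF_RUMBLE;            /* one of the existing types */
            effect.id = -1;                     /* -1: let the kernel pick a slot */
            effect.u.rumble.strong_magnitude = 0x8000;
            effect.replay.length = 100;         /* ms */
            ioctl(fd, EVIOCSFF, &effect);       /* effect.id is now filled in */

            /* play: write an EV_FF event referencing the effect id */
            memset(&play, 0, sizeof(play));
            play.type = EV_FF;
            play.code = effect.id;
            play.value = 1;                     /* play it once */
            write(fd, &play, sizeof(play));

            /* and eventually remove the effect again */
            ioctl(fd, EVIOCRMFF, effect.id);
    }
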
> 
> This doesn't entirely match the simple haptic HID protocol.
> 
> As I read the documentation, the simple haptic HID only defines a set
> of waveform types:
> WAVEFORM_NONE, WAVEFORM_STOP, WAVEFORM_CLICK,
> WAVEFORM_BUZZ_CONTINUOUS, WAVEFORM_RUMBLE_CONTINUOUS, WAVEFORM_PRESS,
> and WAVEFORM_RELEASE. It also defines a "vendor" range where vendors
> can put any waveform they see fit and define in their haptic device.
> 
> So, I think that's why you want to augment the ff protocol by adding
> the struct ff_predefined_effect.
> 
> After thinking more about it, you are correct, we cannot express
> those "simple haptic HID" waveforms in the current ff protocol. But I
> think the new struct should be focused on HID, not vendors, and
> contain the whole effect settings. All the current struct ff_*_effect
> also have the period, the magnitude, the offset, etc...
> 
> Re-reading through the HID HUTRR, I think we should add Intensity,
> Repeat Count, and Duration.
> 
> So I would prefer to have a more generic ff_effect:
> struct ff_hid_effect {
>     __u16 hid_usage;
>     __s16 intensity;
>     __u16 repeat_count;
>     __u16 duration;
> };
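
(Side note from me: I assume this would hang off the existing struct ff_effect
union, something like the sketch below; the FF_HID type and the "hid" member
are of course made up at this point:)

    /* sketch only, not what is in include/uapi/linux/input.h today */
    struct ff_effect {
            __u16 type;                 /* would gain a new FF_HID type */
            __s16 id;
            __u16 direction;
            struct ff_trigger trigger;
            struct ff_replay replay;

            union {
                    struct ff_constant_effect constant;
                    struct ff_ramp_effect ramp;
                    struct ff_periodic_effect periodic;
                    struct ff_condition_effect condition[2];
                    struct ff_rumble_effect rumble;
                    struct ff_hid_effect hid;       /* proposed addition */
            } u;
    };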
> 
> The id is already part of struct ff_effect, so this should allow us to
> "upload" a predefined hid effect.
> 
> Note that as I read it, we do not want auto-triggered haptic feedback
> here. If we do, we would need to add another field to the
> ff_hid_effect struct.
> 
> 
> The next thing I'd like to get some clarification on is how these
> effects are used. As I read it, userspace "uploads" the effect, and
> uses it, but the HID HUTRR doesn't say that the effects and parameters
> are stored in the device itself. They are for the auto-triggered ones,
> but given that we want to opt for manual triggering, we should store
> them ourselves.
> 
> So, the solution we came up with this morning, while talking to Peter, was
> that the HID driver for a simple haptic HID device would allocate a
> virtual device memory to store the effects and their parameters.
> 
> This way, we can:
> - upload effect WAVEFORM_RELEASE with its parameters in id 0 of the
> drvdata of the device
> - upload effect WAVEFORM_PRESS with its parameters in id 1 of the
> drvdata of the device
> - ...
> - upload effect WAVEFORM_VENDOR_ZZZ_ZZZ with its parameters in id N of
> the drvdata of the device -> userspace will use it while scrolling for
> instance
> - ...
> 
> Then, on a BTN_LEFT press, the kernel can automatically trigger the effect
> with id 1, and the one with id 0 on release, in the case of the
> autonomous mode mentioned below.
> 
> To solve the question of knowing which effect should be loaded in
> which slot, I think we should rely on a userspace helper (udev?).
> We definitely do not want the kernel to keep a list of device-to-effect
> matches, but having a udev database (hwdb and intrinsic?)
> would nicely solve the issue, as we would not need to update the kernel
> for each new device coming in.
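
(For illustration, such an hwdb entry could look roughly like the below; the
match key format is the normal evdev hwdb one, but the property names are
entirely made up here:)

    # 70-forcepad-haptics.hwdb (hypothetical)
    evdev:input:b0018v1234p5678*
     HAPTIC_WAVEFORM_PRESS_INTENSITY=80
     HAPTIC_WAVEFORM_PRESS_DURATION=10
     HAPTIC_WAVEFORM_RELEASE_INTENSITY=60
     HAPTIC_WAVEFORM_RELEASE_DURATION=10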
> 
> From the kernel driver, we can populate the WAVEFORM_PRESS and
> WAVEFORM_RELEASE with some sensible parameters, but userspace should
> be allowed to override them.
> 
> The advantage of having this virtual memory of device effects is that
> each userspace implementation could use its own matching for effects.
> For example, libinput might want to say:
> - id 0 -> BTN_LEFT released
> - id 1 -> BTN_LEFT pressed
> - id 0x1000 -> scrolling up
> - id 0x1001 -> scrolling down
> - id 0x2042 -> hard press
> 
> But Chrome OS might use different ids and different meanings for them.
> The first two ids should be reserved for the kernel left-button
> emulation though.
> 
> Note that given that the effect memory is 'virtual' in the driver, we
> can use any id and we are not limited to a small number of them.
> 
> Another thing is that we do not need to require WAVEFORM_PRESS and
> WAVEFORM_RELEASE. If they are set in the haptic device, the kernel can
> set them for us, but if they are not, userspace can always decide to
> put one of the other waveforms in ids 0 and 1.
> 
> We should mention that not defining these two waveforms will induce a
> state where no haptic feedback will be automatically loaded by the
> kernel, though (userspace will have to do it).
> 
> >
> > Autonomous mode
> > ---------------
> >
> > In order to facilitate an operating system which cannot handle force feedback,
> > the forcepad should start up in “autonomous mode”, meaning it acts as a normal
> > touchpad. This means it should perform the press and release haptic feedback
> > autonomously at predefined force thresholds, and send the appropriate BTN_*
> > events.
> 
> I think that if we were to follow my idea of using effect ids 0 and 1
> for release/press, we should mention that the kernel will play the
> effects stored as these ids.
> 
> >
> > After verifying that all of the required haptic effects are available through
> > the force feedback protocol, the OS can enable host-controlled mode. This could
> > be done by writing to a sysfs node “host_feedback_enabled”.
> 
> You don't really want a sysfs file here, because if the input part
> generating the haptic feedback crashes, or if you are VT-switching and
> your evdev node gets revoked, you are left with a system without haptic
> feedback for clicks.

Extra note here: compositors don't have write access to sysfs. Some do, but
the default assumption is that they don't.

> So, our idea was to use a new pair of ioctls, EVIOCFFTAKEMASTER and
> EVIOCFFRELEASEMASTER (or whatever is more politically acceptable), with
> which the kernel would refcount each effect id.
> 
> So, if the OS decides to take control over the press autonomous haptic
> feedback, it will emit the ioctl EVIOCFFTAKEMASTER with the effect id
> 1. If another client also wants to control this id, it can also take
> control, and the total count of clients controlling this effect id
> is now 2. If a BTN_LEFT is emitted by the device (or emulated by the
> kernel), the driver will check the refcount of the effect id, see
> that there is at least one client, and will not forward the haptic effect
> to the device.
> Conversely, in this case, effect id 0 is not handled by
> userspace, and the kernel will control it when a BTN_LEFT release is
> emitted.
> 
> Note that this ioctl will only have an impact on effect ids 0 and 1.

hmm, I don't think there's a technical reason for that limitation, is there?
simply specify it as "any effect ID that does not have a current master
will not get auto-played by the kernel". Simple enough, and it leaves the
actual usage more flexible.

> The benefit of using an ioctl is that if the client closes the fd, or
> if the fd gets revoked, the kernel can decrement the usage count and
> eventually switch back to the autonomous mode described above.

I'll add a shorter summary for the normal case, just to make it easier to
comprehend the whole thing:
- the kernel has default effects for ids 0/1
- some process (udev) uploads device-specific waveforms where needed, for ids
  0/1 and others if need be
- the kernel always sends BTN_LEFT when some hardcoded threshold is met
- when sending BTN_LEFT, the kernel plays id 0/1 for release/press
- if a process has taken master (EVIOCFFTAKEMASTER) for a given id, the kernel
  does not play it and relies on that client to do so.

This approach means:
- default behaviour is predictable and useful
- device-specific updates happen in userspace (udev)
- the entity controlling the click-buzz (e.g. libinput) doesn't need to be
  aware of the device-specific waveforms
- VT-switch/compositor change/suspend/... (i.e. close(fd)) restores default
  behaviour
- other processes can use the pad for haptic feedback without interfering or
  having to care about whether the session supports forcepads for button
  handling
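
To make the flow concrete, the client side could look roughly like the sketch
below. To be clear: FF_HID, the ff_hid_effect union member, WAVEFORM_PRESS as
a C constant and EVIOCFFTAKEMASTER are all just proposals from this thread and
do not exist in the kernel today; only EVIOCSFF and the EV_FF write are the
current API.

    /* hypothetical sketch of a compositor taking over the press feedback */
    static void take_over_press_feedback(int fd)
    {
            struct ff_effect effect;
            struct input_event play;

            /* replace the press effect in slot 1 (proposed layout:
             * id 0 = release, id 1 = press) with our own parameters */
            memset(&effect, 0, sizeof(effect));
            effect.type = FF_HID;                     /* proposed new type */
            effect.id = 1;
            effect.u.hid.hid_usage = WAVEFORM_PRESS;  /* proposed field/constant */
            effect.u.hid.intensity = 0x4000;
            effect.u.hid.duration = 10;               /* ms */
            ioctl(fd, EVIOCSFF, &effect);             /* existing upload ioctl */

            /* tell the kernel we drive this id ourselves (proposed ioctl) */
            ioctl(fd, EVIOCFFTAKEMASTER, effect.id);

            /* later, when the gesture code decides a click happened, play it
             * through the existing EV_FF mechanism */
            memset(&play, 0, sizeof(play));
            play.type = EV_FF;
            play.code = effect.id;
            play.value = 1;
            write(fd, &play, sizeof(play));
    }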

> > When the host enters suspend mode, the OS will not be able to respond quickly
> > enough to input from the touchpad to tell it to perform haptic feedback, making
> > the touchpad feel unresponsive. When the host suspends, the touchpad should
> > enter autonomous mode.
> 
> This could be achieved in my proposal above by either closing the fd,
> or revoking it, so we would get this for free :)
> 
> >
> > HID Interface
> > -------------
> 
> So this is the part where we tell ODMs what to do. Correct?
> 
> >
> > Multitouch
> > ..........
> >
> > The HID API for multitouch reports is mostly unchanged except:
> >
> > 1. The tip pressure field [3] should be used to report “true force” instead of
> >    “traditional pressure”. The physical unit type, exponent, and limits should
> >    be reported in the report descriptor for the “true force” field [4].
> 
> works for me
> 
> > 2. The device will always report its button as being unpressed, except in
> >    autonomous mode, when it will report the button state according to its
> >    predefined force thresholds.

fwiw, devices that report capabilities but never send those events are a
nightmare to deal with because it forces heuristics and guesswork into the
clients, and those can be wrong at any point. Filtering events because we
pretend to know better than the kernel driver is a lot easier.
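
By "filtering" I mean something as simple as the below on the client side
(process_event() is a placeholder for whatever the client does with events):

    static void read_events(int fd, int handle_clicks_ourselves)
    {
            struct input_event ev;

            while (read(fd, &ev, sizeof(ev)) == sizeof(ev)) {
                    /* keep the kernel's BTN_LEFT emulation on the device,
                     * just skip it when we synthesize clicks from the
                     * per-contact force ourselves */
                    if (handle_clicks_ourselves &&
                        ev.type == EV_KEY && ev.code == BTN_LEFT)
                            continue;

                    process_event(&ev);
            }
    }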

Cheers,
   Peter

 
> I think I would prefer having a basic left button emulation that we
> can rely on when userspace doesn't know about forcepads.
> This would allow us to not have to emulate it in the kernel, but just
> forward the data as it comes in.
> 
> Also, this would allow the driver to not have to drive the haptic
> feedback in the autonomous mode: we just set the auto trigger on
> BTN_LEFT usage, and done.
> (if we do not enforce the auto triggering, then the driver will also
> have to manually send the haptic events on BTN_LEFT 0/1).
> 
> If we are giving requirements to ODMs, I think we also need:
> 
> 3. a specific application usage that tells us that the device is a forcepad.
> Ideally, I would love to extend the HUT, but we might consider saying
> that a forcepad, to be exposed as such, needs to expose a "Simple
> Haptic Controller" logical collection alongside the Touch Pad one.
> 
> That's what the Surface dial is doing:
> - application collection: System Multi-Axis Controller
>   - logical collection: Puck
>   - logical collection: Simple Haptic Controller
>     - logical collection: Durations List
>     - logical collection: Waveforms List
> - endof application collection
> 
> So this would make clear what a HID forcepad is.
> 
> 4. a way to store in the HID device the pressure threshold at which the
> device is supposed to report the click.
> We can guess which threshold it is if the unit is correct, but I think
> I would rather have the device tell us this. Problem is, I cannot
> find a good HID usage for that.
> 
> 5. If a device doesn't meet those criteria above, it will not get
> categorized as a FORCEPAD and will not get all the fancy FF effects
> from the OS.
> 
> This last point is very important as this will prevent us from having to
> deal with quirky devices.
> 
> Note that MS solved the issue by requiring ODMs to actually test their
> devices; MS then issues a blob in a vendor feature that is checked
> to enable (or not) the touchscreen / touchpad gestures.
> 
> >
> >
> > Haptic control
> > ..............
> >
> > The simple haptic HID protocol [2] should be used.
> 
> s/should/must/
> 
> >
> > The following waveforms should be supported:
> >
> > | WAVEFORM_NONE            | Implicit waveforms required by protocol           |
> > | WAVEFORM_STOP            |                                                   |
> > | ------------------------ | ------------------------------------------------- |
> > | WAVEFORM_PRESS           | To be used in autonomous mode and host-controlled |
> > | WAVEFORM_RELEASE         | mode to simulate button press and release.        |
> 
> I would not absolutely require those two. If they are present, the
> kernel (driver) should set them as the default feedback (in the
> autonomous mode), but if they aren't, it's not a big deal;
> userspace will have to quirk the device.
> 
> > | ------------------------ | ------------------------------------------------- |
> > | Vendor-defined waveforms | Optional waveforms to be used in host-controlled  |
> > |                          | mode, subject to vendor specification.            |
> 
> I think we should just say that the device is free to implement all
> the waveforms it wants as per HUTRR 63b.
> 
> >
> > All waveforms will have an associated duration; continuous waveforms will not be
> > supported.
> >
> > Only manual triggering will be supported through this interface. Autonomous
> > triggering of waveforms is enabled by putting the device in autonomous mode.
> 
> This part is ambiguous. The auto triggering in HUTRR 63b says that
> when the device emits a specific HID usage, it will also emit the
> matching haptic effect.
> While here, you want the kernel to drive the button press/release
> (unless I am reading it wrong).
> 
> Also, I think you are mixing here the HID requirements and the
> exported kernel API. I think there is nothing wrong to comply with the
> HUTRR 63b. We can use the auto-trigger functionality in the autonomous
> mode described above. The point is that we do not want to export that
> to the userspace.
> 
> >
> > Retriggering (queueing multiple triggers of the same waveform) is not supported.
> 
> This will simplify the new struct ff_hid_effect, but I am not sure there
> is a point in preventing the haptic device from having that capability.
> 
> >
> > If intensity modification for waveforms is supported by the device, the
> > intensity control should be included in the manual trigger output report. This
> > allows modification of the intensity on a per-waveform basis.
> 
> yep
> 
> So I think we should simply say that the haptic feedback needs to
> follow HUTRR 63b, with the following amendments:
> - continuous waveforms will be ignored by the host
> - Repeat Count will be ignored by the host
> - Auto mode trigger might get used for the autonomous mode described
> above, but can be optional
> 
> >
> > Alternatives Considered
> > =======================
> >
> > Add a “true force” field to the multi-touch protocol
> > ----------------------------------------------------
> >
> > This allows us to send “traditional pressure” in addition to “true force”. It
> > also allows another possible protocol in addition to sending the force per
> > contact:
> >
> > * Sending overall force and center of force: Should be easier to calculate
> >   depending on force sensor layout, and provides potentially useful extra info
> >
> > As mentioned before, there is already a concept of pressure in the multi-touch
> > protocol, generally used as a proxy for contact area. We would add another field
> > to represent force. However, the pressure field is defined by the protocol as
> > actual pressure, and I feel this would only add more confusion. The touch_major
> > and touch_minor fields can be used to report contact area explicitly, so we're
> > not losing that capability.
> 
> Nope, I am not buying this idea. As mentioned above, ABS_MT_PRESSURE
> was always intended to report the pressure of the contact. It is an
> approximation in the capacitive sensor case, but there is no point in
> adding one new event. Also, if we set the INPUT_PROP_FORCEPAD, it is
> now clear that the unit and thus the resolution are reliable.
> 
> >
> > Report overall force, but not per-contact force
> > -----------------------------------------------
> >
> > Instead of reporting the “traditional pressure,” send “true force” values in the
> > ABS_PRESSURE field. The ABS_MT_PRESSURE field would still be used to send
> > “traditional pressure” for each contact.
> >
> > I'm not convinced this is necessary, or the best idea. It makes backward
> > compatibility easy, but other input libraries could detect forcepads by the
> > resolution associated with the ABS_MT_PRESSURE field, and ignore that field,
> > using touch_major and touch_minor instead.
> >
> > In addition, it adds even more confusion than the above option, so I think I'd
> > prefer a clean break.
> 
> I think I like having the ABS_PRESSURE as the total force applied. But
> we need per-contact force if this is available.
> And again, I don't think we should mention the "traditional pressure"
> as this is just a quirk of the capacitive sensors, and the actual
> pressure was reported before those sensors became mainstream (think of
> resistive ones).
> 
> >
> > Use driver-wide gain for force feedback
> > ---------------------------------------
> >
> > The force feedback protocol also has a mechanism to set driver-wide gain, which
> > could be used to set global effect strength level. However, allowing different
> > strength levels for each predefined effect would allow more flexibility e.g.: if
> > the user wants to have low strength for UI feedback effects and high strength
> > for button click effects.
> 
> I also think the best would be to have per effect intensity.
> 
> Anyway, thanks a lot for putting the effort in this.
> 
> The last bit I wanted to add is that we should properly define the
> haptic and touch protocol in a way that we can support non-"simple
> haptic HID" controllers. For example, I doubt the Apple forcepad
> follows the HID spec, but if the driver can provide the same level of
> quality we require to mark a device as INPUT_PROP_FORCEPAD, we don't
> really care if the haptic interface is HID or not.
> 
> Cheers,
> Benjamin
> 
> >
> > [0]: https://www.kernel.org/doc/Documentation/input/multi-touch-protocol.txt
> > [1]: https://www.kernel.org/doc/Documentation/input/ff.txt
> > [2]: https://www.usb.org/sites/default/files/hutrr63b_-_haptics_page_redline_0.pdf
> > [3]: Usage ID 0x30 of HID usage table 0x0D. See chapter 16:
> >      https://www.usb.org/sites/default/files/documents/hut1_12v2.pdf
> > [4]: See section 6.2.2.7 of the HID specification:
> >      https://www.usb.org/sites/default/files/documents/hid1_11.pdf


