> There is a plugin for flash (libflashsupport) for this. flashplugin's
> built in alsa support used to be buggy and would kill pulse - not sure
> if it still is tho'.

I don't understand why one needs any special libraries. FlashPlugin can already output sound to an ALSA device. So if I add pcm.!default { type pulse } to my asoundrc, and if pulse correctly implements the ALSA API, then any application that uses the API correctly - including FlashPlugin - should be able to use the new 'default' device completely transparently, am I wrong? If that's not the case, I claim something is seriously wrong with the Linux audio system.

> I suspect that someone will want to write a bluetooth headset plugin for
> pulse that can detect when a bluetooth headset is activated (which would
> essentially replace the need to have the bt mac addr in asoundrc). It
> could be intelligent enough to be configured so that streams with a
> certain regexp are automatically moved such that if e.g. skype or ekiga
> is mid call and I just open my headset, by the time I fit it to my ears
> pulse has already moved the stream across.... that is theoretically
> possible but not implemented yet.

Why would we need that? IMHO, the existing bluez-audio does it very well already. I hope I can just define a virtual ALSA device called 'headset' of type 'bluetooth', just like in my asoundrc above, and then Pulse should be able to treat this device - completely transparently - just like it was a hardware sound card. Can it do that?

The automatic sending of Skype or Ekiga streams to the headset can already be done, can't it? I mean, the documentation says that one can assign default audio sinks to individual applications?

L.
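
P.S. To be concrete, this is roughly the 'default' redirection I have in mind - an untested sketch of an ~/.asoundrc, assuming the ALSA-to-pulse plugin (libasound_module_pcm_pulse from alsa-plugins) is installed:

    # route the ALSA 'default' PCM and mixer to PulseAudio
    pcm.!default {
        type pulse
    }
    ctl.!default {
        type pulse
    }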
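And the 'headset' device I mentioned would look something like the following - again only a sketch, since the exact keys depend on the version of the bluez-audio ALSA plugin, and the address is a placeholder for the headset's MAC:

    # virtual ALSA device backed by the bluez-audio plugin
    pcm.headset {
        type bluetooth
        device "00:11:22:33:44:55"    # placeholder MAC of the headset
    }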
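As for moving a running Skype or Ekiga stream to the headset, if I read the docs right it should boil down to something like this (assuming the PulseAudio version in use already exposes the stream-moving commands, and that 'headset' names the corresponding sink):

    # find the index of the application's stream, then move it
    pacmd list-sink-inputs
    pacmd move-sink-input <index> <sink-name>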