I have some general questions about sound in Linux: the interplay of PulseAudio (and ESD), GStreamer, jackd and ALSA. Let me state what I think I know.

ALSA provides the drivers that allow sound to be produced, and also the MIDI subsystem. You can use ALSA on its own for audio or MIDI, but if you want more than one program to use sound at once (say the alert sounds that GNOME and KDE make, plus a music program), you need something like PulseAudio or ESD, or dmix within ALSA. This is because a program that opens the sound card directly takes exclusive use of it and can't release it to other programs.

If you want to use MIDI and audio together, as in Rosegarden, you need jackd, because it allows more than one program to access the sound card at a time. It is also useful because, with the right kernel and privileges, your audio program can take realtime priority over other processes, so you don't get glitches while you are recording a masterpiece.

GStreamer fits into the normal ESD/PulseAudio picture by providing access to the codecs that files come in.

Now this is where my understanding fails: GStreamer has a JACK module (so I hear rumoured, but I can't find it in the Ubuntu repos), and so does PulseAudio. How do these fit in with the jackd stuff? Presumably they each make it possible for JACK to work with one or both of those sound servers.

I welcome corrections to my interpretation of what's happening.

Shelagh
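
P.S. To make parts of my mental model concrete, here are some sketches of what I think I mean; please treat them as guesses rather than tested recipes. For dmix, I believe a ~/.asoundrc along these lines lets several plain ALSA programs play at once through a software mixer (the "hw:0,0" card name is just my guess at the first card, and I gather many recent ALSA setups enable dmix by default anyway):

    # ~/.asoundrc (sketch): route the default PCM through a dmix software mixer
    pcm.!default {
        type plug
        slave.pcm "dmixer"
    }

    pcm.dmixer {
        type dmix
        ipc_key 1024          # any unique integer identifying the shared mixer
        slave {
            pcm "hw:0,0"      # first sound card, first device (a guess)
        }
    }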
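
For the "right kernel and privileges" part, what I have in mind is giving the audio group realtime scheduling and locked memory, then starting jackd with realtime enabled. The exact limit values, and hw:0 as the device, are assumptions on my part:

    # /etc/security/limits.conf (or a file under limits.d); re-login afterwards
    @audio  -  rtprio   95
    @audio  -  memlock  unlimited

    # start JACK in realtime mode on the first ALSA card (rate/period values are guesses)
    jackd -R -d alsa -d hw:0 -r 44100 -p 128 -n 2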
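
As for the rumoured GStreamer JACK module, my guess is that the element is called jackaudiosink and lives in one of the extra plugin sets (perhaps the "bad" plugins package, which might be why I can't find it in the standard Ubuntu repos). If that's right, something like this ought to decode a file with GStreamer and hand the audio to JACK:

    # needs jackd already running; element and package locations are my assumptions
    gst-launch filesrc location=song.ogg ! decodebin ! audioconvert ! audioresample ! jackaudiosink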
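
And for PulseAudio's JACK support, I understand there are modules that make Pulse appear as a JACK client, so desktop sounds go through JACK instead of fighting it for the card. Assuming the module package (pulseaudio-module-jack on Ubuntu, I think) is installed and jackd is running:

    # make PulseAudio play into JACK and record from it
    pactl load-module module-jack-sink
    pactl load-module module-jack-source

Is that roughly how the pieces are meant to fit together?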