On Fri, May 28, 2004 at 01:37:46PM -0400, Ivica Ico Bukvic wrote:
> Forgot to add that my assumption is (in addition to my previous statement)
> if JACK was then running using reasonably small buffers the drift would be
> then minimized if not alleviated since JACK is one that is dispatching the
> buffers at appropriate time, right?

But each card has its own internal hardware clock that determines the
rate at which the DACs consume (and the ADCs produce) data. Software
cannot change this. Unless the clocks are locked at the hardware level,
one card will produce & consume data to/from JACK's ALSA I/O layer
faster than the other. Pretty soon you are screwed.

If the cards *are* synced at the hardware level, it should work. This
can be done either via word-clock connections, if the cards support
that, or by a quick hack: connect the S/PDIF output on one card to the
S/PDIF input on the other, and set the second card to use its S/PDIF
input as its clock source. This hack assumes you have S/PDIF I/O and
are willing to sacrifice it for synchronization.

-- 
Paul Winkler
http://www.slinkp.com
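
A back-of-the-envelope sketch of why the drift bites so quickly. The
100 ppm clock mismatch and the 64-frame JACK period below are assumed,
illustrative figures, not measurements from any particular card:

    /* Rough drift estimate for two free-running sound cards.
     * The 100 ppm mismatch is a typical ballpark for consumer crystal
     * oscillators -- an assumption, not a measured value.
     */
    #include <stdio.h>

    int main(void)
    {
        const double rate_hz = 48000.0;  /* nominal sample rate          */
        const double ppm     = 100.0;    /* assumed clock mismatch (ppm) */
        const double period  = 64.0;     /* assumed JACK frames/period   */

        /* One card runs fast relative to the other by this many samples
         * per second. */
        double drift_per_sec = rate_hz * ppm * 1e-6;

        /* Time until the accumulated offset exceeds one JACK period,
         * i.e. until one card has over- or under-run by a full buffer. */
        double seconds_to_slip = period / drift_per_sec;

        printf("drift: %.1f samples/sec -> a %g-frame period slips in %.0f s\n",
               drift_per_sec, period, seconds_to_slip);
        return 0;
    }

With those assumed numbers the cards drift apart by about 4.8 samples
per second, so a full 64-frame period is lost after roughly 13 seconds;
smaller JACK buffers only make the slip show up sooner.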