Currently the biggest possible sink latency is 10 seconds. The total
latency of the loopback is divided evenly between the source, an
intermediate buffer and the sink, so if I want to test a 10 s sink
latency, the total needs to be three times that, i.e. 30 seconds.
---
 src/modules/module-loopback.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/src/modules/module-loopback.c b/src/modules/module-loopback.c
index e24ded5..b3b9557 100644
--- a/src/modules/module-loopback.c
+++ b/src/modules/module-loopback.c
@@ -811,7 +811,7 @@ int pa__init(pa_module *m) {
         channels_set = true;
 
     latency_msec = DEFAULT_LATENCY_MSEC;
-    if (pa_modargs_get_value_u32(ma, "latency_msec", &latency_msec) < 0 || latency_msec < 1 || latency_msec > 2000) {
+    if (pa_modargs_get_value_u32(ma, "latency_msec", &latency_msec) < 0 || latency_msec < 1 || latency_msec > 30000) {
        pa_log("Invalid latency specification");
        goto fail;
    }
-- 
1.8.1.2