Properties to suppress save/restore of stream volumes

On Mon, 2011-02-07 at 10:52 +0530, Arun Raghavan wrote:
> On Thu, 2011-02-03 at 19:19 +0200, Tanu Kaskinen wrote: 
> > On Thu, 2011-02-03 at 10:37 +0000, Colin Guthrie wrote:
> > > I guess the real problem would be how to interface with xfade in gst
> > > such that gst applications would be essentially ignorant of "how it's
> > > done", such that gst-on-alsa works as transparently (to the app) as
> > > gst-on-pulse.
> > 
> > I'm skeptical about the feasibility of doing this fully transparently. If
> > the backend doesn't support cross-fading, transparency would probably
> > mean that alsasink (and pulsesink too if the daemon doesn't have the
> > support enabled) would have to implement cross-fading inside itself...
> > Or maybe the answer is an xfadesink, which is a bin that builds the
> > required pipeline behind the scenes if the final sink doesn't support
> > cross-fading? Arun probably has a better idea of this side...
> 
> I'm still trying to work out the best way to integrate this with the
> GStreamer folk, but as basic background: GStreamer has a Controller API
> that basically allows you to set a property (volume being a property
> here) from value 'va' at time 'ta' to value 'vb' at time 'tb' using a
> specified interpolation method for intermediate points (GStreamer
> supports a number of these).

After some thought/discussion, trying to map the GstController API to
what we want to do in PulseAudio seems to be a bad idea. The API is too
generic and we have some very specific use-cases in mind.
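
(For reference, what the Controller API gives you looks roughly like the
untested sketch below. The function names here are from the newer
controller API and are only meant to illustrate the "value 'va' at time
'ta' to value 'vb' at time 'tb'" idea, not something we'd ship as-is.)

/* Untested sketch: ramp an element's "volume" property with the GStreamer
 * controller API.  Assumes gst_init() has been called and that 'vol' ends
 * up in a running pipeline. */
#include <gst/gst.h>
#include <gst/controller/gstinterpolationcontrolsource.h>
#include <gst/controller/gstdirectcontrolbinding.h>

static GstElement *
make_fading_volume (void)
{
  GstElement *vol = gst_element_factory_make ("volume", NULL);
  GstControlSource *cs = gst_interpolation_control_source_new ();

  /* Interpolate linearly between the control points set below. */
  g_object_set (cs, "mode", GST_INTERPOLATION_MODE_LINEAR, NULL);

  /* Bind the control source to the "volume" property; control values are
   * normalised (0.0-1.0) and mapped onto the property's range. */
  gst_object_add_control_binding (GST_OBJECT (vol),
      gst_direct_control_binding_new (GST_OBJECT (vol), "volume", cs));

  /* Control value 1.0 at t=0, 0.0 at t=2s, i.e. a two-second fade-out. */
  gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
      0, 1.0);
  gst_timed_value_control_source_set (GST_TIMED_VALUE_CONTROL_SOURCE (cs),
      2 * GST_SECOND, 0.0);

  gst_object_unref (cs);
  return vol;
}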

> > > I'm not sure how this would work in practice at the PA end, but I would
> > > guess some kind of protocol/API extension (as is done for m-s-r and
> > > m-d-m right now) rather than a part of the core protocol (but perhaps
> > > this would be justified?).
> > 
> > I would definitely do it in a module + an extension. A track-change
> > cross-fade would be possible without an extension API, though, and
> > probably also fade-in: just set properties
> > "module-xfade.previous-stream" and "module-xfade.duration" (optional).
> > No protocol extensions needed. previous-stream would be the index of the
> > stream to fade out (maybe a negative number for fade-in, i.e. no previous
> > stream). The cross-fade processing would automatically start when that
> > property is detected by module-xfade on a new stream. Fade-out would
> > have to be signaled through some extension API still. I don't know,
> > though, whether fade-in and fade-out are in the requirements, or whether
> > cross-fade is the only thing that's needed.
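
(To make Tanu's suggestion concrete: the client side of that proposal is
just stream properties, something like the untested fragment below.  The
"module-xfade.*" keys are only the names proposed above -- no such module
exists yet -- and 'context', 'ss' and 'old_idx' are assumed to come from
the existing client code.)

/* Untested sketch of the property-based cross-fade proposal. */
pa_proplist *p = pa_proplist_new ();
char prev[16];

snprintf (prev, sizeof (prev), "%u", old_idx);
pa_proplist_sets (p, "module-xfade.previous-stream", prev); /* stream to fade out */
pa_proplist_sets (p, "module-xfade.duration", "5000");      /* optional; say, ms  */

pa_stream *next = pa_stream_new_with_proplist (context, "Next track",
    &ss, NULL, p);
pa_proplist_free (p);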
> 
> I'd thought of this when I started doing the fades on pulsesink but was
> hoping I could get away with a less intrusive solution. Alas, it was not
> meant to be. :)
> 
> Since the application knows better than PA when a stream is meant to be
> faded (end of stream, start of playback, hitting next/previous,
> play/pause), I think the actual control of fading should be left there.
> On the PulseAudio side, we would extend the API with something like
> pa_context_set_sink_input_volume_with_ramping() which takes a target
> volume and fade-time (by now you're probably going "A-ha! There's
> already a pa_sink_input_set_volume_with_ramping()!" ;)). 

If there are no objections to this approach, we could expose the fading
API on pulsesink via a property (set it to trigger a fade) and a signal
(for the fade-done callback).
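
(From the application's point of view that would look something like the
fragment below.  The "fade-volume"/"fade-duration" properties and the
"fade-done" signal are made up here purely for illustration -- nothing is
decided yet.)

/* Hypothetical pulsesink fade API -- names are placeholders. */
static void
on_fade_done (GstElement *sink, gpointer user_data)
{
  /* e.g. post EOS here, or start the next track */
}

/* ... and wherever the sink is set up: */
g_signal_connect (pulsesink, "fade-done", G_CALLBACK (on_fade_done), NULL);

/* Setting the property triggers the ramp towards the target volume. */
g_object_set (pulsesink,
    "fade-volume", 0.0,     /* target (linear) volume    */
    "fade-duration", 2000,  /* ramp length, milliseconds */
    NULL);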

For the completion notification, I realised we would just pass the
callback to the _set_volume_with_ramping() function like the rest of the
API, so it seems nothing new needs to be done there.
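
(Concretely, following the existing pa_context_set_sink_input_volume()
signature, I'd expect the prototype to end up roughly like this.  This is
a sketch, not final; the ramp_time parameter and its type are just my
guess at the fade-time argument:)

/* Sketch only: ramping variant of the existing volume call.  The success
 * callback doubles as the fade-done notification, as discussed above. */
pa_operation *pa_context_set_sink_input_volume_with_ramping (
    pa_context *c,
    uint32_t idx,                 /* sink input to ramp */
    const pa_cvolume *volume,     /* target volume      */
    pa_usec_t ramp_time,          /* ramp/fade length   */
    pa_context_success_cb_t cb,
    void *userdata);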

Now all that remains is to figure out whether this belongs in core or in
an extension. I think it fits in core (a _with_ramping variant of
existing core API), but I'm not religious about it. :)

-- Arun



