Hi

----- Original Message -----
> > > Hi,
> > >
> > > Yes, you want to use EGL here. I think we could probably put more
> > > code in qemu to help with this case.
> >
> > Sure, if anything is needed we'll get that sorted ;)
> >
> > I suspect spice-server needs access to the gl context helpers
> > (dpy_gl_ctx_*) if it wants to use OpenGL.
>
> I would prefer something more low-level (like EGLDisplay, EGLContext or
> EGLSurface).
>
> I'll do some more digging; it is not clear what a possible interface
> should be, but having duplicate EGL initialization (in both qemu and
> spice-server) surely does not look like a good thing.
>
> In the meantime I was trying gstreamer-vaapi (thanks to Christophe) and
> did some more digging. GStreamer seems to assume that a dmabuf is
> mmap-able; I doubt a buffer created via
> gst_vaapi_surface_new_from_egl_image() can be mmap'ed by vaapi*enc.
>
> Regarding the vaapi implementations, the Intel one looks quite good,
> but for other cards the situation is not that great. In particular,
> for Nvidia the plugin is an adapter for vdpau, which does decoding,
> not encoding. Encoding for Nvidia is done with NvENC (which looks
> hard to install on Linux).

Yep, a handful of encoding APIs to deal with. Hopefully GStreamer can
abstract them away for us. Anyway, that's the best place in the stack
to do that, imho. I think it would be better to ask all these gstreamer
questions on the gst-devel ML.

> Another problem I got (I should open a bug) is that qemu with virgl
> wakes up spice 40-50 times a second (calling spice_qxl_wakeup) for no
> apparent reason.

There are some gui timer bits that could be disabled, iirc.

_______________________________________________
Spice-devel mailing list
Spice-devel@xxxxxxxxxxxxxxxxxxxxx
https://lists.freedesktop.org/mailman/listinfo/spice-devel