On Wed, Jul 8, 2015 at 8:58 AM, Steven Newbury <steve@xxxxxxxxxxxxxxx> wrote:
>
> On Tue Jul 7 15:12:28 2015 GMT+0100, Alex Deucher wrote:
>> On Tue, Jul 7, 2015 at 9:46 AM, Steven Newbury <steve@xxxxxxxxxxxxxxx> wrote:
>> >
>> > I've tried xserver-1.16, the ddx, and libdrm without LTO and with
>> > gcc 4.9. Exactly the same thing. I wondered whether the unused i810
>> > could be interfering, but triggering a device "remove" before
>> > starting X made no difference.
>> >
>> > I'm at a bit of a loss. I suppose I could try writing a simple test
>> > for drmSetInterfaceVersion(). At least that should determine whether
>> > the xserver/ddx is in the clear.
>> >
>> > Any other ideas?
>> >
>>
>> Can you switch to a non-X runlevel and start X manually as root
>> (assuming you are using a login manager now)?
>>
> My test program worked fine. I considerably improved it over the
> version I posted; I'll send it to the list when I get back.
>
> I removed the drmSetInterfaceVersion() call from radeon_kms.c and it
> got much further. Starting the X server as root apparently worked
> normally according to the log, although there was a permission-denied
> error on the mode set during init. I don't know whether it was related
> or not, but the display then hung with a non-blinking cursor. Strange
> to get permission denied as root!
>
> Starting GNOME via gdm gives a working but slow X session; for some
> reason it only uses software DRI even though the Xorg log shows r200
> DRI2 as initialized. Perhaps it's a config error somewhere?
>
> startx as a regular user just works!
>
> But mutter doesn't; perhaps that's why a GNOME session isn't working.
> It just gives the following error:
> Cogl-ERROR **: Failed to create texture 2d due to size/format constraints
>
> Mutter is supposed to work on r200, right?

IIRC it tries to use a renderbuffer format that isn't supported by the hw.

Alex
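For reference, a minimal standalone test of drmSetInterfaceVersion() along the lines Steven describes might look like the sketch below. This is not the program he posted; the device path (/dev/dri/card0) and the requested interface version numbers are assumptions, chosen to mirror what the radeon DDX typically requests. Note that the SET_VERSION ioctl generally requires DRM master privileges, so running this while another process (e.g. a running X server) holds master is expected to fail with a permission error.

/* Minimal sketch: open a DRM device node and call
 * drmSetInterfaceVersion(), reporting success or the errno.
 * Assumptions: /dev/dri/card0 is the radeon card, and the
 * requested interface version (1.1, driver version left at -1)
 * matches what the radeon DDX asks for. */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <xf86drm.h>

int main(void)
{
    drmSetVersion sv;
    int fd = open("/dev/dri/card0", O_RDWR);

    if (fd < 0) {
        perror("open /dev/dri/card0");
        return 1;
    }

    sv.drm_di_major = 1;   /* DRM interface version 1.1 */
    sv.drm_di_minor = 1;
    sv.drm_dd_major = -1;  /* don't care about the driver-dependent version */
    sv.drm_dd_minor = -1;

    if (drmSetInterfaceVersion(fd, &sv) != 0)
        fprintf(stderr, "drmSetInterfaceVersion failed: %s\n",
                strerror(errno));
    else
        printf("drmSetInterfaceVersion succeeded\n");

    close(fd);
    return 0;
}

Build with something like: gcc drm_test.c $(pkg-config --cflags --libs libdrm), then run it both as root from a console and from within a running X session to compare the results.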