> err:xrender:get_xrender_format_from_color_shifts No XRender format found!
> fixme:d3d_caps:wined3d_guess_card No card selector available for GL vendor 3 and card vendor 8086.
> fixme:win:EnumDisplayDevicesW ((null),0,0x129eee0,0x00000000), stub!
> err:xrender:get_xrender_format_from_color_shifts No XRender format found!
> fixme:d3d_caps:wined3d_guess_card No card selector available for GL vendor 3 and card vendor 8086.
> fixme:win:EnumDisplayDevicesW ((null),0,0x129ef64,0x00000000), stub!
> fixme:ddraw:DirectDrawEnumerateExA flags 0x00000007 not handled
> fixme:d3d_caps:wined3d_guess_card No card selector available for GL vendor 3 and card vendor 8086.
> fixme:win:EnumDisplayDevicesW ((null),0,0x12963c0,0x00000000), stub!
> fixme:x11drv:X11DRV_desktop_SetCurrentMode Cannot change screen BPP from 32 to 16

Gothic I and II both run (nearly) flawlessly on Wine. The last of the above messages indicates that you're running X in 32-bit depth while the game is trying to use 16 bpp. However, it looks like you're also using Intel graphics, which may not work well with the Gothic games, even though they're somewhat old. I played through the first Gothic in Wine (from gog.com) on a Mac with a GMA 3100, but I'm not sure whether the Intel Linux graphics drivers are comparable to those on OS X. It performed a bit slowly, but well enough for my taste.

Ubuntu has a bit of information on this, with a link to a forum post; note that it's a distro-specific post, so you'll need to follow up with the Ubuntu folks if you run into trouble. I'm not sure it specifically applies to 10.04 either, or whether it will help or hurt your setup.

https://help.ubuntu.com/community/Wine#Error:%20Cannot%20change%20screen%20BPP%20from%2032%20to%20XX

Basically, you need to add any resolutions you'd like to use with Gothic at 16 bpp to your X config.
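
For what it's worth, here's a rough sketch of what that can look like in a classic /etc/X11/xorg.conf (the identifier and mode names here are placeholders; use whatever matches your actual hardware). The second Display subsection with Depth 16 is what lets X offer those resolutions when Gothic asks to switch to 16 bpp:

    Section "Screen"
        Identifier    "Default Screen"
        DefaultDepth  24
        SubSection "Display"
            Depth     24
            Modes     "1024x768" "800x600"
        EndSubSection
        # Same resolutions again, but at 16-bit depth for Gothic
        SubSection "Display"
            Depth     16
            Modes     "1024x768" "800x600"
        EndSubSection
    EndSection

If your distro doesn't ship an xorg.conf at all (many recent ones run X without one), you may need to create one first; the Ubuntu page linked above walks through that.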