peyre wrote:
> I've heard that using 24-bit color isn't such a good idea, since--oh, what was it?--the video card has to dither the colors up from 16-bit or down from 32-bit, or something like that. So I figured it might make for better performance if I left it at 16-bit.

Lies. All modern video cards use 32-bit formats internally. When you work in 16-bit they do 16<->32 conversions all the time. Of course this is all transparent and doesn't really affect performance for 2D; 3D, however, will suffer greatly. Running in 16-bit also limits the number of available pixel formats, which most programs won't like - and that's exactly what you're seeing with Wine. Unless you were reading that advice some 15 years ago - back then cards really didn't have that much memory and every byte counted. But they weren't GPUs either...
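If you want to see the "fewer formats at 16-bit" effect for yourself, here's a minimal sketch (not from the original post, just an illustration): it prints the default depth of your X screen and counts how many GLX framebuffer configs the driver exposes. Running it on a 16-bit desktop versus a 24/32-bit one typically shows a noticeably smaller list in the 16-bit case, which is the kind of thing Wine trips over.

```c
/* Sketch: report the X default depth and the number of GLX framebuffer
 * configs the driver advertises on that screen.
 * Build (paths may vary): gcc check_formats.c -o check_formats -lX11 -lGL */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }

    int screen = DefaultScreen(dpy);
    printf("default depth: %d bits\n", DefaultDepth(dpy, screen));

    /* Ask GLX for every framebuffer config on this screen and count them. */
    int nconfigs = 0;
    GLXFBConfig *configs = glXGetFBConfigs(dpy, screen, &nconfigs);
    printf("GLX framebuffer configs available: %d\n", nconfigs);

    if (configs)
        XFree(configs);
    XCloseDisplay(dpy);
    return 0;
}
```

Compare the output with your desktop set to 16-bit and then to 24/32-bit; the exact numbers depend on your driver, but the 16-bit run is usually the shorter list.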