On Sat, 2006-04-22 at 10:52 +0200, Erwin Rol wrote:
> On Sat, 2006-04-22 at 17:05 +0930, n0dalus wrote:
> > On 4/21/06, Erwin Rol <mailinglists@xxxxxxxxxxxx> wrote:
> > > when i switch from 24bit to 16bit mode (I can't test 32bit since that
> > > doesn't seem to be supported with the radeon driver).
> >
> > Isn't 32 bit colour depth just the same as 24 bit with an unused 8 bits?
>
> Well the driver must think there is some difference because it bails out
> with a "not supported" when i select 32bit color depth :-) I dunno if
> the radeon actually does a 24bit packed (3 byte) or a 24bit unpacked
> (4 byte) store in the frame buffer. Also the Matrox cards support this
> 10bit per color thing, so they need 30bit per pixel; there 24bit and
> 32bit really would be different.

For a while now in XFree86/Xorg, setting depth 24 refers to the actual color depth, not the padding. The driver is expected to set the framebuffer depth to 24 or 32 bits, whichever is appropriate for the hardware. AFAIK, on most hardware, padding 24-bit pixels out to 32 bits performs faster. No idea how Xorg plans to handle this newfangled HDR thing...
--
fedora-devel-list mailing list
fedora-devel-list@xxxxxxxxxx
https://www.redhat.com/mailman/listinfo/fedora-devel-list