David writes:

: Oh, I know people are still pushing that line, but it's pretty outdated
: now so far as I can tell. You pretty much can't *buy* CRTs (except for
: at really absurd prices -- and if you're up for the absurd prices, you
: can now buy LCDs that have a bigger color gamut than any CRT; in fact,
: that display a bit MORE than the full Adobe RGB space). And they still
: take only 8-bit data. (Some clever people set up to use LED
: illumination behind the LCD, and then used colored LEDs and matched the
: LCD filter frequencies to the LED output and ended up with huge gamut.
: NEC I know, possibly others use this too.)

SED monitors will be even better than CRTs when they come out - but
they'll still only be fed 8-bit colour from the video card unless it's
a Matrox.

10-bit colour is 10-bit colour. Irrespective of the gamut, 256 shades
vs 1024 shades per channel is four times the number of gradient steps.

Sure, for the average user it's no big deal. It's a bit like the average
music enthusiast who doesn't hear a difference between records and MP3s -
especially when there's a nice thump from the subwoofers and it sounds
good in the car (!). An audio perfectionist easily perceives the
difference through high-end gear.

Which raises the question: 16-bit RAW, TIFF or whatever - how do people
manage them when they can only see 8 bits of it?

http://forums.storagereview.net/index.php?act=ST&f=2&t=9565

"In 'official' oscilloscope tests conducted by some government labs here,
Matrox is still #1"

Gotta get my hands on that white paper...

: And yeah, the most aggressive games cards do tend to trade off color
: fidelity for frame rate, probably a good choice for their intended
: market but a disaster for us. Although I'm not sure if they do that
: when displaying simple bitmap data rather than rendering 3D on the card?

The tests still show a difference. It's been painted as an outdated
concern by the avalanche of hype around lesser 2D cards, but the
difference is there for the eye to see.

karl
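
PS: For anyone who'd rather see the arithmetic than take my word for it,
here's a rough Python sketch (my own illustration, assuming numpy is
installed - nothing to do with Matrox or NEC) that quantises an ideal
gradient to 8 and 10 bits per channel and prints how many steps survive
and how big each jump is:

    import numpy as np

    ramp = np.linspace(0.0, 1.0, 100000)      # an "ideal" smooth gradient

    for bits in (8, 10):
        levels = 2 ** bits                    # 256 vs 1024 codes per channel
        quantised = np.round(ramp * (levels - 1)) / (levels - 1)
        step = 1.0 / (levels - 1)             # size of each visible jump
        print("%d-bit: %d levels, step = %.5f, %d distinct values in the ramp"
              % (bits, levels, step, np.unique(quantised).size))

8 bits gives 256 steps of roughly 0.4% each; 10 bits gives 1024 steps of
roughly 0.1% - and those coarser 0.4% jumps are where the banding in a
slow sky gradient comes from on an 8-bit pipeline.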