> ... then again, since 8-bits can capture all meaningful levels of luminance
> information about the world I don't know why anyone would want more, let
> alone the floating point TIFFs used by HDR shop (one of the best freeware
> games I have ever downloaded)

Quoting from http://www.anyhere.com/gward/hdrenc/hdr_encodings.html :

"We stand on the threshold of a new era in digital imaging, when image files will encode the color gamut and dynamic range of the original scene, rather than the limited subspace that can be conveniently displayed with 20 year-old monitor technology."

And later in the article:

"Most image encodings fall into a class we call output referred standards, meaning they employ a color space corresponding to a particular output device, rather than the original scene they are meant to represent. The advantage of such a standard is that it does not require any manipulation prior to display on a targeted device, and it does not *waste* resources on colors that are outside this device's gamut. Conversely, the disadvantage of such a standard is that it cannot represent colors that may be displayable on some other output device, or may be useful in image processing operations along the way."

"A scene referred standard follows a different philosophy, which is to represent the original, captured scene values as closely as possible. Display on a particular output device then requires some method for mapping the pixels to the device's gamut. This operation is referred to as tone mapping, and may be as simple as clamping RGB values to a 0-1 range, or something more sophisticated, like compressing the dynamic range or simulating human visual abilities and disabilities. The chief advantage gained by moving tone mapping to the image decoding and display stage is that we can produce correct output for any display device, now and in the future. Also, we have the freedom to apply complex image operations without suffering losses due to a presumed range of values."

IMO: it is ironic that many of the same people who have "jumped on the digital bandwagon" / "realised the benefits digital confers" [delete as appropriate] and are quick to dismiss film as "history" now stand as Luddites themselves in the holy war against further change. 8 bits per channel, hardware-specific colorspaces ... why would we need further progress? ;o)

IMO (again): surely the holy grail in imaging is to encode the full range of light intensities in the original scene as the image data, and treat translation to current hardware (and its limitations) as a layer on top. The current 8-bit-per-channel, stuck-with-it-forever dogma forces compromises to be made at the processing stage (RAW file users) or at the taking stage (JPEG users). Computer memory is now incredibly cheap: the question should not be how to justify using higher-resolution encodings for image storage, but how to justify NOT adopting them.

Most current hardware [note the word current] is stuck in the past and can do little with more than 8 bits anyway. But at some point things will give.

Bob
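
PS: For anyone curious what the quoted passage means by tone mapping, here is a minimal Python/NumPy sketch (my own illustration, not code from Ward's article) of the two options it mentions: simple clamping versus dynamic-range compression. The Reinhard-style L / (1 + L) operator is just one common choice for the latter; the pixel values below are made up.

    import numpy as np

    def clamp_tonemap(scene_rgb):
        # Naive tone mapping: clip scene-referred values to the display's 0-1 range.
        # Anything brighter than display white is simply lost.
        return np.clip(scene_rgb, 0.0, 1.0)

    def compress_tonemap(scene_rgb):
        # Global dynamic-range compression (Reinhard-style L / (1 + L)):
        # unbounded scene values are squeezed into 0-1 while keeping relative detail.
        return scene_rgb / (1.0 + scene_rgb)

    def quantize_8bit(display_rgb):
        # Output-referred final step: round display values to 8 bits per channel.
        return np.round(np.clip(display_rgb, 0.0, 1.0) * 255.0).astype(np.uint8)

    # Hypothetical scene-referred pixels spanning a wide dynamic range
    # (values above 1.0 are brighter than the display's white point).
    scene = np.array([0.01, 0.5, 1.0, 4.0, 50.0])
    print("clamped   :", clamp_tonemap(scene))      # all highlights flatten to 1.0
    print("compressed:", compress_tonemap(scene))   # highlights keep some separation
    print("8-bit     :", quantize_8bit(compress_tonemap(scene)))

The point being: if you keep the scene-referred data, either rendering (or a better one invented later) is still available; an 8-bit output-referred file has already made the choice for you.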