pcg@xxxxxxxx (2002-11-01 at 0136.10 +0100):
> Just FYI (I have no specific goal with this mail ;): I met some guy from
> Dreamworks ("Shrek") at the LWE in Frankfurt, and he told me that their
> whole rendering infrastructure is 8 bit, including intermediate results
> (so the whole of Shrek was done at 8 bits, with a later dynamic adjustment
> of the results into the necessary range).

I guess they work with linear data all the way. I say that mainly because
I have been trying some tricks with a 3D app, and they blew up until I
told the app to stop using gamma.

> And finally he told me that the need for 16 bit and floating point is
> there in many but not most cases, so one _can_ get along without it, at
> least for rendered scenes.

But not when you mix real footage and renders at the same time, and not
for a badly tuned render either.

I have been reading and gathering info about this; it seems linear data
with a high range is best. If you cannot have that, you have to choose
how you damage the data, but you can hardly avoid some damage (the PS
below has a quick numerical sketch of that trade-off). Cineon is 10 bit
and non-linear, digital photo cameras are starting to give RAW dumps with
more than 8 bits, and some places already use 32 bit float (I would have
said Dreamworks did too... or at least 16 bit integer)...

Why all this rant? The more info, the better. I am trying to write about
all this, so users know what GIMP can do and how to work around the
problems (or at least get the least noticeable error), and coders can get
info about desired usage.

GSR
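
PS: A minimal numerical sketch of that trade-off. This is not GIMP code;
the 2.2 gamma and the test levels are arbitrary assumptions, just to show
how much relative error dim linear values pick up when squeezed into 8
bits directly versus through a gamma encoding (the same reason Cineon
goes 10 bit log and linear pipelines want 16 bit or float):

/* Compare the 8-bit round-trip error of a dim linear-light value stored
 * directly (linear encoding) versus through a 1/2.2 gamma encoding. */
#include <stdio.h>
#include <math.h>

/* quantize to 8 bits in linear encoding, return the decoded value */
static double
roundtrip_linear (double v)
{
  return floor (v * 255.0 + 0.5) / 255.0;
}

/* quantize to 8 bits after a 1/2.2 gamma encode, decode back to linear */
static double
roundtrip_gamma (double v)
{
  double enc = floor (pow (v, 1.0 / 2.2) * 255.0 + 0.5) / 255.0;

  return pow (enc, 2.2);
}

int
main (void)
{
  /* dim scene-linear levels, where banding is most visible */
  double levels[] = { 0.001, 0.005, 0.01, 0.05, 0.18, 0.5 };
  size_t i;

  printf ("linear in    rel. err 8-bit linear    rel. err 8-bit gamma\n");
  for (i = 0; i < sizeof (levels) / sizeof (levels[0]); i++)
    {
      double v  = levels[i];
      double el = fabs (roundtrip_linear (v) - v) / v;
      double eg = fabs (roundtrip_gamma (v)  - v) / v;

      printf ("%9.4f    %20.4f    %20.4f\n", v, el, eg);
    }

  return 0;
}

The darkest levels come back almost destroyed from the 8 bit linear path,
while the gamma path keeps them within roughly 1%, so if you are stuck
with 8 bits per channel, a non-linear encoding is the less noticeable
error.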