> But alas this is not so and I have found by experiment that it makes no difference if I convert my 12-bit Raw file to 8-bit or 16-bit, PROVIDED that in the end I convert the 16-bit to 8-bit anyway.

Sorry to repeat, but in a way you've skipped the absolutely key step: in converting from RAW to 8-bit, you have already taken decisions (or is the conversion totally automated?) about which information from the raw file to use. You've done most of the major adjustment work by then, so it's no surprise that there's no difference between a 16-bit and an 8-bit workflow from that point on.

I know you're not just a wee snapper, and I agree that once a picture is nearly right (you've got the white and black points nailed and the tones balanced across the range) there's probably little advantage in 16-bit. By then, though, the work's been done.

Then again, I do wonder (speculate) about the future. Can we really see no difference, or is that only because we don't yet have 16-bit output hardware to compare against? If it ever arrives, would anyone regret not having future-proofed ... I really don't know.

Bob
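
P.S. For anyone who wants to see the point concretely, here's a minimal Python sketch (a toy simulation with made-up numbers, not anyone's actual raw converter). Once the data has been squeezed to 8 bits, a strong shadow lift produces the same banding whether you apply it in 8-bit or promote the file to 16-bit first; the same edit applied to the 12-bit data before conversion keeps far more distinct tones.

    import numpy as np

    # Simulated 12-bit sensor data: a smooth shadow gradient covering the
    # darkest ~10% of the 12-bit range (0..409), where banding shows first.
    raw12 = np.linspace(0, 409, 4096).round().astype(np.uint16)

    def to8(x, src_bits):
        # Quantize from src_bits down to 8 bits.
        return (x.astype(np.float64) * 255 / (2**src_bits - 1)).round().astype(np.uint8)

    def to16(x8):
        # Promote 8-bit data to 16-bit; no new information is created.
        return x8.astype(np.uint16) * 257

    def curve(x, maxval):
        # A strong "lift the shadows" edit: multiply by 4, clip to range.
        return np.clip(x.astype(np.float64) * 4, 0, maxval).round().astype(x.dtype)

    # Workflow A: convert to 8-bit first, then edit in 8-bit.
    a = curve(to8(raw12, 12), 255)

    # Workflow B: convert to 8-bit first, promote to 16-bit, edit,
    # then bring it back to 8-bit at the end.
    b = to8(curve(to16(to8(raw12, 12)), 65535), 16)

    # Workflow C: edit the 12-bit data first, then convert to 8-bit.
    c = to8(curve(raw12, 4095), 12)

    print("edit in 8-bit after early conversion: ", len(np.unique(a)), "levels")
    print("edit in 16-bit after early conversion:", len(np.unique(b)), "levels")
    print("edit in 12-bit before conversion:     ", len(np.unique(c)), "levels")

Running this, workflows A and B both end up with 26 distinct output levels (identical banding, 16-bit or not), while workflow C keeps 103. The 16-bit step in workflow B changes nothing because the damage was done at the early 8-bit conversion, which is exactly the point above.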