But Bob, that IS my point!
Once you've done the fundamental stuff in whatever raw converting software takes your fancy, then the way to go forth is via 8-bit.
Qkano <wildimages@xxxxxxxxxxx> wrote:
> But alas this is not so and I have found by experiment that it makes no difference if I convert my 12-bit Raw file to 8-bit or
> 16-bit, PROVIDED that in the end I convert the 16-bit to 8-bit anyway.
Sorry to repeat, but in a way you've skipped the absolutely key step: in converting from RAW to 8-bit you have already taken decisions (or is the conversion totally automated?) about what information from the raw file to use. You've done most of the major adjustment work by that stage, so it's no surprise that there is no difference between a 16-bit and an 8-bit workflow from that point on.
I know you're not just a wee snapper, and I agree that once a picture is nearly right (white and black points nailed, tones balanced across the range) there's probably little advantage to staying in 16-bit; by then, though, the work has already been done.
Then again, I do wonder (speculate) about the future. Can we really see no difference, or is that only because we don't yet have 16-bit output hardware to compare it with? If it ever arrives, would anyone regret not having future-proofed ...
I really don't know.
Bob
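
To put Bob's point in concrete terms: a big adjustment made after the drop to 8-bit throws away output levels, while the same adjustment made at high bit depth and converted at the end does not. A minimal sketch, assuming Python with numpy; the 4x "levels" stretch, the shift-based bit-depth conversions and the simulated 12-bit ramp are illustrative assumptions, not anything from the thread:

import numpy as np

# Simulate a 12-bit sensor ramp (0..4095), standing in for the raw data.
raw = np.arange(0, 4096, dtype=np.uint16)

# Workflow A: convert to 8-bit first, then apply a strong levels stretch
# (e.g. pulling a dark exposure up by 4x) as a later edit.
a8 = (raw >> 4).astype(np.uint8)                               # 12-bit -> 8-bit
a8_stretched = np.clip(a8.astype(np.float64) * 4, 0, 255).astype(np.uint8)

# Workflow B: apply the same stretch in 16-bit, convert to 8-bit at the end.
b16 = (raw << 4).astype(np.uint16)                             # 12-bit -> 16-bit
b16_stretched = np.clip(b16.astype(np.float64) * 4, 0, 65535).astype(np.uint16)
b8 = (b16_stretched >> 8).astype(np.uint8)                     # 16-bit -> 8-bit last

# How many of the 256 possible output levels does each workflow actually use?
print("8-bit-first workflow uses", len(np.unique(a8_stretched)), "levels")
print("16-bit-last workflow uses", len(np.unique(b8)), "levels")

On this ramp the 8-bit-first path ends up with only about 65 distinct levels (the comb-like gaps you see in a posterised histogram), while the 16-bit-last path still fills all 256. That is the loss Bob is describing: it happens at the point the major adjustment is made, not at the final conversion.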
Herschel Mair
Head of the Department of Photography,
Higher College of Technology
Muscat
Sultanate of Oman
Adobe Certified instructor
+ (986) 99899 673