The old RAW vs JPEG (was: Is a Batch of Photos ...)

> Hmm, 'JPGs are fine', that sort of reminds me of what some on this list
> believe. Pity you weren't shooting RAW...

> Overall I found JPGs a poor substitute for a RAW file. No comparison
> if it needed any tweaking, and definitely one stop short on the capture
> tonal range, even when the exposure was spot on.

Jim

I've changed the subject line because this tangent ain't gonna help answer
the original question.

It's one of the ironies of the "digital revolution" that many of those
who have made the jump (leaving us film dinosaurs behind) now refuse to
accept, and in some cases almost religiously oppose, the quality and
flexibility benefits of greater bit depths. It's as if "8 bits good" is
the new mantra, as if 8 bits per channel had been chosen for some logical
reason, when in fact it was largely a historical accident and a processing
convenience. Online and off (last month's EOS magazine, for one) you find
statement after statement insisting that "there is no difference". In that
sense, 8-bit shooters are pretty much in the position slide shooters were
in compared with users of negative film: far less latitude once the
exposure is fixed.
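
For what it's worth, here is a minimal sketch in Python (my own
illustration, nothing from the magazine or this list) of the quantization
argument: apply the same shadow-lifting curve to an 8-bit and a 16-bit
version of a smooth gradient, re-quantize both for display, and count how
many distinct tonal levels survive. The 8-bit source ends up with
noticeably fewer, which shows up as banding in smooth areas after heavy
edits.

import numpy as np

def distinct_levels_after_curve(bit_depth, gamma=2.2):
    levels = 2 ** bit_depth
    ramp = np.linspace(0.0, 1.0, levels)                  # full-range gradient
    quantized = np.round(ramp * (levels - 1))             # stored at this bit depth
    curved = (quantized / (levels - 1)) ** (1.0 / gamma)  # shadow-lifting curve
    # re-quantize to 8 bits for display and count surviving unique codes
    return len(np.unique(np.round(curved * 255)))

print("8-bit source :", distinct_levels_after_curve(8))   # well short of 256
print("16-bit source:", distinct_levels_after_curve(16))  # essentially all of them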

The simple fact, read FACT, is that a real HDR workflow could do away
with almost all exposure worries at the time of capture.  If you want
shadow detail: it's there.  Ditto for highlight detail.
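
And, equally for what it's worth, here is a minimal sketch in Python of one
way an HDR result can be built (my own illustration, with made-up frame
data and exposure times, not anybody's actual workflow): bracketed
exposures are brought back to a common linear scale and blended, with
clipped and empty pixels weighted out, so the shadow detail comes from the
long exposure and the highlight detail from the short one.

import numpy as np

def merge_exposures(frames, exposure_times):
    """frames: linear-light arrays scaled to [0, 1]; exposure_times: seconds."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(frames, exposure_times):
        # trust mid-tones most; near-black and near-clipped pixels get low weight
        w = 1.0 - np.abs(img - 0.5) * 2.0
        acc += w * (img / t)          # scale each frame to a common radiance estimate
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-6)

# toy scene: three brackets of the same gradient, one stop apart
scene = np.linspace(0.0, 2.0, 11)                  # "true" scene radiance
times = [0.25, 0.5, 1.0]
frames = [np.clip(scene * t, 0.0, 1.0) for t in times]
print(merge_exposures(frames, times))              # recovers values the longest frame clipped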

In terms of progress, maybe it really is time for all manufacturers to
adopt Adobe's openly documented "Digital Negative" (DNG) format rather
than their own self-serving proprietary raw formats.

Digital Negative (DNG) main page:
http://www.adobe.com/products/dng/main.html

Bob

