That old jpg v. RAW argument again...

Gary Lawton wrote:

A JPG file is, in effect, an 8-bit compressed version of a TIFF file. However, JPG compression is "lossy", meaning that data is thrown away during the compression process.

The conversion back to TIFF will not recover data that was lost during compression. Nor can it recover the 8 bits of data that were discarded from the 16-bit sensor reading.

It'll work, it just won't give you the best quality.

Gary
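
A quick way to see both losses for yourself is to round-trip some data through 8 bits and a maximum-quality JPG. The sketch below is only illustrative: it assumes NumPy and Pillow are installed, and a random synthetic 16-bit frame stands in for real RAW data.

import numpy as np
from PIL import Image

# Synthetic 16-bit "sensor" frame; random values stand in for real RAW data.
rng = np.random.default_rng(0)
sensor = rng.integers(0, 65536, size=(256, 256), dtype=np.uint16)

# Step 1: 8 of the 16 bits are thrown away when the camera writes a JPG.
eight_bit = (sensor >> 8).astype(np.uint8)
print("distinct levels, 16-bit data:", len(np.unique(sensor)))
print("distinct levels, 8-bit data: ", len(np.unique(eight_bit)))

# Step 2: lossy JPG round trip, even at maximum quality.
Image.fromarray(eight_bit, mode="L").save("test.jpg", quality=95)
jpeg_back = np.asarray(Image.open("test.jpg"), dtype=np.int16)

# Converting the JPG back to TIFF preserves those losses; it cannot undo them.
Image.fromarray(jpeg_back.astype(np.uint8), mode="L").save("test.tif")
print("max pixel error after JPG round trip:",
      int(np.abs(jpeg_back - eight_bit.astype(np.int16)).max()))

The two printed level counts show where the extra bits go, and any non-zero round-trip error is data the JPG compressor discarded for good.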

Peeter Vissak wrote:

I don't have RAW on my pretty simple camera, but I've compared the finest-quality
JPG with a TIFF shot immediately after it, and the shots have been identical.
Well, to my eye.
So I shoot JPG, batch convert (IrfanView!!!) to TIFF immediately, and
work in TIFF later on.
That saves a lot of processing/writing time (this small thing is so painfully
slow!) and, of course, memory.

Peeter
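
For anyone who wants the same batch conversion without IrfanView, a minimal sketch in Python (assuming Pillow is installed; the folder names are just placeholders) would be:

from pathlib import Path
from PIL import Image

# Convert every JPG in src_dir to a TIFF in dst_dir, so that all later
# editing happens on the TIFF copies only.
src_dir = Path("jpg_in")      # placeholder input folder
dst_dir = Path("tiff_out")    # placeholder output folder
dst_dir.mkdir(exist_ok=True)

for jpg in sorted(src_dir.glob("*.jpg")):
    with Image.open(jpg) as im:
        im.save(dst_dir / (jpg.stem + ".tif"))
    print("converted", jpg.name)

As with IrfanView, this only changes the container: any data already lost when the camera wrote the JPG stays lost.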



Three points...
1. Can anyone actually see the difference in the final image between a TIFF and a JPG saved at maximum quality (and not repeatedly saved and recompressed)?? (A rough way to measure this, rather than eyeballing prints, is sketched below.)
2. Can you actually see the difference between a 16-bit TIFF and an 8-bit TIFF??
3. And in any case, if the same image were shot simultaneously on a Canon 20D, Nikon D100, Pentax *st, Sigma 10D, Kodak etc., in RAW format and converted to TIFF without any manipulation, I'll bet that these would almost certainly show distinct differences when printed....
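
One way to put point 1 on a firmer footing than eyeballing prints is to measure how far a maximum-quality JPG actually drifts from the untouched pixels. A rough sketch, again assuming Pillow and NumPy are installed; "original.tif" is just a placeholder for any 8-bit reference file:

import numpy as np
from PIL import Image

# Round-trip a reference image through a maximum-quality JPG and measure the damage.
original = np.asarray(Image.open("original.tif").convert("RGB"), dtype=np.float64)

Image.fromarray(original.astype(np.uint8)).save("roundtrip.jpg", quality=95)
jpg = np.asarray(Image.open("roundtrip.jpg").convert("RGB"), dtype=np.float64)

mse = np.mean((original - jpg) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print("largest per-pixel difference:", int(np.abs(original - jpg).max()))
print("PSNR (higher means closer to the original): %.1f dB" % psnr)

Whether the numbers it prints correspond to a difference anyone can actually see in a print is, of course, exactly the question.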


Just as using different films gave differing results under conditions as near identical as possible!

So what DO we all mean by the "best quality"??

Howard

