Re: DPI and perception question

Let's say you print the entire 20x24 inches (I assumed inches here, as we
are talking about DPI - Dots Per Inch. If you meant something else, cm for
example, you would have said so, eh?)

20*300 * 24*300 = 6,000 * 7,200 = 43,200,000 dots
Assume ink (dye, wax) mixing technology.
Assume each dot is assembled from a 32-bit (4-byte) color word.
This is then 43,200,000 * 4 = 172,800,000 bytes = 172.8 MB.

The math changes with the technology used.
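
Here is that arithmetic as a quick sketch you can run (same assumptions as
above: 300 DPI and a 4-byte color word per dot; a 3-byte RGB layout would
come out smaller):

    # File-size arithmetic for a 20x24 inch print at 300 DPI.
    # Assumption: 4 bytes (32-bit color word) per dot, as above.
    width_in, height_in = 20, 24
    dpi = 300
    bytes_per_dot = 4

    dots = (width_in * dpi) * (height_in * dpi)
    size_bytes = dots * bytes_per_dot
    print(f"{dots:,} dots -> {size_bytes / 1e6:.1f} MB")
    # prints: 43,200,000 dots -> 172.8 MB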

==================================================

The practical resolution limit of the human eye is about 1 arcminute.

s = r * theta (arc length = radius * angle in radians), so the eye can resolve objects of size about r * (1/60) * (pi/180).

At one foot (12 in), then, the eye can resolve dots 12 * (1/60) * (pi/180) = 0.0035" in size.
That works out to 1/0.0035 = 286 dots per inch.

For a creamy smooth, unpixelated look, you DO NOT want to resolve the dots (pixels), so you should have greater than 286 dots per inch. Note: the further away you are from the image, the lower the DPI you need.
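
That derivation, as a sketch you can run (the 1 arcminute figure is the
assumption):

    import math

    # Acuity math above: s = r * theta, with theta = 1 arcminute.
    def min_dpi(viewing_distance_in):
        """DPI at and above which individual dots blur together."""
        one_arcmin = (1 / 60) * (math.pi / 180)      # radians
        dot_size = viewing_distance_in * one_arcmin  # resolvable dot, inches
        return 1 / dot_size

    for d in (12, 24, 36):
        print(f"{d} in viewing distance: {min_dpi(d):.0f} DPI")
    # 12 in: 286 DPI, 24 in: 143 DPI, 36 in: 95 DPI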

Am I right about this?

That took a lot out of me. I need a spiritual consultation with the Reverend Jack Daniels.

Regards,
Bob...
--------------------------------------------------------
"Life isn't like a box of chocolates . .
it's more like a jar of jalapenos.
What you do today, might burn your butt tomorrow."

----- Original Message ----- From: "Paul Weyn" <paul@xxxxxxxxxxxxx>

Hi Andy,
Not sure I understand the process exactly either but when I have a 35mm
negative scanned to be printed at a maximum 20x24 size, the file size
created by my lab is about 80 meg. In my mind, that's huge; however, the
results are extremely good - no pixelation at all. Still, I do not
understand the math because if the printer can only go as high as 300 DPI
then why is so large a file needed...are the rest of the pixels just
discarded in the process, and if so then why wouldn't a smaller file work
just as well?
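
For what it's worth, here is a sketch of where a file that size can come
from. The scan resolution and bit depth below are assumptions, not the
lab's actual settings:

    # Pixels needed for a 20x24 in print at 300 DPI, and the scan
    # resolution a 35mm frame (24x36 mm) would need to supply them.
    # Assumption: 8-bit RGB, 3 bytes/pixel; the lab's real settings
    # are not given in the thread.
    print_w_in, print_h_in = 20, 24
    dpi = 300

    pixels = (print_w_in * dpi) * (print_h_in * dpi)
    print(f"{pixels / 1e6:.1f} Mpixels, {pixels * 3 / 1e6:.0f} MB uncompressed")
    # 43.2 Mpixels, 130 MB uncompressed - so an ~80 MB file is the
    # right order of magnitude, not wasted data

    frame_long_in = 36 / 25.4                    # long side of a 35mm frame
    scan_dpi = (print_h_in * dpi) / frame_long_in
    print(f"scan at about {scan_dpi:.0f} DPI to cover the long side")
    # about 5080 DPI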

As an aside, the inks and papers are so good these days that compared to a
traditionally made print of the same image I can hardly tell the
difference.
And the advantage to printing digitally is the repeatability factor. Plus
it's really convenient for my lab to keep the file on record and for me to
just call in my order whenever I need one printed.

All considered, I still favor the older technology...there's an art and
science to it I really want to master. For the last year or so I've been
procuring darkroom equipment. As soon as I have enough to create my own

-----Original Message-----
From: owner-photoforum@xxxxxxxxxxxxxxxxxx

Hi,

I think it is customary practice for printers to "demand" image files at 300
dpi (whatever that is) at the final printed size of a reproduction. I guess
this is so reproduced images have high quality and don't look pixelated or
something. (I think I have oversimplified things.)

In any case, I was pondering whether one can get a fair idea of whether an
image file has sufficient digital "resolution" - so that when printed it
will look "good" - by looking at the image at a larger size than what it
will be reproduced at. So if I have a 5x5 cm image file at 300 dpi, but I
look at it on my CRT or LCD screen at 200% or 300% or 600% or more
magnification, and at 300% the image on my screen looks OK ... but at 600%
it starts to fall apart ... is that an indication of anything?
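
One way to put numbers on that screen test (a sketch; the ~96 PPI monitor
figure is an assumption, and actual screens vary):

    # How magnified a 300 DPI image appears on screen at a given zoom.
    # Assumption: a ~96 PPI monitor; treat the numbers as rough.
    image_ppi = 300
    screen_ppi = 96

    for zoom in (1.0, 2.0, 3.0, 6.0):
        # at 100% zoom, one image pixel maps to one screen pixel
        magnification = zoom * image_ppi / screen_ppi
        print(f"{zoom:.0%} zoom -> {magnification:.1f}x print size")
    # 100% -> 3.1x, 200% -> 6.2x, 300% -> 9.4x, 600% -> 18.8x

So an image that still looks OK at 300% is being examined at roughly nine
times its print size, a much harsher test than any print viewing distance.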

Hope I have not been too obfuscating in this question ... drinking a
Snapple only.

