ADavidhazy <andpph@ritvax.isc.rit.edu> writes:

> This inquiry was sent to List HQ, which decided the subject merited
> discussion online, as there are probably many members with a similar
> question. Here goes:
>
> It is my understanding that pixels are for viewing and dots are for
> printing. Is there any rule of thumb or ratio that equates pixels to
> dots? For example, if a printer wants a file to print an 8 x 11 image
> at 300 DPI, then how many PPI should the file contain?

Mostly they're the same thing, and "PPI" is the better term, because the
one place they're *not* the same thing is inkjet printers. Many inkjet
printers will claim, and actually produce, 1200, 2400, even 2880 "DPI".
However, each of those *dots* isn't a *pixel*; a single dot can't
reproduce all the possible colors. The printer driver uses sophisticated
dithering algorithms to figure out how best to use its available *dots*
to render the *pixels* actually in the picture.

The other place people get confused is that pixels are only worth
anything special if they're *camera original* pixels (or *scan original*
pixels, if the image originated on film). Blowing the file up just to
hit a pixels-per-inch number is a waste of time and space.

Then there are halftones, which also talk about "dots" but mean
something totally different. The rule of thumb there is that you need
either 1.5 or 2 times the halftone screen frequency in your digital
file, depending on who you ask.

--
David Dyer-Bennet, <dd-b@dd-b.net>, <www.dd-b.net/dd-b/>
RKBA: <noguns-nomoney.com> <www.dd-b.net/carry/>
Photos: <dd-b.lighthunters.net> Snapshots: <www.dd-b.net/dd-b/SnapshotAlbum/>
Dragaera mailing lists: <dragaera.info/>
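The arithmetic behind both rules of thumb (pixels = inches x PPI, and
PPI = 1.5x or 2x the halftone line frequency) is plain multiplication.
A small sketch, with function names of my own invention rather than
from any printing library:

```python
# Hypothetical helpers; names are illustrative, not from any real API.

def pixels_needed(width_in, height_in, ppi):
    """Pixel dimensions required to print at the given size and PPI."""
    return round(width_in * ppi), round(height_in * ppi)

def pixels_for_halftone(width_in, height_in, lpi, quality_factor=2.0):
    """Pixel dimensions for a halftone screen of `lpi` lines per inch.

    quality_factor is the disputed rule of thumb: 1.5 or 2.0,
    depending on who you ask.
    """
    return pixels_needed(width_in, height_in, lpi * quality_factor)

# The 8 x 11 inch print at 300 PPI from the question:
print(pixels_needed(8, 11, 300))        # (2400, 3300)

# A 150 lpi halftone screen, using the 2x rule:
print(pixels_for_halftone(8, 11, 150))  # (2400, 3300)
```

So for the original question, an 8 x 11 print at 300 PPI wants a file of
roughly 2400 x 3300 camera-original pixels.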