No I don't, because we are talking about the printer's raster DOT.
On Sep 3, 2014 1:26 AM, "karl shah-jenner" <shahjen@xxxxxxxxxxxx> wrote:
Karl, the image is 360 dpi, which divides evenly into the printer's max dpi. You don't send a 2880 dpi image to the printer, or a 5760 dpi one.
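For what it's worth, the divisibility claim here is trivial arithmetic to check - a quick sketch (Python, using the Epson-style resolutions mentioned in this thread):

```python
# Check which common image resolutions divide evenly into the printer's
# native droplet grid (2880 and 5760 dpi, the figures from this thread).
printer_dpis = [2880, 5760]
for image_ppi in (300, 340, 360):
    for dpi in printer_dpis:
        evenly = dpi % image_ppi == 0
        print(f"{image_ppi} ppi into {dpi} dpi: "
              f"{'divides evenly' if evenly else 'leaves a remainder'}")
```

Only 360 divides both cleanly (2880/360 = 8, 5760/360 = 16), which is the arithmetic behind the "send 360, not 2880" advice.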
I think you mean pixels per inch, Randy: PPI. This is a common source of confusion and one that people have been trying to stamp out for quite some time. A lot of the blame can be laid on Epson for using the wrong term (dpi) when they ask the user how many pixels they want per inch, and for making such a feature of how many dots per inch the printer uses - an often irrelevant claim, and why I said before that dpi was irrelevant. Epson or others may have then wished to rewrite history by talking of Dots as opposed to dots, but seriously, try pronouncing DPI with an uppercase D and then a lowercase d. The term PPI, a vastly more accurate one, has been around long enough. The image elements, after all, are pixels; the dots are so named as an abbreviation of ink dots.
http://www.imagescience.com.au/kb/questions/31/The+difference+between+PPI+and+DPI "Dots Per Inch is an old printing term and has almost no place in modern digital imaging"
http://www.andrewdaceyphotography.com/articles/dpi/ "DPI is still used in some documents and software when PPI is really what they mean"
Pixels are your image, and the number of pixels per inch defines the size of your image .. it's that simple. I don't think even the most compulsive psych-case knows how many ink dots make up their image; that stuff is best left to machines.
Now ... 360 PPI. I understand your claim that fractional resolutions - ones the printer's dpi divides evenly by - are optimal for printers; I've heard it before. I've chosen so far to accept it at face value as I have never run any tests. It doesn't conflict with my knowledge, but I don't repeat it as I've no evidence for or against it. At one point at the college I had some little Canon six-colour S800 2400 dpi printers to take the load off the Epsons, and I know they were highly sought after as they could punch through an A4 print in about 20 seconds - vastly quicker than the Epsons. Of course, that was if the clod operating them had optimized their images. There were always a few who did try to send things to print at 2400 ppi, or in 16 bit with multiple layers, and then whined about the printer taking 20 minutes to make a print. Not surprising when you send a 4 GB image to the poor thing. At 300 ppi they flew.
Yet I heard some discussion that the Canon documentation mentioned 49 grades of density - which some suggested meant the printer was using a 7x7 halftone screen (!) - and they concluded 340 ppi was the optimum for such a printer (2400/7 is roughly 343). Who do you suppose is right - the 340 ppi theorist .. or the 2400/8 = 300 ppi theorist? No one ran speed tests.
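The two theories differ only in what you divide the native resolution by. A quick sketch of the arithmetic (Python; the cell and group sizes are the figures from the discussion above, not anything confirmed by Canon):

```python
# Two "optimum ppi" theories for a 2400 dpi printer, as discussed above.
native_dpi = 2400
halftone_cell = 7   # a 7x7 cell gives 49 density grades -> the "340 ppi" theory
droplet_group = 8   # dividing by 8 -> the "300 ppi" theory

print(native_dpi / halftone_cell)   # roughly 342.9, loosely quoted as 340
print(native_dpi / droplet_group)   # exactly 300.0
```

Neither number was ever validated with an actual speed or quality test, which is rather the point.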
I DO know however that from 150-300 pixels per inch, most people cannot see discrete pixels - they all blend to make a continuous image - and that is why that figure was selected years back as an optimum of sorts. Even so, on modern printers I've seen 40 pixel per inch images look acceptable. Not good .. just acceptable, in the sense that no discrete pixels were distinctly visible. The dreaded 'blocks'. From this I have assumed printer RIPs have vastly improved over the years, and that the upsampling done in-printer is now miles and miles ahead of what it was back when we began saying 300 ppi was the number to aim for or you'd see blocks. It's not surprising really; every other bit of tech has had under-the-hood improvements to seamlessly improve the user's experience without pointing out each individual trick used to accomplish it.
I also know that a lot of people do not use Photoshop and merely send their images straight from the web browser or email to their printer, with no idea of either ppi or dpi. They send their 800x600, the printer asks how big, and they pick 6x4 or full page. No pixel size is knowingly defined (it IS, but it's hidden from the user) .. and the printer RIP gives them a pretty good job, at 80 pixels per inch for a full page. Sure it looks fuzzy .. but no pixels are evident.
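That "hidden" number is easy to recover: effective ppi is just pixels over inches. A small sketch (Python; the printable long-side lengths are my own illustrative assumptions, not figures from any particular driver):

```python
# Effective resolution when a fixed-pixel image is scaled to a print size.
def effective_ppi(pixels_long_side, inches_long_side):
    """Pixels per inch once the image is stretched to the chosen paper size."""
    return pixels_long_side / inches_long_side

print(effective_ppi(800, 6))    # 800x600 on a 6x4 -> about 133 ppi
print(effective_ppi(800, 10))   # ~10 in printable long side -> the 80 ppi above
```

The printer driver does exactly this sum silently; the user only ever picks a paper size.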
but lines per inch and dots per inch - print head stuff - that's a whole different thing when you're talking about whether the human eye can detect picture elements.. or missing elements. When a printer drops a line you certainly see it, and it becomes quite clear our eyes are more than capable of detecting and resolving at 1200 pixels per inch and higher at normal viewing distance!
Randy wrote: Quote: No, a pixel is the smallest unit that can be programmed with a single number value.
I replied: Not many coloured pixels can be programmed with a single number value - can they? (I'm still asking you.)
Randy replied: The mix value of a color or its LAB coordinate define the color. If you want to make up your own terms fine
also "A inkjet works the exact same way as a Halftone screen"
Unless they use dithering in place of a halftone. Did you see the postcard article I linked, particularly the bit about dithering? I know what you mean about them being dots in patterns - I get the print head - but look closely at the spraypaint examples, the inkjet, and also the photographic grain examples: the regularity of a halftone screen and its image defects is a lot less like most inkjet prints, which look more like something in between a line printer and the irregularity of spraypaint. Or put it this way: when people are trying to reproduce the halftone-screen look, they have to go to software to get it -
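To make the halftone-versus-dithering distinction concrete, here's a toy one-dimensional sketch (my own illustration, not any printer's actual RIP): a fixed threshold wipes out a flat mid-gray entirely, while error-diffusion dithering scatters dots irregularly to approximate it - the irregular look described above.

```python
# Toy 1-D comparison: fixed threshold vs error-diffusion dithering.
def threshold(row, t=128):
    # Every pixel below the threshold goes blank: a flat gray vanishes.
    return [255 if v >= t else 0 for v in row]

def error_diffuse(row, t=128):
    # Carry each pixel's quantisation error forward to the next pixel,
    # so dots appear at irregular intervals approximating the gray level.
    out, err = [], 0.0
    for v in row:
        v = v + err
        dot = 255 if v >= t else 0
        out.append(dot)
        err = v - dot
    return out

gray = [100] * 8                 # a flat mid-gray strip
print(threshold(gray))           # all zeros - the gray disappears
print(error_diffuse(gray))       # dots scattered to roughly match the gray
```

Real inkjet RIPs use far more sophisticated 2-D diffusion and stochastic screens, but the principle - irregular dot placement rather than a fixed screen - is the same.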
The mix value of a color or its LAB coordinate define the color. If you want to make up your own terms, fine. A pixel can be singular and have one element or an infinite number of elements. If you print 8-micron single-element prints at the dot pitch of the highest-dpi printer, you would have a lot of paper base.
Yeah. One ink dot can be a pixel - remember, I'm quoting you: "a pixel is the smallest unit that can be programmed with a single number value" - or it can be one whole black page soaked in ink. You can't make a statement defining something, Randy, then turn around and say someone else is making up their own terms when they actually conceded your point and accepted that, by definition, the 'smallest unit' can technically be termed a pixel even if it is a dot. That's just simply argumentative.
k