And no again -- two monitors properly calibrated with a hardware color sensor are probably more alike than the first and last print from a stone lithograph print run? Hahahah ... yeah, right, David. In an ideal world .... The fact remains that today, *now*, all but a vanishingly small minority of monitors are not "properly" calibrated.

I've noticed you've been sticking to a tiny, tiny subset of all digital images (those which are considered art AND are printed to hardcopy). But where is the original for them? The print? Well, we know those fade faster than the oil paints or silver prints of old. The digital file? Well, the sequence of bytes is not the whole image. The RGB values alone, we all know, don't tell the whole picture. Add on the profiles (and the inextricably linked hardware) and the calibration hardware [if the definition of "properly calibrating" a monitor changes, what then of the file?]. There's a rough sketch of the profile problem in the P.S. below.

Will today's standards stand still, or at least stay backward compatible? When all images are saved in 16-bit-per-channel (or even HDR) formats and the original has been translated ... there may not be 1:1 bit correspondence (see the P.P.S.). I'm sure the differences would be tiny, but the "original" set of bytes may not be usable in 20 years' time without translation.

I reckon the best idea is to destroy the digital file and let the single copy of the print become the de facto original. Saves worrying ...

Bob
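P.S. Here's a rough sketch of the profile point, in Python: the same RGB bytes name a different color depending on which profile you read them through. The matrices are the published sRGB and Adobe RGB (1998) RGB-to-XYZ (D65) conversions; I've simplified both tone curves to a pure 2.2 power for brevity, so treat this as an illustration, not a color-management tool:

    # Same bytes, two profiles, two colors.
    PROFILES = {
        "sRGB":      (2.2, [[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]]),
        "Adobe RGB": (2.2, [[0.5767, 0.1856, 0.1882],
                            [0.2973, 0.6274, 0.0753],
                            [0.0270, 0.0707, 0.9911]]),
    }

    def to_xyz(rgb8, gamma, matrix):
        lin = [(c / 255) ** gamma for c in rgb8]       # decode to linear light
        return [sum(row[j] * lin[j] for j in range(3)) for row in matrix]

    pixel = (0, 255, 0)                                # one green pixel
    for name, (gamma, matrix) in PROFILES.items():
        x, y, z = to_xyz(pixel, gamma, matrix)
        s = x + y + z
        print(f"{name:>9}: xy chromaticity = ({x / s:.3f}, {y / s:.3f})")

The identical bytes come out at chromaticity (0.300, 0.600) under sRGB and (0.210, 0.710) under Adobe RGB -- two visibly different greens. Lose the profile and you've lost part of the image.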
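P.P.S. And the bit-correspondence worry, made concrete. Take every 16-bit sRGB value, translate it to linear light stored as an IEEE half float (my stand-in for a future "HDR" re-save -- an assumption, not any real archive format), and translate it back:

    import struct

    def srgb_to_linear(v):        # sRGB decode, v in [0, 1]
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):        # the inverse encode
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    def through_half(x):          # round-trip through half-precision storage
        return struct.unpack("<e", struct.pack("<e", x))[0]

    changed = sum(
        round(linear_to_srgb(through_half(srgb_to_linear(v / 65535))) * 65535) != v
        for v in range(65536)
    )
    print(f"{changed} of 65536 values come back altered")

With only ~11 bits of half-float mantissa, plenty of values come back off by a few counts -- invisible on any monitor, calibrated or not, but enough that a byte-for-byte comparison (or a checksum) against the "original" fails. Translated, not identical.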