Re: Why EDID is not trustworthy for DPI

On Wed, 2011-10-05 at 10:30 -0400, Adam Jackson wrote:
> On Tue, 2011-10-04 at 19:05 -0700, Adam Williamson wrote:
> 
> > 96dpi, however, is almost *never* correct, is it? So just taking a
> > hardcoded number that Microsoft happened to pick a decade ago is hardly
> > improving matters.
> 
> The X default used to be 72dpi.  Maybe it'll be something else in the
> future, and then I can get bitched at more for having changed it yet
> again by people still using a fundamentally unreliable API.

That does seem like the most likely fudge, yes: we'll probably wind up
with three 'standard' DPIs (say 96, 200 and 300), all hardware built to
approximate one of them, and computers then only have to guess which of
the three is right.

> > It still seems to me that taking the EDID number if it seems reasonably
> > plausible and falling back to 96dpi otherwise is likely a better option.
> 
> I reiterate: X gives you the actual sizes (as best as we can guess) on
> the RANDR outputs.  The global "size" that we default to 96dpi is broken
> to rely on in any event, because X simply has no mechanism for updating
> it besides reconnecting to the display.

We started this thread off talking about GNOME, not X. I'm still
thinking about GNOME, not X, as the thing that effectively hardcodes
96dpi. It has the option to get the 'correct' (probably) DPI from X, but
chooses not to.
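
To make 'reasonably plausible' concrete, here's roughly the check I
have in mind - a rough C sketch against Xlib/XRandR, per-output as
ajax suggests (the 50-350dpi window is an illustrative guess on my
part, not a researched cutoff):

/* dpi.c: per-output DPI from RandR physical size, falling back to
 * 96dpi when the reported size is missing or implausible.
 * Build: cc dpi.c -o dpi -lX11 -lXrandr */
#include <stdio.h>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    XRRScreenResources *res =
        XRRGetScreenResources(dpy, DefaultRootWindow(dpy));

    for (int i = 0; i < res->noutput; i++) {
        XRROutputInfo *out = XRRGetOutputInfo(dpy, res, res->outputs[i]);
        if (out->connection == RR_Connected && out->crtc) {
            XRRCrtcInfo *crtc = XRRGetCrtcInfo(dpy, res, out->crtc);
            double dpi = 96.0;                 /* the fallback */
            if (out->mm_width > 0) {
                double d = crtc->width * 25.4 / out->mm_width;
                if (d >= 50.0 && d <= 350.0)   /* 'reasonably plausible' */
                    dpi = d;
            }
            printf("%s: %ux%u, %lumm wide -> %.0f dpi\n", out->name,
                   crtc->width, crtc->height, out->mm_width, dpi);
            XRRFreeCrtcInfo(crtc);
        }
        XRRFreeOutputInfo(out);
    }
    XRRFreeScreenResources(res);
    XCloseDisplay(dpy);
    return 0;
}

Nothing stops GNOME doing that per output and still falling back to 96
when the hardware lies.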

> > Your examples lean a lot on TVs and projectors, but are those really the
> > key use cases we have to consider? What about laptops and especially
> > tablets, whose resolutions are gradually moving upwards (in the laptop
> > case despite the underlying software problems, in the tablet case
> > because the underlying software doesn't have such a problem)? Is it
> > really a great idea, for instance, if we put Fedora 17 on a 1024x600, 7"
> > tablet and it comes up with zonking huge fonts all over the place?
> 
> I'm going to not mention the traditional monitors I've seen with bad
> EDID.  I'm going to not mention the laptops I've seen that report 0x0
> physical size, or something non-zero and fictitious.  I'm going to not
> mention the laptops where you simply don't get EDID, you get some subset
> buried in the video ROM, and you get to hope that it might have physical
> size encoded in it.  

You just did, sorry. ;) Hardware sucks. We know this. Fedora generally
takes the position that it's correct to engineer things properly and
regretfully explain that the hardware sucks when this causes problems,
not engineer hacks and bodges to account for broken hardware.
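
For anyone who hasn't stared at the spec: the physical size in the
base EDID block is just two one-byte fields, in whole centimetres, so
'zero or fiction' is baked in pretty deep. A sketch of where it lives
(ignoring the EDID 1.4 aspect-ratio special case):

#include <stdint.h>

/* Bytes 0x15/0x16 of the base EDID block: maximum horizontal and
 * vertical image size, in whole centimetres.  Zero means 'unknown',
 * or on plenty of panels 'the vendor couldn't be bothered'. */
static int edid_size_cm(const uint8_t edid[128],
                        unsigned *w_cm, unsigned *h_cm)
{
    *w_cm = edid[0x15];
    *h_cm = edid[0x16];
    return *w_cm != 0 && *h_cm != 0;
}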

> I'm going to not mention that DPI is only
> approximately what you want anyway, and that you actually need to know
> dots per unit arc, which is a function of both display size and view
> distance.

Yeah, that's the fudgiest part, and why laptops can get away with going
as high as 150dpi (though I *do* quite frequently see people with
1366x768 or even 1600x900 laptops using them at 1024x768...headdesk). I
like to deal with that problem by not thinking about it too hard. ;)
Ironically, though, it's a reason not to worry about the TV case too
much, because TVs tend to have a sort of 'standard' dots per unit arc -
if you know that what you're dealing with is a TV you can make some
reasonably safe assumptions about how big you should paint stuff.
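
Back-of-envelope, since 'dots per unit arc' sounds scarier than it is;
a sketch with invented-but-typical numbers:

#include <math.h>
#include <stdio.h>

/* Dots per degree of visual arc: a function of pixel pitch *and*
 * viewing distance (small-angle approximation at screen centre). */
static double px_per_degree(double width_mm, double width_px,
                            double distance_mm)
{
    return (M_PI / 180.0) * distance_mm * width_px / width_mm;
}

int main(void)
{
    /* A 1366x768 laptop panel ~345mm wide at arm's length, and a
     * 42" 1080p TV ~930mm wide viewed from the sofa. */
    printf("laptop: ~%.0f px/degree\n", px_per_degree(345, 1366, 500));
    printf("tv:     ~%.0f px/degree\n", px_per_degree(930, 1920, 2500));
    return 0;
}

That's roughly 35 against 90: the ~100dpi laptop shows you its pixels
while the ~52dpi TV doesn't, purely because of where you sit.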

> I'm going to simply quote myself from another message in this thread:
> How people use this information is entirely not my concern.  My job is
> to get the pixels on the screen; it might be to try valiantly to tell
> you how big they are; it is not to decide if they're big enough.

Sure. I was not directing my message entirely at you personally, but at
the question of whether it's a good idea for GNOME to simply say '96dpi
is it'.

> I would caution you against thinking that there's some DPI revolution
> right around the corner.  That's the same fallacy that rages against the
> TV industry for "stalling" at 1080p.  Linear increases in DPI are
> quadratic increases in link bandwidth, and maxed-out single-link DVI
> (the source of the 1080p limit) is already a higher symbol rate than
> gigabit ethernet.

I actually think there is: TV is not 'stalled' at 1080p; there are
clear moves towards 4K (especially since, with 3D more or less tanking,
it's the TV manufacturers' next Great White Hope for getting people to
replace their HDTVs). The 96dpi number has probably survived so long
because it
happens to approximate what you get when you display 'HD' resolutions on
the size of monitor most people are happy to have on their desks - 720p
at 19-20", 1080p at 22-24" or so. Once 4K gets some market traction,
some marketing genius somewhere is going to realise there's money in
them thar hills - the first mover to sell a 4K, 22" monitor is going to
have a nice selling point that's easily understandable by consumers.
That's almost exactly 200dpi - my second 'magic density'. No-one's going
to get very far trying to sell people 45" desktop monitors - the kind of
size you'd need to get back down to ~100dpi at 4K resolutions. I don't
think the IC problem is going to hold anyone up for very long; there's
a new HDMI revision every Wednesday, it seems...
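
Showing my working on both counts, with numbers I'm fairly sure of
(165MHz single-link pixel clock, 10 TMDS bits per 8-bit symbol,
1.25Gbaud for 1000BASE-X gigabit):

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* ajax's bandwidth point: doubling DPI quadruples the pixel count,
     * and single-link DVI already runs each TMDS pair at 165MHz * 10
     * bits, a faster symbol rate than gigabit ethernet's. */
    printf("single-link DVI: %.2f Gbaud per pair\n", 165e6 * 10 / 1e9);

    /* My 200dpi claim: a 22-inch 16:9 panel is ~19.2 inches wide. */
    double width_in = 22.0 * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
    printf("4K at 22 inches: %.0f dpi\n", 3840.0 / width_in);
    return 0;
}

(Build with -lm.) Prints 1.65 Gbaud and 200 dpi, near enough.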
-- 
Adam Williamson
Fedora QA Community Monkey
IRC: adamw | Twitter: AdamW_Fedora | identi.ca: adamwfedora
http://www.happyassassin.net
