On Tue, 2009-05-19 at 19:26 -0600, Christopher A. Williams wrote:

> > It sounds like you're not really following the concept of DPI. I'm
> > not sure how you could possibly see different DPI settings as
> > "sharp" or "not sharp", that just isn't the effect of changing DPI
> > at all. All it does is cause characters to be rendered larger (high
> > DPI) or smaller (low DPI).
>
> Actually I do understand this quite well.
>
> > > Regardless of all of that, there should always be a way to tell X
> > > what DPI you want anyway. Who said the manufacturer's "correct"
> > > setting is the best for you, and that's assuming they use a
> > > standard way of specifying that?
> >
> > OK, clearly you don't understand the concept.
> >
> > There's no such thing as a DPI that's 'best for you'. DPI means dots
> > per inch. The correct DPI is a pure mathematical calculation based
> > on the size of the display and the resolution in use. There is no
> > room for subjectivity.
>
> I could go on for a while here. I understand the concept of DPI a lot
> better than you attribute to me.

Fair enough. If you do, that's fine; I have nothing to add. However, the
way your message was written didn't seem to imply a good understanding
of the issue.

There are clearly problems with the current practical implementation of
resolution independence, and the cited use cases (long viewing distances
etc.) are some of them. That's (partly) why we don't have it already and
why everyone isn't super happy yet.

To answer the practical issues raised: as Felix said, it's certainly
possible to configure the DPI at the X server level, but (again as he
said) the way to do this unfortunately varies depending on the driver in
use. You used to be able to do it fairly definitively for any driver
using /etc/X11/Xresources (there's an Xft.dpi setting in that file which
is supposed to override the X server's DPI value), but this doesn't
appear to work consistently any more. I think a bug report requesting a
consistent place to override the automatically calculated (or just
arbitrarily chosen) DPI setting for X would certainly be valid.

When GNOME's not defaulting to 96dpi, it automatically inherits X's
setting, but if you override it via GNOME's font configuration dialog,
it sets it in a private way and the changed setting applies only to GTK+
apps. I think KDE is the same way. It might be nice if this were all
co-ordinated between X, GNOME and KDE, so that you could choose a manual
setting either directly in some config file or via the GNOME / KDE
dialogs, which would just poke that same config file. Then it'd be nice
and consistent.

But in the long run the issue isn't just going to go away, and
arbitrarily defaulting to 96dpi on all displays isn't the answer. It's
horrible for very high-resolution displays, which are already fairly
easily available and will only become more so.

--
Adam Williamson
Fedora QA Community Monkey
IRC: adamw | Fedora Talk: adamwill AT fedoraproject DOT org
http://www.happyassassin.net
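
To make the "pure mathematical calculation" quoted above concrete, here
is a minimal sketch of the arithmetic; the 1920x1080 resolution and the
344 mm x 194 mm panel size are made-up example figures, not anyone's
actual hardware:

    # DPI is just pixels divided by physical size in inches.
    MM_PER_INCH = 25.4

    def dpi(pixels, size_mm):
        """Dots per inch along one axis of the display."""
        return pixels * MM_PER_INCH / size_mm

    # Example: a 1920x1080 panel roughly 344 mm wide and 194 mm tall
    print(round(dpi(1920, 344)))  # ~142 horizontal DPI
    print(round(dpi(1080, 194)))  # ~141 vertical DPI

Which is a long way from 96, and that's exactly the problem with a
hard-coded default on high-resolution displays.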
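
For anyone who wants to experiment with the X-level overrides mentioned
above, this is roughly what they look like. The value 120 and the
"Monitor0" identifier are only placeholders; the Xresources route is the
one that no longer seems to be applied consistently, and whether the
server respects a DisplaySize entry in xorg.conf also depends on the
driver in use:

    ! In /etc/X11/Xresources (or ~/.Xresources, loaded with xrdb):
    Xft.dpi: 120

    # Or, in the Monitor section of xorg.conf, give the server the real
    # physical size (width and height in mm) so it calculates DPI itself:
    Section "Monitor"
        Identifier "Monitor0"
        DisplaySize 344 194
    EndSection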
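
And for the record, the "private way" GNOME stores the font dialog's
override is, as far as I recall, just a GConf key, so it can also be set
from a terminal; again, 120 is only a placeholder, and it only affects
GTK+ apps as described above:

    # GNOME 2's per-user font DPI override (what the Fonts dialog writes):
    gconftool-2 --type float --set /desktop/gnome/font_rendering/dpi 120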