On Wed, 2016-10-26 at 08:30 +0200, nicolas.mailhot@xxxxxxxxxxx wrote:
> But, GTK core maintainers have always insisted those didn't exist
> (just like they insisted on hardcoding 96 dpi, on the eve of Apple
> showing the world it was arbitrary and obsolete).

...by releasing displays carefully tuned to look best at a precise integer multiple of 96dpi? Not really great support for your theory.

Sure, it's arbitrary. Arbitrary doesn't necessarily mean 'bad'. The 96dpi consensus worked perfectly well: hardware manufacturers knew what sizes and resolutions to make their monitors, and font designers (and UI designers) knew that when they had to make a tricky decision about how to tweak something, they should favour whatever choice made it look good at 96dpi. Which is really important when you're designing something as finicky as a font at a resolution as low as 96dpi; the question of which point size you choose as the cutoff for rendering a simple line as 1 pixel wide or 2 pixels wide, for example, is extremely important.

I used to go for the 'everything should be perfectly resolution independent!' argument, because it seems intellectually satisfying from some sort of theoretical engineering point of view, but I find the argument that it's not really the most *practical* way to do things pretty convincing.

Even now, the consensus mostly survives; most hardware is designed to work best at 96dpi or an integer multiple thereof. Awkward things like 13" 1080p displays are still in a distinct minority.

-- 
Adam Williamson
Fedora QA Community Monkey
IRC: adamw | Twitter: AdamW_Fedora | XMPP: adamw AT happyassassin . net
http://www.happyassassin.net
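
[Editor's note: to spell out the arithmetic behind that last paragraph, here is a minimal Python sketch. The panel sizes are illustrative picks of my own, not figures from the thread; the point is only how far each density lands from 96dpi or an integer multiple of it.]

import math

def dpi(width_px, height_px, diagonal_in):
    """Pixel density in dots per inch for a panel of the given size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative panel sizes (assumed examples, not taken from the mail above).
panels = [
    ('24" 1080p', 1920, 1080, 24.0),  # classic desktop monitor
    ('24" 4K',    3840, 2160, 24.0),  # "HiDPI" desktop monitor
    ('13" 1080p', 1920, 1080, 13.3),  # the "awkward" laptop case
]

for name, w, h, diag in panels:
    d = dpi(w, h, diag)
    print(f'{name}: {d:5.1f} dpi  (~{d / 96:.2f} x 96dpi)')

The first two come out around 92dpi and 184dpi, within a few percent of 1x and 2x 96dpi, so UI and font-hinting choices tuned for those densities still look right. The 13" 1080p case lands around 166dpi, roughly 1.7x: too fine for 1x rendering and too coarse for 2x, which is, as I read it, exactly the awkwardness the mail is pointing at.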