On Sun, 2024-11-17 at 12:09 +0100, lejeczek via users wrote:
> Now I have two "identical" Dell monitors - P2418D - and colors are
> different, one has whites cooler whereas the second's whites (thus
> other colors) are warmer.

That has been the bane of anyone involved in video production since colour TV came out. Every monitor looked different, thanks to manufacturing tolerances and how each was individually set up. It led to many studio control rooms using only monochrome monitors for all the camera preview screens, to stop the mismatch being a major distraction. Only the preview and program monitors were colour, since those almost unavoidably needed to be, with the technical director having a very expensive precision colour monitor.

And the same with video-assist on film productions, which used monochrome monitors to stop the art director continually complaining that something in shot wasn't the colour they wanted (it was impossible to convince them that matching was an impossible goal, since everybody watching it on their home TV would see it differently too, no matter what you tried).

With modern sets, default factory calibration has become a lot better. You can still pick differences, but it's no longer so glaringly obvious, and it's quite common to see a wall of monitors that are very close to each other in rendition (so long as they're similar models and ages).

To custom-align two (or more) monitors, you'll need calibration equipment (if you need that precision), or you'll have to eyeball things, while feeding them from a common signal source, e.g. HDMI out from one source, into a splitter, fed to both monitors. If you feed two video monitors from two different sources, whatever is generating their signals may create them differently from each other.

If eyeballing, you need to be in an environment with neutral colours around the monitors. Coloured walls nearby will fool your eyes.
You'll fall for the same trick butchers employ - green around the meat trays makes the meat look redder - because our colour perception is all relative (one thing against another).

The standard approach is to view a monochrome test pattern. Adjust each monitor to the same brightness, contrast, and gamma (that's the linearity between black and white - how the greys track between signal generation and picture display). Brightness, or black level, should be set so that black image areas are barely illuminated. Contrast should set the maximum white level to be suitable for the viewing conditions.

Then adjust the tint the monitor gives to a monochrome picture, so that there's no apparent tint (it's not blueish, warmish-orange, etc.). This has to be done in conjunction with the room lighting, because the monitor will appear different, perhaps very different, depending on the ambient light. If you take the monitor elsewhere, or change your room lighting (e.g. day versus night), your monitor will look different. Your vision is always comparing one thing against another.

On some monitors, you only have crude tint controls: overall cold-blue, warm, and so-called neutral somewhere in the middle. Others give you individual red, green, and blue gain controls.

Then you try a colour image. In the days of broadcast television there were colour controls to adjust (the amount of colour, or saturation), and tint controls for the colour decoder in countries with poor composite colour video systems (i.e. NTSC). But since we no longer use encoded colour (PAL/NTSC/SECAM), there are next to no actual colour adjustments available. The monochrome setup (above) has configured how the display illuminates, and that includes colour reproduction. If you try to tweak displays while looking at colour pictures, you'll misset the greyscale tracking of monochrome images.
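To see how a gamma mismatch shows up, here's a minimal sketch (the 2.2 and 2.4 exponents are just illustrative values I've picked, not anything from the discussion above): two monitors fed the same monochrome staircase signal agree at black and white but diverge in all the greys between.

```python
# Sketch: a gamma mismatch between two monitors leaves black (0.0)
# and white (1.0) alone but shifts every grey in between.
# The exponents 2.2 and 2.4 are illustrative, not a recommendation.

def displayed_luminance(signal, gamma):
    """Relative light output for a normalised signal level in 0.0-1.0."""
    return signal ** gamma

# A simple monochrome staircase: black, three greys, white.
steps = [0.0, 0.25, 0.5, 0.75, 1.0]

for level in steps:
    a = displayed_luminance(level, 2.2)  # "monitor A"
    b = displayed_luminance(level, 2.4)  # "monitor B"
    print(f"signal {level:4.2f}: A -> {a:.3f}, B -> {b:.3f}")
```

Note that the endpoints match exactly, which is why checking only pure black and pure white will never reveal a gamma mismatch - you have to look at how the mid-greys track.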
And that's how most colour signals are done (whether analogue or digital): the base signal is monochrome, and the colour signals are the differences from monochrome. Very few things stay as simple RGB (red, green, and blue primary colours) through image acquisition, generation, manipulation, and display. Most become a colour-difference signal somewhere in the middle of that process.

HDR (high dynamic range) throws a new spanner in the works. It may use a different gamma than standard (it's supposed to, according to some industry opinions). It will probably illuminate the screen even brighter with stronger input signal data than standard (non-HDR). In some countries the black signal level has changed, though it should still produce the same level of non-illumination on the screen. Really, a monitor needs separate calibration and configuration controls for non-HDR and HDR, since they're very different signals. If you use the screen to view both HDR and non-HDR, it can be a juggling act to set it up; you may find non-HDR seems murky and HDR harsh and glary.

You may find it easier to tweak one monitor to match another if it's one above the other, rather than one beside the other. For some people, that's just easier. Either way, with LCD-based screens the viewing angle changes how they look, so you'd have to aim each screen at you, rather than have them flat relative to each other. While they're better, these days, at having wider viewing angles, calibrating them is more finicky than just watching them.
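The monochrome-base-plus-differences idea above can be sketched numerically. This is a minimal illustration assuming the classic BT.601 luma weights; real systems go on to scale the two differences into Cb/Cr (or U/V), which is skipped here to keep the idea visible:

```python
# Sketch of luma plus colour differences (assumes BT.601 luma weights).
# The base signal Y is the monochrome picture; the two difference
# signals carry the colour. The Cb/Cr scaling step is omitted.

def rgb_to_luma_diffs(r, g, b):
    """Split normalised R'G'B' (each 0.0-1.0) into (Y, B-Y, R-Y)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # the monochrome base
    return y, b - y, r - y

def luma_diffs_to_rgb(y, by, ry):
    """Recover R'G'B' from luma and the two colour differences."""
    r = y + ry
    b = y + by
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

# A neutral grey produces zero in both colour-difference channels,
# which is why the greyscale setup carries colour reproduction with it.
print(rgb_to_luma_diffs(0.5, 0.5, 0.5))  # close to (0.5, 0.0, 0.0)
```

Feed it any grey (equal R, G, B) and the colour channels carry nothing - all the information rides on the monochrome base, exactly the point made above about not tweaking on colour pictures.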
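On the HDR gamma point: one common HDR transfer function, PQ (SMPTE ST 2084), really is a very different curve from a plain power-law gamma. A rough sketch of the PQ encode (inverse EOTF) using the published ST 2084 constants - treat it as an illustration of the shape difference, not a calibration reference:

```python
# Sketch: the PQ (SMPTE ST 2084) encoding curve used by much HDR video,
# compared with a plain power-law encode. Constants are the published
# ST 2084 values; y is luminance normalised so 1.0 = 10000 nits.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(y):
    """Inverse EOTF: normalised luminance -> PQ signal level (0.0-1.0)."""
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

def gamma_encode(y, gamma=2.4):
    """Plain power-law encode for comparison (illustrative exponent)."""
    return y ** (1 / gamma)

for y in (0.0001, 0.01, 0.1, 1.0):
    print(f"luminance {y:6.4f}: PQ {pq_encode(y):.3f}, gamma {gamma_encode(y):.3f}")
```

PQ spends far more of its signal range on low luminances - around 1% of a 10000-nit peak already sits above the halfway point of the PQ signal - which is one reason non-HDR material pushed through an HDR path with mismatched levels can come out looking murky or glary.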