Microsoft adds new Auto HDR features to Windows 11

HDR photography and HDR viewing are two different things.

The method you describe, taking two or three shots and combining the darkest and brightest pixel values, is a good way to create a higher dynamic range in a digital image. That is independent of the color gamut.
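
To make that concrete, here is a minimal sketch in Python (with NumPy). The function name, the 0..1 linear-light inputs and the clipping threshold are my own illustration, not any specific camera or tool:

    import numpy as np

    def merge_exposures(short_exp, long_exp, ratio):
        # Both inputs are linear-light floats in 0..1; `ratio` is the
        # exposure-time ratio long/short (e.g. 4.0 for two stops apart).
        # Rescale the short exposure so both estimate the same scene radiance;
        # values above 1.0 are highlight detail the long exposure clipped away.
        radiance_short = short_exp * ratio
        radiance_long = long_exp
        # Trust the long (cleaner) exposure except where it clips near white.
        w_long = np.clip((0.95 - long_exp) / 0.95, 0.0, 1.0)
        return w_long * radiance_long + (1.0 - w_long) * radiance_short

    # A highlight that clips in the long exposure is recovered from the short one:
    print(merge_exposures(np.array([0.30]), np.array([1.0]), ratio=4.0))  # -> [1.2]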

On the viewing side, dynamic range (maximum contrast) has historically been the big problem.

Over the past 20 years, monitors have become a lot better at reproducing both color gamut and contrast. Even older footage (such as VHS or DVD) now looks better on them, regardless of resolution. Digital image sensors have also improved in both color reproduction and dynamic range.

With games you don’t have the recording-sensor problem at all, and on the playback side it doesn’t matter what the source is: the display’s limits apply either way.

Not every game is suited to running in HDR, and a good system should recognize that and not enable the HDR signaling when the source is not HDR. You can of course choose to force HDR on all the time (an option found on consoles, for example); the image processing then applies artificial HDR, which is often perfectly acceptable.
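
As a side note, such artificial HDR usually boils down to some form of inverse tone mapping. A very simplified sketch of the idea; the function name, the 100-nit SDR reference and the cubic boost are my own assumptions, not how any particular console actually does it:

    import numpy as np

    def fake_hdr(sdr_linear, peak_nits=1000.0, sdr_white_nits=100.0):
        # `sdr_linear` is SDR content in linear light, 0..1.
        # Map SDR white to its usual ~100 nits...
        nits = sdr_linear * sdr_white_nits
        # ...and stretch only the top end towards the display's peak:
        # a cubic boost barely touches shadows/midtones but lifts highlights.
        boost = (sdr_linear ** 3.0) * (peak_nits - sdr_white_nits)
        return nits + boost

    print(fake_hdr(np.array([0.1, 0.5, 1.0])))  # -> [  10.9  162.5 1000. ]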

If you do the same with the color gamut, for example an sRGB game being “stretched” across the Adobe RGB color gamut, the result is usually a game with very bright red and green parts and very high contrast.
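
You can put numbers on that effect. Below is a small sketch comparing the chromaticity a saturated green is supposed to have in sRGB with what you get if the same RGB values drive Adobe RGB primaries instead (green is the clearest case, since that is where the two gamuts differ most). The matrices are the standard D65 RGB-to-XYZ ones; the helper name is mine:

    import numpy as np

    # Standard RGB -> XYZ matrices (D65 white point).
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                             [0.2973, 0.6274, 0.0753],
                             [0.0270, 0.0707, 0.9911]])

    def chromaticity(xyz):
        # CIE xy chromaticity: where the color sits, independent of brightness.
        return xyz[:2] / xyz.sum()

    green = np.array([0.0, 1.0, 0.0])           # fully saturated sRGB green
    print(chromaticity(SRGB_TO_XYZ @ green))     # intended:    ~ (0.30, 0.60)
    print(chromaticity(ADOBE_TO_XYZ @ green))    # "stretched": ~ (0.21, 0.71), far more saturated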

It is always recommended to use the correct color reference when viewing.
Oddly enough, TVs do this automatically, but monitors don’t.
HDR is (more or less) separate from the color reference and is, put simply, about maximum light output. The higher the HDR rating, the brighter the image can get (and the higher your power consumption will be).

HDR makes it possible to reach much higher peak values, so that the sun, for example, really looks dazzling (no kidding). This is why some people say that HDR only works if you use Local Dimming. That is not a bad statement in itself, but it is not necessarily true: you cannot go below 0, so the gains are at the other end of the spectrum, in ever higher peak values. There is HDR1000 and now also HDR2000.
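
For a feel of what those ratings mean in signal terms: HDR video uses the PQ curve (SMPTE ST 2084), which maps absolute brightness in nits onto a 0..1 signal. A quick sketch (the function name is mine, the constants are the standard ones):

    def pq_signal(nits):
        # SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal 0..1.
        m1, m2 = 0.1593017578125, 78.84375
        c1, c2, c3 = 0.8359375, 18.8515625, 18.6875
        y = (nits / 10000.0) ** m1
        return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

    for nits in (100, 1000, 2000, 10000):
        print(nits, round(pq_signal(nits), 2))
    # ~0.51 at 100 nits, ~0.75 at 1000, ~0.83 at 2000, 1.0 at 10000:
    # the step from HDR1000 to HDR2000 only uses a small extra slice of the signal range.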

It’s a matter of personal taste of course, but I’m not a huge fan of HDR. Not because I have anything against the technology, but because the implementation is terrible. This is a bit of a tangent from my answer, but I’d like to explain.

When I’m watching a movie or playing a game, I want to be able to see what’s going on. Being blinded is not only annoying, it hurts my eyes as well, and I doubt it’s healthy. You wouldn’t stare into the sun outside or shine a flashlight into your eyes either.

Menus are often blindingly bright, and subtitles get rendered as HDR objects and become unreadable.
A lamp, for example, is displayed far too bright, while you often can’t see what’s going on in the darker parts of the screen. The average software/game maker thinks this is all beautifully realistic, but it isn’t. The human eye reacts and adapts to such effects… but if you look at a screen where the luminance/chrominance values don’t change the way your eye expects and tries to adapt, the viewing experience is unpleasant and, in the long run, very tiring. The same goes for other fun in-game tricks, such as depth of field and head bobbing, which feel unnatural.

Recording and playback are two separate things.
Does HDR add value? It certainly can.
It’s good that Microsoft is developing this further for Windows, but a shame that it isn’t included by default.

[Comment edited by Dazenet on April 27, 2022 04:29]
