As far as I know, no. I guess it’s not exactly a good idea globally anyway, as some games need their own tweaks. For example, there are one or two that don’t like ITM and will have display corruption (at least the last time I tested, possibly fixed now), and I have to use some extra flags to get TF2’s mouse controls working in Gamescope.
Are you using tone mapping through the Steam UI (I think the Deck has its own controls for HDR inverse tone mapping) or through per-game command line options? If you’re using the UI, it might be worth trying the command line toggles instead, in case the UI is applying some wrong settings. If it helps, here is the set of command line options I use on my system (modify brightness, refresh rate, and resolution to fit your display):
DXVK_HDR=1 ENABLE_HDR_WSI=1 gamescope -f -r 165 -W 3440 -H 1440 --adaptive-sync --hdr-enabled --hdr-itm-enable --hdr-itm-sdr-nits 350 --hdr-sdr-content-nits 800 --hdr-itm-target-nits 1000 gamemoderun -- %command%
In addition, it might be worth looking through the display’s settings to see if it’s in any sort of colour-boosting HDR mode - my Alienware had to be set to “HDR Peak 1000” for colours to look as they should, as by default it messes around with things a bit. If you can, also try some other HDR-capable devices (like a game console or Blu-ray player) to see if their output looks a bit red too - if so, it’s to do with the display, and if not, it’s a configuration issue.
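If pasting that long one-liner into every game’s launch options gets tedious, the same flags can be wrapped in a small script - a sketch only, with the script name and path being my own suggestion, and it assumes gamescope and gamemoderun are already installed:

```shell
#!/bin/sh
# Hypothetical wrapper: save as e.g. ~/bin/hdr-run, mark executable,
# then set a game's Steam launch options to: ~/bin/hdr-run %command%
# Adjust -r/-W/-H and the nit values to match your display.
exec env DXVK_HDR=1 ENABLE_HDR_WSI=1 \
  gamescope -f -r 165 -W 3440 -H 1440 \
    --adaptive-sync --hdr-enabled \
    --hdr-itm-enable --hdr-itm-sdr-nits 350 \
    --hdr-sdr-content-nits 800 --hdr-itm-target-nits 1000 \
  gamemoderun -- "$@"
```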
I haven’t experienced issues with oranges on my setup (AW3423DWF, 7900 XTX). Perhaps it is to do with your hardware?
In a nutshell, it increases the range of brightness values (luminance, to be specific) that can be sent to a display. This allows content both to be brighter and to display colours more accurately, as there are far more brightness levels that can be depicted. This means content can look more lifelike, or have more “pop” by having certain elements be brighter than others. There’s more to it too, and it’s up to the game/movie/device to decide what to do with all this extra information it can send to the display. This is especially noticeable on an OLED or QD-OLED display, since they can individually dim or brighten every pixel. Nits in this context refers to the brightness of the display - 1000 nits is far brighter than most conventional displays (which are usually in the 300-500 range).
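To put numbers on that extra range: HDR10 signals encode luminance with the SMPTE ST 2084 “PQ” transfer function, which maps a 10-bit code value to an absolute brightness anywhere from 0 up to 10,000 nits. A quick sketch of the decoding side (the EOTF), just to illustrate how much more headroom the signal has than an SDR display’s few hundred nits:

```python
# SMPTE ST 2084 (PQ) EOTF: decodes a normalized signal value in [0, 1]
# to absolute luminance in nits (cd/m^2). Constants are from the spec.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(code: float) -> float:
    """Decode a normalized PQ code value to nits."""
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000 - the very top of the signal range
print(round(pq_eotf(0.5)))  # ~92 - half the signal is still SDR-ish brightness
```

Half the code range sitting below ~100 nits is also why most of those extra brightness levels go to the dark-to-midtone region, where banding would otherwise be visible.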
What sort of system are you on, and what have you been trying? The best setup is with an AMD GPU and a more up-to-date distro (Fedora, Arch, and so on). I can give some help if you need.
Oh boy, I should have caught that. Ironic, considering saying things like “ATM machine” is a pet peeve of mine.
In a small room where it’s the only light source, it’s still a crazy amount of light. My eyes genuinely had to get used to the brightness for a couple minutes after I set it up for the first time, and the walls sometimes looked like the ceiling light was on.
If you ever get the opportunity, try out HDR ITM tone mapping (essentially an HDR upconversion you can do with Gamescope on Linux) while playing Persona 3 Reload on a QD-OLED monitor (for that extra brightness) in a dark room. Even though it’s not even a native HDR game, with ITM it looks so good, especially because it’s a game with a lot of dark graphics mixed in with super bright ones. The text pops, and combat is next-level.
We don’t talk about ~~Bruno~~ the Microsoft POSIX subsystem
VESA honestly should ban manufacturers from marketing anything below DisplayHDR 500 as HDR, as the specification is so watered down at HDR 400 and below (sRGB? Really? Why not WCG like all the tiers from 500 up?). HDR on a 300 nit display is terrible, and manufacturers should be embarrassed to sell those as HDR.