If Red Dead Redemption 2's HDR settings don't look quite right, that's because they're not, according to a Eurogamer Digital Foundry analysis. The analysis found that even when your console triggers HDR mode on a compatible TV, RDR2 still sends a Standard Dynamic Range (SDR) image, which may make things look strange if your TV is optimised for High Dynamic Range (HDR).
The game itself suggests a setting of 100 nits on LCD screens or 300 nits on an OLED using HDR. I'm using an OLED and found the default setting very dim, but the recommended 300 nits too bright, so I compromised somewhere in the middle.
These recommendations don't make sense at first glance, as HDR-capable LCD screens are normally brighter than OLED displays. The answer is that the 'HDR calibration' isn't calibrating HDR at all. In SDR, pure white pixels are the brightest part of the image, so all the slider actually does is set how bright anything white, like the light from the sun, appears on screen.
As the Xbox One X and PS4 Pro aren't delivering any HDR metadata to your TV, some OLED screens may become too dim because they're waiting for more luminous pixels that never arrive. That explains Rockstar's suggestion to increase the setting to 300 nits on an OLED screen, which simply makes everything brighter. All this setting really affects is luminance, or brightness, as an SDR image can't offer the same colour or contrast as true HDR.
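If the slider behaves as the analysis describes, it can be thought of as a simple linear scale: take an 8-bit SDR code value and map pure white to whatever nit value you chose. This is a rough sketch of that idea, not Rockstar's actual code; the function name and the linear mapping are assumptions.

```python
def apply_nits_slider(sdr_value: int, peak_nits: float) -> float:
    """Hypothetical model of RDR2's 'HDR calibration' slider:
    scale an 8-bit SDR code value (0-255) so that pure white (255)
    lands at the chosen peak brightness in nits."""
    return (sdr_value / 255.0) * peak_nits

# Pure white hits the slider value; everything else scales with it.
print(apply_nits_slider(255, 300))  # white at the OLED-recommended 300 nits
print(apply_nits_slider(255, 100))  # white at the LCD-recommended 100 nits
print(apply_nits_slider(128, 300))  # midtones rise proportionally
```

Note that raising the slider lifts every pixel equally: brightness changes, but the relationships between tones, and hence contrast, do not.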
The report explains that most games with true HDR render up to 4000 nits of luminance, more than most screens are designed to show, then tone-map and compress that data to deliver an image that works with your TV. Red Dead Redemption 2 does this in reverse, starting with an 8-bit SDR image and expanding it to fill the 10-bit HDR range.
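The difference between the two pipelines can be sketched in a few lines. This is a deliberately simplified model under assumed names: real tone mapping uses a roll-off curve rather than a hard clip, but the contrast in direction holds. Stretching 8-bit values across a 10-bit container adds no new detail, since only 256 of the 1024 possible steps are ever used.

```python
def tone_map(master_nits: float, display_peak_nits: float) -> float:
    """True HDR pipeline (simplified): compress a bright HDR master
    down to what the display can actually show. A hard clip stands in
    for the roll-off curve a real tone mapper would use."""
    return min(master_nits, display_peak_nits)

def expand_sdr_to_10bit(sdr_8bit: int) -> int:
    """RDR2's reported approach: stretch an 8-bit code value (0-255)
    across the 10-bit range (0-1020). No extra detail is created."""
    return sdr_8bit << 2

print(tone_map(4000, 700))        # a 4000-nit highlight clipped to a 700-nit panel
print(expand_sdr_to_10bit(255))   # SDR white stretched to near the 10-bit ceiling
```

Counting the distinct outputs of `expand_sdr_to_10bit` over all 256 inputs still gives 256 values, which is the report's point: the signal occupies the HDR container without gaining any of HDR's extra precision.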
Ultimately, increasing the nits setting to something that looks good on your TV will give you something akin to an SDR image. That may still be better than turning HDR off entirely, as disabling it leaves you with a dim SDR image and no equivalent controls to increase brightness.