• MudMan@fedia.io
    8 hours ago

    Hm. SDR content in HDR containers has been working well for me on both DP 1.4 and HDMI 2.1-enabled GPUs, with no washed-out content, which I used to have. It did not work over HDMI on an Intel A770 I tested, where their weird DP-to-HDMI adaptor didn’t support HDR at all (I hear it’s working on the Battlemage cards, thankfully), but it does work over DP, and it also works well out of the box on my monitors using the integrated HDMI out on a 7-series Ryzen APU, so I’m guessing it’s doing fine on newer AMD cards, too.

    I do believe you that it’s borked for you, and if you’re on a last-gen card with HDMI 2.0 I have seen it do the old “washed-out SDR” garbage even under Win11, so I absolutely concede that it’s still crappier than the far more reliable solutions for TV-focused hardware. Still, it works way more reliably on Windows than it used to, and it’s way more finicky and harder to set up on Linux than on Windows these days (or outright unsupported, depending on your flavor of DE).

    • stevedice@sh.itjust.works
      8 hours ago

      I actually upgraded to Windows 11 specifically because I was told they fixed HDR. I do have an RX7600, so it’s technically “last gen,” but I’m running DP (I have no idea which version, but it has to be at least 1.4 because it runs 1080p at 180Hz). Washed-out SDR content isn’t that bad; I actually didn’t even notice until I dragged a window playing SDR content to my second monitor, which doesn’t have HDR, and the blacks looked, well, blacker. I don’t doubt that it’s worse on Linux, and I wasn’t trying to defend it. Just wanted to point out that it seems like no OS that isn’t designed to run only on TVs gives a crap about the HDR experience.

      • MudMan@fedia.io
        8 hours ago

        Man, I hated it. The only reason I give Windows (and GPU manufacturers, I suppose) credit for improving it this gen is that I was trying to output PC games to an HDR TV fairly early on, and I ended up having to choose between the crappy SDR output or playing worse-looking games on console with HDR enabled, and it was a dealbreaker. It is a godsend to be able to leave HDR on all the time on my monitors and just… not have to think about it.

        SDR for me now either looks fine as-is or is picked up by AutoHDR and remapped. It now works as I would expect, and at high framerates and resolutions, too, as it seemed to automatically pick out the right type of DSC to fit my setup.
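        For anyone curious what that remapping involves: an SDR-in-an-HDR-container setup has to take gamma-encoded sRGB pixels, assign a luminance to SDR reference white, and re-encode the result into the HDR10 signal (PQ, i.e. SMPTE ST 2084). The sketch below is my own illustration, not Windows’s actual AutoHDR pipeline; the 203-nit reference white is the common ITU-R BT.2408 choice, and the function names are made up.

        ```python
        # Sketch: re-encoding one sRGB (SDR) channel value into a PQ (HDR10)
        # signal. Illustrative only, not Windows's AutoHDR code; the 203-nit
        # SDR white level follows the common ITU-R BT.2408 recommendation.

        SDR_WHITE_NITS = 203.0  # luminance assigned to SDR reference white

        def srgb_to_linear(v: float) -> float:
            """sRGB EOTF: gamma-encoded [0, 1] -> linear light [0, 1]."""
            return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

        def pq_encode(nits: float) -> float:
            """ST 2084 (PQ) inverse EOTF: absolute luminance -> [0, 1] signal."""
            m1, m2 = 2610 / 16384, 2523 / 4096 * 128
            c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
            y = (nits / 10000.0) ** m1
            return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

        def sdr_to_hdr10(v: float) -> float:
            """Map one gamma-encoded SDR channel into an HDR10 container."""
            return pq_encode(srgb_to_linear(v) * SDR_WHITE_NITS)

        # SDR white (1.0) lands at roughly 58% of the PQ signal range, far
        # below the container's 10,000-nit ceiling, so whites read as
        # "paper white" rather than blinding, and blacks stay black.
        signal = sdr_to_hdr10(1.0)
        ```

        The washed-out look happens when a compositor gets this mapping wrong, for example by treating the gamma-encoded values as if they were already linear, or by pinning SDR white at the wrong luminance: mid-tones land too high and the whole image goes grey.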

        I’ll be honest, when I got a high-refresh-rate monitor I was completely sure I wasn’t going to be able to get it all working at once, based on previous experience, but it just did. It sucks to learn that experience isn’t universal, especially since the RX7600 should have all the hardware it needs to do this. That integrated AMD GPU I mentioned did it all just fine out of the box for me as well and is of that same generation; the 7600 should work the same way.

        The temptation is to try to troubleshoot it with you and suggest it’s a setup issue, but my entire point here is that it should work out of the box every time, or at least tell you what to push to change it if it’s supported. I don’t care what OS we’re talking about.