I’m so done with Win11, and currently 12 of my 15 machines are Linux anyway, but AFAIK HDR (on an Nvidia GPU) is still impossible? Are you guys all on AMD, or just not using HDR for gaming/media? So instead of relying on outdated info, I’m just asking the pros :)


That’s not quite true: you can do HDR at 4K @ 120 Hz over HDMI 2.0, but you’ll be limited to 8 bits per channel, which will exhibit pronounced chroma banding, especially noticeable in sky gradients. If you lower either the resolution or the refresh rate, you can get 10-bit back too.
HDMI 2.0 can also carry 4K 120 Hz, but it will be limited to 4:2:0 chroma subsampling. That’s fine at a typical TV viewing distance with 2x hidpi scaling, but it sucks for desktop use, especially with no hidpi scaling.
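If anyone wants the back-of-the-envelope math behind those limits, here’s a quick sketch. The figures are my assumptions, not from anyone’s actual setup: standard CTA-861 pixel clocks, ~14.4 Gbps of usable HDMI 2.0 data rate after 8b/10b encoding, and a simplified bits-per-pixel model (real TMDS packs 4:2:2 differently):

```python
# Rough HDMI 2.0 link-budget check (all figures assumed, not from the thread):
# usable TMDS payload is ~14.4 Gbps (18 Gbps raw, 8b/10b encoding), and the
# pixel clocks are the standard CTA-861 timings including blanking.
HDMI20_PAYLOAD = 14.4e9   # bps
PIXCLK_4K120 = 1188e6     # Hz, 3840x2160@120
PIXCLK_4K60 = 594e6       # Hz, 3840x2160@60

# Simplified bits-per-pixel factors; real TMDS always packs 4:2:2 into 24 bpp.
CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def needed_bps(pixclk, bits_per_channel, chroma):
    return pixclk * bits_per_channel * CHROMA_FACTOR[chroma]

modes = [
    ("4K120  8-bit 4:4:4", PIXCLK_4K120, 8, "4:4:4"),
    ("4K120  8-bit 4:2:0", PIXCLK_4K120, 8, "4:2:0"),
    ("4K120 10-bit 4:2:0", PIXCLK_4K120, 10, "4:2:0"),
    ("4K60  10-bit 4:2:2", PIXCLK_4K60, 10, "4:2:2"),
]
for label, pixclk, bpc, chroma in modes:
    need = needed_bps(pixclk, bpc, chroma)
    verdict = "fits" if need <= HDMI20_PAYLOAD else "too much"
    print(f"{label}: ~{need / 1e9:4.1f} Gbps -> {verdict} for HDMI 2.0")
```

That’s why 4K120 over HDMI 2.0 forces 8-bit plus subsampling, and why dropping to 60 Hz buys the 10-bit back.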
You can also get a DP 1.4 to HDMI 2.1 adapter and get full HDR with 10-bit color and 4:4:4 chroma at 4K@120 Hz, all at the same time, no problem. The trouble is usually VRR, which tends to be very finicky or not work at all… :(
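For context on why the adapter route works at all: 4K120 10-bit 4:4:4 is more than an uncompressed DP 1.4 link can carry, so as far as I know those adapters rely on DSC on the DisplayPort side (my understanding, not something anyone in this thread has confirmed). Same rough math as above, again with assumed figures:

```python
# Same rough math for the DP 1.4 -> HDMI 2.1 adapter path (assumed figures).
PIXCLK_4K120 = 1188e6                  # Hz, CTA-861 3840x2160@120 incl. blanking
need = PIXCLK_4K120 * 10 * 3           # 10 bpc, 4:4:4 -> ~35.6 Gbps

DP14_PAYLOAD = 25.92e9                 # HBR3 x4 lanes, after 8b/10b encoding
HDMI21_PAYLOAD = 42.67e9               # FRL 48G, after 16b/18b encoding

print(f"4K120 10-bit 4:4:4 needs ~{need / 1e9:.1f} Gbps")
print("fits the HDMI 2.1 output side:", need <= HDMI21_PAYLOAD)   # True
print("fits uncompressed DP 1.4 input:", need <= DP14_PAYLOAD)    # False -> DSC on the DP link
```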
The point still stands that it’s harder/less-supported on AMD.
And for the record, VRR also works on my setup, but I have it disabled due to flickering on the TV side.