The AMD part is actually the opposite, since AMD drivers on Linux can’t do HDMI 2.1, but NVidia can.
That's not quite true: you can do HDR at 4K @ 120Hz over HDMI 2.0, but you'll be limited to 8 bits per channel, which will exhibit pronounced color banding, especially noticeable in sky gradients. If you lower either the resolution or the refresh rate you can get 10-bit back too.
HDMI 2.0 can also support 4K 120Hz, but only with 4:2:0 chroma subsampling. That's fine at a typical TV viewing distance with 2x HiDPI scaling, but it sucks for desktop usage, especially with no HiDPI scaling.
You can also get a DP 1.4 to HDMI 2.1 adapter and get full HDR with 10-bit color and 4:4:4 chroma at 4K@120Hz at the same time, no problem. The trouble is usually VRR, which tends to be very finicky or not work at all… :(
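For a rough sense of why those limits fall where they do, here's a back-of-the-envelope bandwidth sketch in Python. The ~5% blanking overhead, the usable-payload figures per link, and the helper names are my own assumptions; real links also have per-mode timing constraints (TMDS character rate, FRL lanes, DSC) that this ignores, so treat the numbers as ballpark only.

```python
# Back-of-the-envelope video data rate vs. usable link payload.
# Assumptions: ~5% blanking overhead (reduced-blanking-ish timings),
# 8b/10b encoding for HDMI 2.0 TMDS and DP 1.4, 16b/18b for HDMI 2.1 FRL.

BLANKING_OVERHEAD = 1.05  # assumed extra pixel time for blanking intervals

# Approximate usable payload after line encoding, in Gbit/s
LINKS = {
    "HDMI 2.0 (18 Gbps TMDS, 8b/10b)": 18.0 * 8 / 10,   # ~14.4
    "DP 1.4 HBR3 (32.4 Gbps, 8b/10b)": 32.4 * 8 / 10,   # ~25.9
    "HDMI 2.1 FRL (48 Gbps, 16b/18b)": 48.0 * 16 / 18,  # ~42.7
}

# Samples per pixel relative to one channel, per chroma format
CHROMA_FACTOR = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

def data_rate_gbps(width, height, hz, bits_per_channel, chroma):
    """Approximate video payload for a mode, in Gbit/s."""
    pixels_per_sec = width * height * hz * BLANKING_OVERHEAD
    return pixels_per_sec * bits_per_channel * CHROMA_FACTOR[chroma] / 1e9

modes = [
    (3840, 2160, 120, 10, "4:4:4"),  # the "full fat" HDR desktop mode
    (3840, 2160, 120,  8, "4:2:0"),  # what HDMI 2.0 ends up carrying at 4K120
    (3840, 2160, 120, 10, "4:2:0"),  # why you're stuck at 8-bit there
    (3840, 2160,  60, 10, "4:2:2"),  # lower the refresh rate, get 10-bit back
    (2560, 1440, 120, 10, "4:4:4"),  # lower the resolution, get 10-bit back
]

for w, h, hz, bits, chroma in modes:
    need = data_rate_gbps(w, h, hz, bits, chroma)
    fits = [name for name, cap in LINKS.items() if need <= cap]
    print(f"{w}x{h}@{hz}Hz {bits}-bit {chroma}: ~{need:.1f} Gbps "
          f"-> {', '.join(fits) or 'none of these'}")
```

One caveat on the adapter route: plain DP 1.4 doesn't have the raw bandwidth for 4K@120Hz 10-bit 4:4:4, so those DP 1.4 to HDMI 2.1 adapters generally lean on DSC on the DisplayPort side to reach that mode.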