4K @ 120Hz HDMI 2.1 - AMD 6000 series, omg

Hi,
Hardware → AMD 6900 XT + HDMI 2.1 LG C9

I've kinda given up and I need help.

I cannot make my 6900 XT output 4K@120Hz like in Windows. With an Nvidia 3080 and the proprietary drivers there is no issue whatsoever, but with this 6900 XT I am limited to 4K@60Hz. It ALMOST seems like it is locked to HDMI 2.0. As of this moment I am running Kubuntu 20.10 with kernel 5.11-rc7 from the Ubuntu kernel PPA.

The second issue is pixel format. I find it difficult to believe I have to hack the EDID to get RGB Full instead of YCbCr, while in the Nvidia world it is a 3-second job.
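For reference, checking what the driver exposes versus what the TV's EDID advertises can be done along these lines (the connector path is only an example, check /sys/class/drm/ for yours):

xrandr --query                                       # list the modes the driver currently offers
edid-decode < /sys/class/drm/card0-HDMI-A-1/edid     # decode the EDID the kernel read from the TV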

Man, I am really trying to leave Windows, but this is no fun.

Can you chaps help me out?

Thank you

All the TVs I have had experience with have something where bandwidth is limited on the TV end. You have to turn on full bandwidth on the TV side to get it to work. I have a Samsung Q90R and a Vizio PQX; both had similar settings for improved bandwidth. I was trying to get 60Hz, 4:4:4, HDR and 10-bit to work, and it wasn't an option until I had the bandwidth limiter disabled. Samsung calls it Input Signal Plus, Vizio calls it HDMI Mode, etc. I had an LG in my possession and I believe it had this feature, but I do not recall what it is called. On my Samsung, whenever I changed video cards or ports on the video card, it reset to being disabled. So this could be your problem.

This is what it looks like on my Samsung.

I mean, few people are asking for that in the Linux world, so pushing for it will go nowhere fast.

I’m currently using an LG CX 48" in Linux with 4k@120 and 8b RGB. I’m using an RX 6900 XT as GPU.

It took quite a lot of effort to get here. I tried the EDID route, but didn't get beyond 4k@120 YCbCr 4:2:0, which is objectively a terrible experience. I also tried manually setting modes with xrandr (roughly as sketched below), but that also led nowhere.
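For anyone who wants to retrace that xrandr route, it went roughly like this; the output name is only an example, and the modeline is just the standard CTA 4k@120 timing (SVD 118) written out by hand:

xrandr --newmode "3840x2160_120" 1188.00 3840 4016 4104 4400 2160 2168 2178 2250 +hsync +vsync   # define the mode
xrandr --addmode HDMI-A-0 "3840x2160_120"                                                        # attach it to the output
xrandr --output HDMI-A-0 --mode "3840x2160_120"                                                  # try to switch to it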

I eventually concluded that the problem was in the kernel. My thinking was that it was somehow deciding on the ‘wrong’ maximum pixel clock, causing the AMD kernel driver to drop the 4k@>60Hz RGB modes that are advertised in the EDID (SVD 117 and 118). After a surprisingly short search through the AMD driver, I made a few small changes to ignore the max pixel clock sent by the monitor, and that bumped me up to 4k@120 8b RGB. 10b should be doable with a similar small change.

It’s not perfect: sometimes after logging in, the monitor shows ‘no signal’. I then have to restart the monitor to display an image. Audio over HDMI is also broken when using 120Hz.
But oh man, I can’t imagine using a computer at less than 120Hz anymore. This is still by far the best solution I could find.

I downloaded the kernel 5.13.9 source code.

I made two changes:
In drivers/gpu/drm/amd/display/amdgpu_dm/amdgpu_dm.c:adjust_colour_depth_from_display_info() on line 7168 I changed
if (normalized_clk <= info->max_tmds_clock) {
to
if (normalized_clk <= 12000000) {

In drivers/gpu/drm/amd/display/dc/dce/dce_link_encoder.c:dce110_link_encoder_validate_hdmi_output() on line 784 I changed
enc110->base.features.max_hdmi_pixel_clock))
to
(adjusted_pix_clk_khz > 12000000))

I compiled with this script:
sudo echo "need root"                                    # prompt for the sudo password up front
make -j24                                                # build the patched kernel
sudo make modules_install -j24                           # install the kernel modules
sudo make install                                        # install the kernel image
sudo rm /boot/vmlinuz-5.13-4k120-x86_64                  # remove the previous copy, if any
sudo cp /boot/vmlinuz /boot/vmlinuz-5.13-4k120-x86_64    # keep the patched image under its own name
sudo mkinitcpio -p linux513patched                       # regenerate the initramfs (this preset name is specific to my setup)

Reboot, select the new kernel, enjoy 4k120 in Linux using an AMD 6xxx GPU :smiley:
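A quick way to sanity-check after rebooting (the connector name is whatever xrandr reports on your system):

xrandr | grep -A3 HDMI    # the 3840x2160 line should now include a 120.00 entry, starred once it's active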

This is obviously an ugly solution btw. A more correct approach would be to use max(max_tmds_clock, max(pixel clock for each SVD advertised by the EDID)) as the maximum pixel clock instead of just the constant 12000000.
Even better would be actual HDMI 2.1 support, but hey, that seems to be off the table for now.
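To make that idea concrete, here is a small standalone sketch; the function name, the kHz unit and the arguments are made up for illustration, this is not actual amdgpu code:

/*
 * Sketch only: the limit described above, i.e.
 * max(max_tmds_clock, max pixel clock over the SVDs the EDID advertises).
 */
#include <stddef.h>

static int effective_max_pixel_clock_khz(int max_tmds_clock_khz,
                                         const int *svd_pixel_clocks_khz,
                                         size_t num_svds)
{
    int max_khz = max_tmds_clock_khz;
    size_t i;

    /* take the highest pixel clock among the advertised SVDs,
     * but never go below what max_tmds_clock already allows */
    for (i = 0; i < num_svds; i++) {
        if (svd_pixel_clocks_khz[i] > max_khz)
            max_khz = svd_pixel_clocks_khz[i];
    }
    return max_khz;
}

adjust_colour_depth_from_display_info() would then compare normalized_clk against that value instead of the bare info->max_tmds_clock or the hard-coded 12000000.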


@tvd

Well, that's some dedication lol. I couldn't be bothered to solve problems like this, but that's why I stay in Windows land for my desktop.

Appreciate you taking the time to post the solution for others!

I have only mild regrets about how much time I’ve spent on this :smiley:


@tvd

Hey, I have a very similar setup to your own - an LG C1 (rather than CX) 48" monitor and a Radeon RX 6900 XT GPU.

I’m interested in reproducing what you’ve accomplished here, though I have a couple of questions:

  1. Where do you get these 12000000 values from?
  2. You say “10b should be doable with similar small changes”, could you elaborate on this? I can drive my monitor at 4K 120Hz 10-bit 4:4:4 in Windows, so I’d like to do the same in Linux if at all possible.
  3. Did you have to configure any settings in the display itself to accomplish this?

Thanks in advance!

Edit: I tried your suggested changes (though on 5.14.15 rather than 5.13.9; the context you provided was sufficient for me to make the changes in the appropriate place despite things moving around), and I’m still seeing chroma subsampling at 4k 120Hz.

I just tried this on Linux 5.17.4 with an LG C1 48" and unfortunately it didn’t seem to help.

I have a Radeon RX 6900 XT, and I'm using a certified HDMI 2.1 cable. I am unable to run 4k@120Hz no matter what I try.

The strangest thing is that I can do 4096x2160 @ 120Hz, which is scaled on the display. I am out of ideas, and I’m frustrated. I’ve ordered a DisplayPort to HDMI adapter that claims to support 4k @ 120Hz 4:4:4 10-bit. It does NOT do VRR though, but I suppose I’ll probably just have to live without that.

LG C1 48" on a 6800 XT on Arch Linux (EndeavourOS), though only in YCbCr 4:2:0 (see below). 30-bit also works, but some stuff doesn't like it yet (support is trickling out). 120Hz with FreeSync/VRR enabled works.

The only issues: no HDR/AutoHDR yet, and VRR doesn't work with a dual-monitor setup. I also sometimes get stutters in games with dual monitors enabled, so I have it set up as a single monitor at the moment, which is fine, it's a 48" after all!

I use X11 since Wayland is still not ready; it's missing some important features I want that at the moment I can only get in X11 (like colour/gamma controls).

Using Plasma 5 at the moment with the picom compositor, with unredirection and the experimental backends enabled (gives the best results). Works well in XFCE too. GNOME's Mutter does the same thing, btw.
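In case anyone wants to copy the picom part, the invocation is along these lines (with KWin's own compositing turned off first; flags from memory, so check them against your picom version):

picom --experimental-backends --backend glx --unredir-if-possible &    # GLX backend, unredirect fullscreen windows, experimental backends on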

It's just 4:2:0 mode… I was wrong all along. Sad times!

How exactly did you manage to get 4:4:4 and 120Hz working? I tried editing the EDID, all types of AMD clock patches… nothing works. Using the latest Arch and still no luck :frowning: Any chance you have some pointers on how to get this going?

Yeah, I was 100% wrong. Turns out the monitor was running YCbCr 4:2:0 mode; the more you know.

Apparently full HDMI 2.1 support isn't even available in the NVIDIA closed-source drivers (from what I'm told).

You can figure out what mode the screen is in by pressing the green button on the remote 7 times.

Anyway, I'm taking a step back from Linux until this whole HDMI 2.1 thing gets sorted. Possibly DisplayPort 2.0 cables will resolve the issue, but that is a NEXT GEN GPU thing.

It's a depressing situation for sure. I don't really like ANY of the DP 34" monitor options, which is why I got this C1 in the first place.

The only thing we can hope for is the adapter solution, or the HDMI Forum stopping being dicks about the licensing!


Yeah… the best I managed to achieve is RGB or YCbCr 4:4:4 at 60Hz… For work and text that's way, way better than 4:2:0… and even at 60Hz the C1 feels way better, with less smearing and stuff… I'll take this at 60 over the last 144Hz VA panel I tried… Hoping AMD finds a way to release HDMI 2.1… Another option would be to give WSL another shot…

Hehe, that didn't last long. Back on Linux. I couldn't get over the issues I had under Windows, mainly that the winbtrfs driver was not working right (games stuttered as they would intermittently be cut off from data reads due to a driver issue).

So I thought, hey, I'll just use WSL… well, that was broken as hell (couldn't even enable it after following ALL the guides/fixes), and you need to enable ALL the telemetry stuff, which is a bit sad!

Ultimately I fell back to Linux. I'll put up with the 10-20 fps drop in some 4K titles (6800 XT), and the colour of YCbCr 4:2:0 at 4K 120Hz isn't really that bad to be honest, especially if you adjust your gamma correctly.
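The gamma adjustment can be done along these lines with xrandr (or through your desktop's colour settings); the output name and the values are only examples, tune by eye:

xrandr --output HDMI-A-0 --gamma 0.95:0.95:0.95    # per-channel gamma tweak on the TV output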

I'm so pissed at the direction Win10/11 is going; it's just so infuriating to work with now. I spent half a day messing with WSL issues because Microsoft wants to keep Linux partitions behind that, and behind an Insider dev build environment… (again, winbtrfs is kind of broken).

It's going to be exFAT all over again. Wide adoption of a proprietary protocol and zero movement until the originating company realizes it's stifling open-source developers.

You are absolutely correct; it's NEVER happening this GPU generation, and adapter companies are the only ones who can save us from this. Club 3D needs to be sent a letter describing this as a legitimate concern.

This is likely the same reason Gigabyte can't include 48Gbps HDMI: FRL is closed source and guarded at trade-secret level (several NDA layers before you can even see a brief of the spec).

Edit: Started a new thread for discussion on this:

That is amazing!
How has this hack been working for you after all these months? Is it stable?
Since what you are doing is an overclock of the TMDS signal, do you believe there might be risks? Or, at worst, do we just get an unstable connection?

I’m going to try this myself with my LG C1 and I’ll make sure to post the results.

There's discussion about this issue going on at gitlab.freedesktop.org/drm/amd/-/issues/1417 (the forum wouldn't let me post it as a proper clickable link).

You might want to chime in and explain in more detail what you’re doing. Perhaps even provide a patch. :grin:
