I have a 4K 60Hz TV (Vizio V505-G9) and I am unable to push 60Hz on Ubuntu 20.04. I have enabled “Full UHD Color” in my TV settings to allow the 60Hz signal through, but I’m only offered 3840x2160 at 30Hz at most. I tried forcing it through xrandr, but with no luck. I know that my hardware supports it, since I can run 4K 60Hz on Windows. For reference, I have an AMD RX 570.
$ cvt 3840 2160 60
# 3840x2160 59.98 Hz (CVT 8.29M9) hsync: 134.18 kHz; pclk: 712.75 MHz
Modeline "3840x2160_60.00" 712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync
$ xrandr --newmode "3840x2160_60.00" 712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync
X Error of failed request: BadName (named color or font does not exist)
Major opcode of failed request: 140 (RANDR)
Minor opcode of failed request: 16 (RRCreateMode)
Serial number of failed request: 43
Current serial number in output stream: 43
$ xrandr --addmode HDMI-A-0 3840x2160_60
xrandr: cannot find mode "3840x2160_60"
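One thing worth noting in the commands above: the name passed to --addmode ("3840x2160_60") does not match the name registered with --newmode ("3840x2160_60.00"), and xrandr needs them to match exactly. A sketch of the full sequence with a consistent name follows; HDMI-A-0 is this machine's output name, and a BadName from RRCreateMode can also mean a mode with that name already exists:

```shell
# Hedged sketch: register the CVT modeline, attach it to the output, then
# switch to it. The timings are the cvt output quoted above; adjust the
# output name to whatever `xrandr` lists on your machine.
MODE_NAME="3840x2160_60.00"
MODELINE="712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync"
if command -v xrandr >/dev/null 2>&1; then
    # $MODELINE is deliberately unquoted so each timing is its own argument
    xrandr --newmode "$MODE_NAME" $MODELINE || true   # BadName if the name is already taken
    xrandr --addmode HDMI-A-0 "$MODE_NAME" || true
    xrandr --output HDMI-A-0 --mode "$MODE_NAME" || true
fi
```

These changes are lost on reboot; persisting them needs an xorg.conf snippet or a login script.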
When running the xrandr command, I get this:
Screen 0: minimum 320 x 200, current 3840 x 2160, maximum 16384 x 16384
DisplayPort-0 disconnected (normal left inverted right x axis y axis)
DisplayPort-1 disconnected (normal left inverted right x axis y axis)
HDMI-A-0 connected primary 3840x2160+0+0 (normal left inverted right x axis y axis) 1096mm x 616mm
3840x2160 30.00*+ 25.00 24.00 29.97 23.98
Strangely enough, 4K 60Hz only seems to work for me on Linux so far when using Fedora’s GNOME desktop environment. I have no idea why that would be the case either. Hopefully that helps somehow.
Thank you for the reply. “Full UHD Color” is a setting on the TV that I need enabled to use things like HDR. Additionally, it needs to be enabled to allow 4K 60Hz; otherwise the TV can only handle up to 4K 30Hz. I’m not sure about the bandwidth thing, but the description for that setting is "If the device supports HDMI 2.0 then Full UHD Color is available." I can confirm this setting is enabled and that the TV is connected to the HDMI 2.0 port.
If the TV is recognized as a 10 bit display (which it has to be for HDR, I think) and the computer is sending a 4:4:4 signal, 4K 60 might exceed what some HDMI spec is rated at. Windows and Fedora might just be a bit … flexible on that.
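That hunch can be sanity-checked with rough numbers. The standard CEA 4K60 mode runs at a 594 MHz pixel clock, and HDMI 2.0's TMDS character rate tops out at 600 MHz; going from 8 to 10 bits per component raises the required TMDS clock by 25%, which is exactly where it stops fitting (the figures below are my own back-of-envelope math, not from this thread):

```shell
# Back-of-envelope: does 4K60 4:4:4 fit under HDMI 2.0's 600 MHz TMDS ceiling?
pclk_8bpc=594                                             # MHz at 8 bits per component
pclk_10bpc=$(awk 'BEGIN { printf "%.1f", 594 * 1.25 }')   # 10 bpc raises the clock 25%
echo "8-bit 4:4:4 needs ${pclk_8bpc} MHz  (fits under 600)"
echo "10-bit 4:4:4 needs ${pclk_10bpc} MHz (exceeds 600)"
```

So a source insisting on 10-bit 4:4:4 would have to drop to 4:2:2/4:2:0 or back to 30Hz, which matches the symptom described here.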
I am just guessing here but whenever a TV is in the mix instead of a monitor, that’s where I put the blame until proven wrong.
I don’t really think it’s an issue with the TV itself. It is capable of handling up to 4K HDR 60Hz and even supports 4:4:4 chroma like you would expect from a monitor. Additionally, the TV working in Windows on the same machine seems to point to some kind of software issue. I do appreciate your insight though. I can try uploading some kind of video comparison between it running on Windows and Ubuntu if that would help. I will say that it didn’t work at 60Hz out of the box even on Windows. By default, it goes to 30Hz; I have to go to the display properties and select the 60Hz option for 4K, which is only available when I have the “Full UHD Color” setting enabled. I noticed that Ubuntu is also detecting my TV as a 49-inch screen instead of a 50-inch screen, so I don’t really think it’s detecting it properly.
Then I saved it as “xorg.conf” and rebooted. By doing this, I accidentally crashed Xorg. I had to do:
sudo rm /usr/share/X11/xorg.conf.d/
from the TTY1 terminal (Ctrl+Alt+F2 from the login screen) to be able to log in again. I’m still having issues with 4K 60Hz running under Ubuntu. Here’s a screenshot from my Windows partition using my TV’s menu to show that the hardware is capable of supporting 4K 60Hz:
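For anyone retrying the xorg.conf route: a safer variant is a small snippet under /etc/X11/xorg.conf.d/ rather than /usr/share/X11/xorg.conf.d/, since local overrides belong in /etc and a single file there is easy to delete from a TTY if X fails to start. The file name and the PreferredMode option are my assumptions; the Modeline is the cvt output from earlier in the thread:

```
# Hypothetical /etc/X11/xorg.conf.d/10-monitor.conf (a sketch, untested
# on this exact TV). The Modeline is the cvt output quoted earlier.
Section "Monitor"
    Identifier "HDMI-A-0"
    Modeline "3840x2160_60.00" 712.75 3840 4160 4576 5312 2160 2163 2168 2237 -hsync +vsync
    Option "PreferredMode" "3840x2160_60.00"
EndSection
```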
I’m having the same problem with a Vizio V436-G1 and Linux Mint 20 (which is based on Ubuntu 20.04) and an Nvidia RTX 2060. After contacting Vizio support and doing my own research, I think the root problem is a Vizio bug that they refuse to acknowledge. My PC works perfectly fine in Linux, displaying 4K@60Hz at 4:4:4 chroma while connected to a Samsung 4K TV, but not when connected to this Vizio TV. While connected to the Vizio, the picture appears to only have 4:2:0 chroma, causing bad artifacts in fine text (even when the TV is set to computer mode, and even when the TV has UHD Color enabled).
In Windows I’m able to work around the issue by following Vizio support’s suggestion of forcing the video card’s output color depth to 8 bit and the color format to YCbCr444. Sadly, the Linux video driver doesn’t provide an option to do the same thing.
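One avenue I have not seen tried here (and have not tested on this TV) is overriding the broken EDID entirely with the kernel's EDID-firmware loader, which works with the in-kernel modesetting drivers such as amdgpu; the NVIDIA proprietary driver may ignore it. The file name below is hypothetical and would be a corrected binary EDID dump, e.g. captured in Windows:

```shell
# Sketch, assuming a corrected EDID dump named vizio-fixed.bin exists.
EDID_BIN=vizio-fixed.bin
if [ -f "$EDID_BIN" ]; then
    # Install it where the kernel's firmware loader can find it
    sudo install -D -m644 "$EDID_BIN" /lib/firmware/edid/"$EDID_BIN"
fi
# Boot with this parameter (add it to GRUB_CMDLINE_LINUX_DEFAULT in
# /etc/default/grub, then run update-grub). The connector name here is
# an assumption -- check /sys/class/drm/ for yours.
KPARAM="drm.edid_firmware=HDMI-A-1:edid/$EDID_BIN"
echo "kernel parameter: $KPARAM"
```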
After reading the Vizio TV’s EDID, I think the root problem is that Vizio TVs don’t properly advertise their capability to display 4K@60Hz@4:4:4, so Linux PCs don’t attempt to provide a signal in that mode.
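For anyone who wants to check their own TV's EDID the same way: it is exposed in sysfs, and the first detailed timing descriptor starts at byte 54, with its first two bytes holding the pixel clock in 10 kHz units, little-endian. A small decode sketch (the sysfs path is an assumption; adjust to your GPU and connector):

```shell
# Decode an EDID detailed-timing pixel clock from its two raw bytes
# (low byte first, values in decimal), returning MHz.
decode_pclk() {
    awk -v lo="$1" -v hi="$2" 'BEGIN { printf "%.2f", (lo + hi*256) / 100 }'
}
# A 594 MHz 4K60 mode is stored as 59400, i.e. bytes 0x08 0xE8:
decode_pclk 8 232   # -> 594.00
# On a live system, pull the bytes straight from sysfs (path varies):
#   od -An -tu1 -j54 -N2 /sys/class/drm/card0-HDMI-A-1/edid
```

The `edid-decode` tool (package of the same name on Ubuntu) prints the whole advertised mode list, which makes it easy to see whether 4K60 is actually offered.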
Since I haven’t figured out a work-around in Linux, I’m just going to return the Vizio TV and get a Samsung.
A Gefen 4K HDR HDMI splitter with EDID management might let you upload a custom EDID to it and go from there. Other EDID emulators designed for 4K 4:4:4 might not work with their fixed presets.
So, I just installed 20.04 on my HTPC which also is running a 570 and it’s fine at 4k 60p. (I know, “works for me” but wanted to mention it.) I would have suggested bad cables next but with @Chris_Preimesberger describing what he found, …
Maybe there is a patch for the TV?
@FurryJackman If my google-fu is correct, we are talking about TVs that cost less than one of those splitters?
Which is the big problem: there’s no affordable EDID Detective for 4K60 HDMI connections that isn’t the legally troubled HDFury. It’s only the splitter that has proper EDID management for HDMI 2.0 connections.
Can confirm that using Wayland on Ubuntu 20.04 resolves this problem. I was experiencing this same issue, and tried both this same Vizio TV and a Samsung one. While the Vizio on Xorg fails to even list the 4K@60Hz mode in xrandr (though it does still properly show up in the EDID), the Samsung TV did list it, but gave a “no input detected” error on anything above 30Hz. Some of the suggestions in this thread can reduce the Vizio issue to the Samsung one, but what ended up working for me was just switching Ubuntu from its Xorg default to Wayland. With Wayland the resolution was autodetected and everything just worked (to do this, select the gear icon on the login screen when entering your password and choose Wayland). Just wanted to spell this out for anyone who finds this thread in the future; there are certainly pros and cons of Xorg vs. Wayland, but it’s the quickest way to get this up and running.
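A quick way to confirm which session type you actually landed in after logging back in (the gear-icon choice is easy to lose on the next reboot):

```shell
# Prints "wayland" or "x11" inside a graphical session; "unknown" here
# is just a fallback for environments where the variable is unset.
session="${XDG_SESSION_TYPE:-unknown}"
echo "session type: $session"
```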
Just to chime in, as I’ve spent my morning solving the same problem: I had to use DisplayPort and change the DisplayPort setting on my Samsung monitor from 1.1 to 1.2 to get 4K60 working. So smooth now though!
This is running through the Dell Thunderbolt dock and Pop!_OS 20.04, not Ubuntu.
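The 1.1-to-1.2 toggle matters because of link bandwidth, much like the HDMI discussion above. Rough numbers (my own math, with 8b/10b encoding overhead): DP 1.1's HBR rate gives 4 lanes × 2.7 Gbps = 10.8 Gbps raw, about 8.64 Gbps usable, while DP 1.2's HBR2 doubles that to roughly 17.28 Gbps usable:

```shell
# 3840x2160@60 at 24 bits per pixel needs pixel clock x bpp of bandwidth:
need=$(awk 'BEGIN { printf "%.2f", 594e6 * 24 / 1e9 }')
echo "4K60 8bpc payload: ${need} Gbps (over DP 1.1's ~8.64, under DP 1.2's ~17.28)"
```

So a monitor pinned to DP 1.1 simply cannot carry 4K60 and will cap at 30Hz, exactly the symptom the toggle fixed.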