Educate me on Linux HDR circa end of 2025

Amazon.com: Acer Nitro 31.5” 4K UHD 3840x2160 Gaming IPS MiniLED Monitor | AMD FreeSync Premium | 160Hz Refresh Rate to FHD 320Hz (DFR) | Up to 0.5ms | 1x Display Port 1.4 & 2x HDMI 2.1 Ports | XV325QK V3bmiipprx : Electronics <– apparently that is on sale for $450. I haven’t paid much attention to monitors for years and I don’t feel like sitting through hours of YouTube content to get up to speed. ChatGPT hallucinates too much to trust without domain knowledge, so I ask here:

  1. OLEDs can have burn-in. Does mini-LED suffer from any ‘bad’ effects from using it with static images or over long periods? I’m on my PC 10+ hours a day (I know, I know) most days.
  2. I know the blacks won’t be as inky as OLED, but are they close enough to not matter?
  3. What is support for HDR movies/games like on Linux (Arch) in 2025? I know I have to use Wayland to have any hope. The LTT video from a year or two ago claimed HDR on Linux wasn’t a thing, but I thought it had been at least partially implemented since then? The only games I own that even support HDR would be BG3, Helldivers 2, and Cyberpunk. I’m more interested in watching HDR movies, though. I assume VLC can do this? Or do I need a different player? Does it work at all?
  4. Is HDR better left to a dedicated TV? I’d still drive the TV from a mini-pc at least, but I could run Windows on that if I had to. My current monitors all work fine.

I’m an old gamer, so I no longer need 144hz. Despite what Lie-nuts Shill Tips says, I can’t notice 144hz on the desktop. There’s a minimum speed of motion I need to notice 144hz, and I just don’t play games like that anymore. I have a 144hz monitor at the moment, so this is lived experience.

I don’t know what else to ask. It’s a monitor, it displays things. If it’s going to look ‘meh’ I’d rather not spend my money. But I’m not about to drop 1k on a monitor either. I don’t need it, I’m just curious if the moment of (somewhat) reasonably priced (good) HDR is finally here?

Check if the YouTube channel Monitors Unboxed has a review for the monitor.

OLEDs can suffer burn-in, but somehow we survived decades of CRTs that also burned in. Mini-LED panels do not burn in, but they can suffer catastrophic backlight failure, as the LEDs are usually wired up in series: one failed LED and you lose an entire vertical strip of backlight, effectively junking the panel.

HDR will be more annoying on a mini-LED, as local dimming looks horrible in desktop apps, so you will need to constantly switch between HDR and SDR modes.


https://www.youtube.com/watch?v=Y_yA9mzfjjs reviews a previous version of the same panel, so maybe they fixed the OSD, but around 14:38 he mentions that if you want to toggle between SDR and HDR you have to go into the OSD every single time, and the OSD sucks. So… I’ma keep waiting, I guess. For $500 I’m not looking to compromise. $250, sure, but $500? The tech is still warming up, I guess.

HDR has kind of arrived on Linux, at least for certain desktop environments: KDE, GNOME, Hyprland, and Sway all support it now.

Getting e.g. YouTube in HDR is still tricky: it’s experimental in Firefox and crashes on my machine, and Chromium and derivatives have it in beta and it’s also still flaky. But the implementations are very recent and moving pretty fast. NVIDIA seems trickier than Intel/AMD at the moment.

For gaming it works, but only via Gamescope AFAIK.
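
For reference, the usual way people do this is through the Steam launch options for a game, something along these lines (a sketch only: flag names are from memory, and what’s actually needed varies by gamescope version and GPU driver):

    # Run the game inside a nested gamescope session with HDR output enabled.
    # DXVK_HDR=1 asks DXVK/Proton to expose an HDR swapchain to the game.
    DXVK_HDR=1 gamescope -W 3840 -H 2160 -r 160 -f --hdr-enabled -- %command%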

mpv has support; for other players I’m not sure (I guess since GNOME/KDE have the fundamental support, as do FFmpeg etc., their players will follow soon?).
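
In case it’s useful, the mpv side is basically just two options; this is only a sketch of what I’d try first (option names may differ slightly between mpv releases):

    # Play an HDR file and pass HDR through to a compositor that supports it.
    # vo=gpu-next is the newer libplacebo-based renderer; target-colorspace-hint
    # tells the compositor what colorspace/peak brightness the video wants.
    mpv --vo=gpu-next --target-colorspace-hint=yes movie.mkv

The same two settings can live in ~/.config/mpv/mpv.conf instead (vo=gpu-next and target-colorspace-hint=yes) if you want them on by default.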

I’ve been using HDR on Hyprland for a few months, but mostly because it allows me to change brightness in software rather than on the monitor. 99.9% of what I do is SDR and looks the same, since it just gets mapped into the bigger HDR color space. No issues so far.
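
Roughly speaking, the Hyprland side is just a monitor rule along these lines; the exact option names have moved around between releases, so treat it as a sketch, check the wiki for your version, and take the numbers as placeholders:

    # 10-bit output with HDR color management; sdrbrightness/sdrsaturation control
    # how SDR content is mapped into the HDR signal (the "brightness in software" bit).
    monitor = DP-1, 3840x2160@160, 0x0, 1, bitdepth, 10, cm, hdr, sdrbrightness, 1.1, sdrsaturation, 0.98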

I remember LTT explaining why HDR on YouTube was a mess. I didn’t expect it to ever work, frankly. I own a lot of 4K Blu-rays, so that’s the primary HDR content I’d consume. I figure if I had an HDR monitor, I’d game in HDR as well, assuming it looks as good as everyone claims.

That said, I saw some videos comparing local dimming vs OLED, and they trade blows: OLED is better in dark scenes, mini-LED local dimming is better in bright scenes. I watch movies that are both light and dark, and I don’t want one monitor for each.

In some ways, I expect if I got a monitor and had it on my desk I might go ‘oh, this is great even when it’s sub-optimal’, but $450 is a lot to beta test an idea. Some of that is me just not wanting to return stuff if it doesn’t work.

Oh! One other question: Dolby Vision. On a TV, either it supports it or it doesn’t, but am I right in thinking that on PC the software (mpv, it sounds like?) will understand both HDR10+ and Dolby Vision, so I don’t need to worry about that for a monitor?

I still need to figure out how to encode Dolby Vision 4K with ffmpeg. On one hand, it seems like I might just need to leave it untouched. Or, if I do re-compress, extract the DV metadata, encode the video, then put the DV metadata back in. Because my encodes are almost lossless compared to the original, I imagine the HDR data should be drag and drop, but I’d need an HDR monitor to test that theory. Right now I only have two 4K DV titles (John Wick 3 and The Ten Commandments).
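
The workflow I keep seeing described (I haven’t verified it end to end myself) is exactly that: demux the HEVC, pull the RPU out with dovi_tool, re-encode, then inject the RPU back. Roughly, with names and arguments from memory, so double-check against the dovi_tool docs:

    # 1. Demux the raw HEVC stream from the source MKV.
    ffmpeg -i source.mkv -map 0:v:0 -c:v copy -bsf:v hevc_mp4toannexb source.hevc

    # 2. Extract the Dolby Vision RPU (the dynamic metadata) from it.
    dovi_tool extract-rpu source.hevc -o RPU.bin

    # 3. Re-encode the video however you like (x265 as an example; the HDR10 static
    #    metadata like mastering display / MaxCLL has to be carried over too).
    ffmpeg -i source.mkv -map 0:v:0 -c:v libx265 -crf 16 encoded.hevc

    # 4. Inject the RPU into the new stream, then remux it with the audio/subs.
    dovi_tool inject-rpu -i encoded.hevc --rpu-in RPU.bin -o encoded.dv.hevc

Note that disc DV is usually the dual-layer profile 7, which this kind of tooling normally converts to single-layer profile 8.1 along the way; dovi_tool has a mode for that.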

I doubt HDR10+ and DV will ever really work on Linux due to licensing and stuff.

I do believe the monitor should support it, though. I don’t know much about HDR10+ (since neither my monitor nor my TV supports it), but DV carries metadata that the display should read to map a scene to its capabilities, i.e. if the brightest element is 1000 nits and the monitor can only display 800, it should map accordingly. HDR10 does this too, but it’s more rudimentary and not dynamic per scene. The difference between DV and HDR10 usually isn’t very big though, from what I see on my TV.

DV/HDR10+ is just extra metadata on top of an HDR10 base, though, so the fallback is always there, and that already gives you 90% of the improvement over SDR IMO.

I’m not sure how, but there are ways to remux a UHD Blu-ray to MKV with HDR and even DV. Google is your friend, but you need a supported drive and probably also to flash its firmware. The files do exist, and I can play them from Jellyfin, a DLNA server, or a USB stick on my LG OLED and MacBook, or in mpv on Linux as HDR10. Looks the same as a native UHD Blu-ray on my OLED TV.

Personally I use an LG C2 42″ OLED as a monitor and a C8 55″ as a TV, and I’m happy with both. If you have the desk space it’s nice, but the desk needs to be pretty deep since the pixel pitch is large.

My plan for watching HDR movies is to watch them from a hard drive; I already have a toolchain to rip them to files. The main reason? I like to skip around in movies a lot, and waiting for optical sucks compared to how fast HDDs are. Plus, for things like Star Trek TNG, I can edit out the intros and credits, since I never watch those.
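
The trimming part is easy with a stream copy, something like this (times are made up, and with -c copy the cut snaps to the nearest keyframes rather than the exact timestamps):

    # Skip the first 90 seconds and keep the next 42 minutes, copying every stream as-is.
    ffmpeg -ss 00:01:30 -i episode.mkv -t 00:42:00 -map 0 -c copy episode.trimmed.mkv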

Sorry, many words to say: “I will only ever be watching HDR content played through some form of computer, be it a mini PC, my desktop, or some other ‘full computer’.” I have zero interest in owning an actual Blu-ray player, outside of the slim drive I use for format shifting. That’s why I assumed the player software would handle DV/HDR10+, but now that I understand there’s metadata that goes to the display device itself… bleh, what a PITA. Still, like you say, it falls back to normal HDR, so it’s probably pretty minor.

Ah, if you can already rip, you probably won’t have too much trouble. HDR does work for me in mpv, but I haven’t experimented too much since I prefer to watch on the TV over the PC.