(QD-|W)OLED burn in

This is sort of a companion thread to the other one I created, and the topic has also been discussed elsewhere.

A few people have started to strongly recommend against buying an OLED due to burn-in issues; QD-OLED in particular seems to be affected. (See this rtings article, which investigates the issue.) However, if I remember correctly from previous videos, @wendell loves his OLED screens.

Now, I’m wondering what other people’s experiences with OLED, and especially QD-OLED, have been with regard to burn-in.

I have an LG CX with almost 10,000 hours on it and no sign of burn-in, for what that’s worth.

1 Like

Perhaps it is time to revisit the good old screensaver?

Burn-in can be greatly mitigated if you make sure to run a full-screen movie or game for 30 minutes a day, ideally one that updates the whole screen, preferably with transparent HUD elements.

I would say the advantages outweigh the disadvantages. Advising against QLED due to burn-in is like advising against NVMe because of its limited write cycles, or against an EV because its charge cycles are limited. Yes, it exists; yes, you should be mindful of it; but the problem is blown way out of proportion. Might as well not drive a car because it could kill you.

1 Like

I assume that’s a WOLED then. Do you use the CX for gaming, movies and productivity or just one particular task?

:+1:

I’m not even sure yet how much I will use the monitor of my desire for productivity; as mentioned in the other thread, its main focus will be gaming. That being said, I assume third-person games may result in faint burn-in in the middle of the screen, where the player character is usually located.

I assume QLED stands for QD-OLED, right? In that case, would you say the OLED G9 is superior to the 2160p G9? (See the other thread.) So, would you go for an OLED over a mini-LED and sacrifice the higher resolution? Might I ask why? (Colour accuracy? Response time? No ghosting? All of them?)

Actually… OLED is still way out of my price range; I usually stick to the sub-$200 market. As such, I have not done a ton of research on it, I’m just calling out FUD when I see it.

That said, check out some reviews from Monitors Unboxed; perhaps they can help you more.

1 Like

Bit of everything, but mostly games and movies.

1 Like

I have been using OLED since 2019, at first the 55" C9 and since 2021 the 48" C1.
It is the perfect monitor for me. The lower brightness compared to mini-LED is not relevant to me, because it is always relatively dark around my desk, so the OLED does not have to run at 100%, and I use HDR only for movies and games.
The funny thing is that with a black desktop background and a hidden taskbar, the monitor always appears only as big as the open window.

What I do against burn-in is move the HUD of Diablo 4 from time to time and run the OLED pixel-cleaning task every other week.
Pixel shift and automatic brightness adjustment are turned off.
One criticism I often read is that text clarity is not as good as on a normal LED panel. Maybe I’m blind, or I would have to see the monitors in direct comparison, but I can’t find anything to criticize. The monitor must be in game mode, though; otherwise the text rendering is horrible, that’s true.

1 Like

Since I will pick one of the new G9s either way, I have to wait for those reviews to be released. (However, I would have to wait roughly a year to know more about burn-in on gen 2 QD-OLED.) That being said, there is a more general HUB video on OLED burn-in.

By dark, do you mean having no window behind you, or actually dark? Also, I assume HDR is turned off because it drives the panel brighter and therefore wears the OLED faster, right?

I thought that was only the case for QD-OLED, due to colour fringing. (Which, apparently, is less of an issue with gen 2.)

I have an LG C1 48" with no apparent burn-in. I’m not that heavy a user of it, though (at most four hours per day), and I’ve set KDE to turn on a lock screen after ten minutes of inactivity. I also try to avoid leaving windows in the snap zones when I’m not using them, to avoid burning in a big cross.
Wendell has had the same model TV for about as long as I have, but probably with a lot more on hours than mine, and I was curious to hear what his experience had been, so I asked him in the Q&A. He said it’s fine, just run the pixel cleaner every now and then.

Just bear in mind that this model of TV doesn’t have any DisplayPort inputs, only HDMI. And if you use an AMD GPU on Linux, that means no 4K 120 Hz without dropping to 4:2:2 chroma, because the HDMI consortium are total gits. I have discovered that, while it doesn’t have any DisplayPort connectors, it does still support the DisplayPort protocol, so if you have a USB-C to HDMI cable and a USB-C port with DP alt mode enabled, it’ll work just fine: 4K, 120 Hz, 10 bpc, 4:4:4, etc. My motherboard has one of those but my GPU doesn’t, so I plug the screen into the iGPU and let the dGPU render into a framebuffer, like on a laptop. It’s not an ideal monitor, but it does give spectacular screen space for work, and superb picture quality/immersion for games. Elite Dangerous with a space mouse and a webcam head tracker is a spectacular experience.
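The bandwidth squeeze described above can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch, not a link-layer model: it assumes the standard CTA-861 raster for 4K 120 Hz (4400×2250 total, giving a 1188 MHz pixel clock), average bits per pixel by chroma format, and the nominal HDMI 2.0 TMDS (18 Gbps, 8b/10b) and HDMI 2.1 FRL (48 Gbps, 16b/18b) link rates, ignoring real-world packing overhead:

```python
# Rough check of why 4K 120 Hz 10-bit 4:4:4 doesn't fit an HDMI 2.0-class
# link, while 4:2:0 does. Simplifications: average bits/pixel per chroma
# format; nominal link rates minus line-coding overhead only.

PIXEL_CLOCK_HZ = 4400 * 2250 * 120        # CTA-861 total raster for 4K120

SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

HDMI20_EFFECTIVE_GBPS = 18 * 8 / 10       # 18 Gbps TMDS, 8b/10b coding
HDMI21_EFFECTIVE_GBPS = 48 * 16 / 18      # 48 Gbps FRL, 16b/18b coding

def data_rate_gbps(chroma: str, bit_depth: int) -> float:
    """Average video data rate for 4K120 at the given chroma/bit depth."""
    bits_per_pixel = SAMPLES_PER_PIXEL[chroma] * bit_depth
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

for chroma, depth in [("4:4:4", 10), ("4:2:2", 10), ("4:2:0", 8)]:
    rate = data_rate_gbps(chroma, depth)
    print(f"{chroma} {depth}-bit: {rate:5.2f} Gbps  "
          f"HDMI 2.0: {'ok' if rate <= HDMI20_EFFECTIVE_GBPS else 'no'}  "
          f"HDMI 2.1: {'ok' if rate <= HDMI21_EFFECTIVE_GBPS else 'no'}")
```

By this estimate, 10-bit 4:4:4 needs about 35.6 Gbps (HDMI 2.1 territory), while 8-bit 4:2:0 squeaks in at roughly 14.3 Gbps against HDMI 2.0’s ~14.4 Gbps effective, which lines up with the TV falling back to YCbCr 4:2:0 when full HDMI 2.1 isn’t available.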

2 Likes

Yes, I have no window behind me; that’s better.

I mainly use Linux, which doesn’t support HDR anyway, apart from LibreELEC.
I only run Windows as a VM for games, and then I use HDR. And yes, HDR has a higher peak brightness, which puts more strain on an OLED.

1 Like

[quote=“Susanna, post:9, topic:199084”]
that means no 4k120Hz without dropping to chroma 4:2:2, because the HDMI consortium are total gits
[/quote]

No, I think it was a setting of the HDMI ports that got it running; 10-bit 4:4:4 at 120 Hz is not a problem, both RGB and YCbCr.

1 Like

For those of us using a TV as a monitor, something to keep in mind is that all TVs will convert the signal to 4:2:2 chroma for processing unless you’re using PC mode.

1 Like

I mean, anything other than PC mode is unusable for desktop use :wink:
I found it: I think HDMI Deep Color was off by default, and of course PC mode is the way to go.

1 Like

You missed the part where I specified on Linux. It was never an issue on Windows.

HDMI consortium literally specified “not allowed to implement HDMI 2.1 in anything open source”. Utter. Gits.

3 Likes

Oops, sorry, you are right; those suckers should burn in hell…
By pressing the green button with the two dots on the remote seven times in a row, the TV shows the colour bit depth.
YCbCr 4:2:0 even at 100 Hz; with 60 Hz it is 4:4:4, damn it!

edit: looks like AMD is working on it

5 Likes

Sorry for not responding for quite some time. I had to get ready for an exam and was a bit busy.

That’s good to hear!

The only option for me would be the G9 OLED anyway. I’m not purchasing anything that is not a 32:9 screen.

Yeah, makes sense. I would use the monitor first and foremost for gaming, and I would naturally intend to turn HDR on.

1 Like

I assume @wendell uses it first and foremost for development and computer janitorial things, i.e. not content consumption but productivity.

1 Like

That’s my assumption as well, and in terms of burn-in it’s basically the worst-case scenario: a bunch of static squares all day with very little motion. You can watch films or play video games all day and never see even a hint of burn-in, but as a computer monitor an OLED is very quickly going to burn in taskbars, wallpapers, and a big central cross.

3 Likes

Personally, and now contrary to the statement made in the other thread, I would probably end up using the display 3-4 hours a day for work (i.e. browser, burp, vscode, terminal) and 2 hours for gaming. Naturally, I would not game on it every day, but I would also not use it for work every day.

Thus, the risk of burn-in might be higher than if I used the monitor for content consumption alone. That being said, it is unlikely that I would run it very bright, as I do not have any windows behind me.

Currently, it seems like most arguments against OLED come from people who have not used OLEDs :smiley: However, I also do not want to end up worrying every single time I use the screen.

Well, I do use an OLED TV as my monitor, and couldn’t be happier. I’m never going back to VA.

1 Like