Samsung TV not showing proper resolution

I have an old 22" Samsung TV, model LA22C450E1. I was trying to connect my desktop to it over HDMI, and it's not displaying the proper 1080p resolution; it's showing 1360x768 instead.
My specs -
CPU - i5 6500
GPU - Nvidia GTX 1060 3GB

It may not support 1080p at the refresh rates your card puts out.

I had the same issue with an old 32" LG TV - the PC would only display at 1360x768 or so there as well.

I don’t think there’s much you can do to fix it, outside of maybe clamping the refresh rate on your card down to 30Hz, if possible, and even then…

edit:
according to a quick Google search, the spec here:

is only 1680x1050, so maybe try aiming for that instead?
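In case it helps, here’s a rough sketch of forcing a specific mode (say 1920x1080 @ 30Hz, or 1680x1050 @ 60Hz) with `cvt` and `xrandr`. This assumes a Linux/X11 desktop, which the OP never actually confirms, and the output name `HDMI-0` is hypothetical - run plain `xrandr` first to see what yours is called. On Windows the equivalent is adding a custom resolution in the NVIDIA Control Panel.

```python
# Sketch: register and apply a custom mode with cvt + xrandr (Linux/X11 only).
import subprocess

OUTPUT = "HDMI-0"                          # hypothetical output name; check `xrandr` for yours
WIDTH, HEIGHT, REFRESH = 1920, 1080, 30    # try 1680, 1050, 60 as well

# Ask cvt for a modeline with those timings and pull the numbers out of it.
cvt = subprocess.run(["cvt", str(WIDTH), str(HEIGHT), str(REFRESH)],
                     capture_output=True, text=True, check=True)
modeline = next(line for line in cvt.stdout.splitlines() if line.startswith("Modeline"))
tokens = modeline.split()
mode_name, timings = tokens[1].strip('"'), tokens[2:]

# Register the mode, attach it to the HDMI output, then switch to it.
subprocess.run(["xrandr", "--newmode", mode_name, *timings], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, mode_name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", mode_name], check=True)
```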

I was able to set it to 1080p at 30Hz, but the text and colors look weird. I don’t know how to explain it; it looks blurry and aliased, I guess. The TV supports 60Hz, but I can’t say whether it supports 60Hz at 1080p or not.

That’s a different model than the one you searched for - you found the LA22A450C1, but I have the LA22C450E1.

Older TVs don’t support proper chroma, and sometimes you also have to deal with a non-standard subpixel layout (not RGB). Just don’t use that thing as a monitor; it’s not made for that.

My current monitor was giving me some issues, so I connected this TV temporarily.

1080 at 60Hz interlaced (1080i) is also working, but the text still looks weird. What is the difference between 1080p and 1080i?

I noticed the TV also has a game mode. I turned it on, but I can’t really notice any difference; the brightness and contrast dropped a bit, but nothing else seems changed.

Text doesn’t care about progressive or interlaced. What makes text look bad on TVs is chroma subsampling and a non-standard subpixel layout.
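If you want to see roughly what chroma subsampling does to small text, here’s a minimal sketch. It assumes Pillow is installed, `text.png` is a hypothetical screenshot of small text, and the halve-then-upscale step is only an approximation of what a 4:2:0 pipeline does:

```python
# Minimal sketch of why chroma subsampling blurs small text.
from PIL import Image

img = Image.open("text.png").convert("YCbCr")
y, cb, cr = img.split()

# Simulate 4:2:0: keep luma (Y) at full resolution, but halve the colour
# channels (Cb, Cr) and scale them back up - a rough stand-in for what the
# TV's processing does to the signal.
w, h = img.size
cb_low = cb.resize((w // 2, h // 2)).resize((w, h))
cr_low = cr.resize((w // 2, h // 2)).resize((w, h))

subsampled = Image.merge("YCbCr", (y, cb_low, cr_low)).convert("RGB")
subsampled.save("text_420.png")  # compare this with the original side by side
```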

So there is no setting I could change to fix the text rendering issue?

I don’t know your specific model of TV but … I doubt it.

The model is the Samsung LA22C450E1.

Well, I can’t find good information on it, which makes me think it is not just an older set but also a low-tier one. So …

But at least now you know what to look for.

OK, can you tell me one thing: why does it work with 1080i and not with 1080p?

Because it is what it is. :man_shrugging:

I have had this problem before; it is just a TV issue. Typically it has been with older “HD Ready” TVs, which itself implies a 720p maximum.

In the end I messed with resolutions and refresh rates until something looked correct and went with that. There have been 1360x768, 1440x900, and 16xx x 1050, at anywhere from 25Hz, 29Hz, 29.97Hz, 30Hz, and 59.9xHz to 60Hz.

The long and short of it is that TVs are not monitors. Resolutions and refresh rates seem to be a wild west between the TV and the set-top box, and the end user simply does not care as long as it “just works”. PCs and their users, especially those who game, tend to be a lot more specific about what is expected, but TVs just don’t care; that is why there are monitors made for the task. I don’t know why that is, but that is how it is in my experience.

TV ≠ Monitor

1080i is HALF the refresh rate essentially.

Every other scan line gets updated each cycle, so at “60Hz” you get the odd lines updated on one cycle and the even lines on the next, for an effective “full refresh” of 30Hz, which is why 30Hz works. 1080i @ “60” and 1080p @ 30 are basically the same, except with interlaced modes you may get “fringing” effects, though motion may look smoother because your eyes are slightly fooled by it.

It’s a bandwidth thing - the TV maybe doesn’t have enough display bandwidth to do 1080p at 60Hz, because it’s old.
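Just to put rough numbers on that (counting active pixels only and ignoring blanking intervals, so this is a simplification):

```python
# Rough pixel-rate comparison (active pixels only, blanking ignored).
def pixels_per_second(width, height, refresh_hz, interlaced=False):
    # Interlaced modes send only half the lines on each pass.
    lines_per_pass = height // 2 if interlaced else height
    return width * lines_per_pass * refresh_hz

for label, args in [("1080p @ 60 Hz", (1920, 1080, 60, False)),
                    ("1080p @ 30 Hz", (1920, 1080, 30, False)),
                    ("1080i @ 60 Hz", (1920, 1080, 60, True))]:
    print(f"{label}: {pixels_per_second(*args) / 1e6:.1f} Mpx/s")

# 1080p @ 60 Hz: 124.4 Mpx/s
# 1080p @ 30 Hz: 62.2 Mpx/s
# 1080i @ 60 Hz: 62.2 Mpx/s
```

So 1080i @ 60 and 1080p @ 30 push the same number of pixels per second, while 1080p @ 60 needs twice as much.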

Hmm, thank you for explaining.

Well I got a new monitor today so I guess this issue is resolved. :smiley:
