So I built a PC a while ago, and it was glorious... for a while. I played games on a 21'' 1080p monitor and it handled every game no problem. But now that I've switched to a 50'' 1080p TV (not an upgrade for some, but I like the bigger screen since this is a shared living room PC for me and my roommates, who pitched in on the parts), I'm getting about 20 fps less in almost every game. I haven't changed anything else, just the display.
That is very odd; the resolution is the same, the only difference is the physical size of the TV, and to the PC that isn't a variable that affects performance.
Question: if you connect the PC back to the original monitor, what do you get? Also, you said 20 fewer frames. How did you measure that, or is it subjective?
Not only does the display size make no difference to the GPU, the GPU itself at 1080p should be spitting out 120-180+ FPS if left uncapped... which would produce screen tearing on a monitor (or TV in this case) below 144 Hz. I would check your TV's refresh rate: if you have vsync on, your TV is probably refreshing 20 Hz slower than the monitor, because the screen itself is certainly not taxing your GPU.
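If you want to check what the PC actually thinks the TV is running at (rather than trusting the TV's menu), a quick script can read the current display mode. This is just a sketch assuming a Windows box and the primary display, using the standard GDI GetDeviceCaps call:

```python
import ctypes

# GetDeviceCaps indices (standard Windows GDI constants)
HORZRES, VERTRES, VREFRESH = 8, 10, 116

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

hdc = user32.GetDC(None)  # device context for the primary display
width = gdi32.GetDeviceCaps(hdc, HORZRES)
height = gdi32.GetDeviceCaps(hdc, VERTRES)
refresh = gdi32.GetDeviceCaps(hdc, VREFRESH)
user32.ReleaseDC(None, hdc)

print(f"Current mode: {width}x{height} @ {refresh} Hz")
```

If that prints 60 Hz on the monitor but something lower on the TV (which can happen over HDMI if the TV advertises a limited mode), vsync would explain the drop.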
Try Unigine Valley or some similar benchmark on the monitor and then on the TV, and take a look at GPU usage. If it's rendering slower on the TV because of a frame cap, it shouldn't be taxing the GPU as much. Another thought: maybe that motion-interpolation soap-opera setting is turned on and wigging things out?
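If you want numbers instead of a gut feel, something like this logs GPU load once a second while the benchmark runs, so you can compare the monitor run against the TV run. Sketch only, assuming a single NVIDIA card with nvidia-smi on the PATH (AMD users would reach for a different tool):

```python
import subprocess
import time

# nvidia-smi query: GPU utilization (%), SM clock (MHz), temperature (C)
QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,clocks.sm,temperature.gpu",
    "--format=csv,noheader,nounits",
]

# Log once a second for roughly a minute while the benchmark is running.
for _ in range(60):
    util, clock, temp = subprocess.check_output(QUERY, text=True).strip().split(", ")
    print(f"GPU util: {util}%  SM clock: {clock} MHz  temp: {temp} C")
    time.sleep(1)
```

Same frame rate drop with noticeably lower utilization points at a cap (vsync / refresh rate); lower frame rate with the GPU still pegged points at something else.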
How are you monitoring your FPS? Are you just eyeballing it? Have you tried using the monitor again to see if the FPS goes back up? How are you connecting to the TV?