The 1080p gaming on a 4K monitor with Looking Glass thought experiment

Hello everyone, I have been bashing my head against this and would like to hear your opinions.

I have a TR system with an AMD HD 4850 for the Linux host and a GTX 960 on loan from a friend for the Windows guest. I traded my HD 7950 for the Nvidia card to test hardware transcoding.

I have been planning a monitor upgrade, from my circa 2009 1080p LG 24", for years. Unfortunately the monitor I want does not exist and if it did I wouldn’t be able to afford it.

Since I want a bump to 32", 4K is the reasonable option for me. I don’t like 1440p because it is not an integer multiple of 1080, so either 1080p or 4K content reportedly looks funny due to the non-integer scaling. 1440p is mainly suitable for gaming in my opinion, and I am more of a content consumer and occasional gamer (I haven’t gamed for a couple of months).

I am also planning a GPU upgrade in Q4 2019/early 2020 but nothing close to a 2080 ti that can game at 4K 60 comfortably, unless prices drop or Navi destroys.

What I have been thinking about:

  1. 4K is what I want for browsing and content consumption.

  2. 1080p-to-4K scaling is bad. Ideally we need integer scaling as explained here: http://tanalin.com/en/articles/lossless-scaling/ which is somewhat available on Linux with Nvidia cards or xrandr. It might also be coming to future AMD Windows drivers: https://www.reddit.com/r/Amd/comments/b5r2xy/please_vote_for_gpu_integer_scaling_support_in/

  3. I want to have the option to game. I don’t really play AAA titles; I enjoy strategy and indie games such as Prison Architect, Kerbal Space Program, Rust and Natural Selection 2. I do own BF1 though, and the desire to play AAA might arise.

  4. New GPU probably will not be able to handle 4K gaming. I am looking at RX 580, Vega 56/64 and GTX 1060/1070/1080. That means running games at 1080p and upscaling to 4K. This is also viable with the HD 7950 as the gaming GPU.

  5. GPU reset issues. My beefier GPU should probably be allocated to the Windows VM, and all gaming should probably be done in the Windows VM. Switching the GPU from Windows to Linux and back without reboots might be problematic.

  6. Given all of the above, what would the proper configuration be in order to game at 1080p in the Windows VM and upscale to my monitor?
    A. Set the Windows resolution to 1080p, use fullscreen Looking Glass and try to manage the scaling on Linux? Does Looking Glass even work that way?
    B. Set the Windows resolution to 4K and the game resolution to 1080p? That means the scaling is handled by the Windows GPU driver, which currently has no option for integer scaling.
    C. ??
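The non-integer scaling problem from point 2 can be shown with a tiny Python sketch (pure lists, not any real GPU scaler): at an exact 2x factor (1080p to 4K), nearest-neighbour duplicates every source pixel uniformly, while at the 4/3 factor of 1080p-on-1440p some pixels get doubled and others don’t, which is exactly the unevenness that looks funny.

```python
# Sketch: nearest-neighbour scaling of a 1-D row of pixel values.
def scale_row(row, factor):
    out_len = int(len(row) * factor)
    # Each output pixel samples the nearest source pixel.
    return [row[int(i / factor)] for i in range(out_len)]

row = [10, 20, 30, 40]

# 2x (1080p -> 4K): every pixel becomes exactly two pixels - lossless.
print(scale_row(row, 2))      # -> [10, 10, 20, 20, 30, 30, 40, 40]

# 4/3 (1080p -> 1440p): the first pixel is doubled, the rest are not.
print(scale_row(row, 4 / 3))  # -> [10, 10, 20, 30, 40]
```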
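For option A, the host-side scaling could in principle be done with xrandr: render at half resolution and let X upscale with a nearest-neighbour filter. This needs X server 1.20+ for `--filter`, and the output name `DP-0` below is a placeholder (check `xrandr --query` for yours). A sketch, untested on this exact setup:

```shell
# Sketch: keep the panel in its native 4K mode, render the desktop at
# 1920x1080, and let X upscale it 2x with nearest-neighbour so each
# pixel becomes an exact 2x2 block. DP-0 is an assumed output name.
xrandr --output DP-0 --mode 3840x2160 --scale 0.5x0.5 --filter nearest
```

The fullscreen Looking Glass client would then run on that 1080p desktop.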

Really looking forward to your opinions

I have very little to add to this, but I thought I would mention it. I played a game, in Windows, at 1080p on a 4K monitor and it looked like hot garbage.

I’d be interested to see if your idea works. I’m optimistic but skeptical.


This is exactly why I got a 32" 4K MVA panel. A GTX 1080 can actually handle native 4K with reduced settings. That’s your best bet for the games you’re planning to play.

Vega cards to my knowledge still have the reset bug in GPU passthrough.

NVIDIA is still king when it comes to its NVENC encoding. I’d get a GTX 1060 6GB if you want NVENC on your host, in addition to a capture card like a Magewell. (Looking Glass currently has no direct OBS integration, so a capture card is better than Looking Glass for this purpose.)

GTX 1080 + GTX 1060 6GB combo is what I’m currently using with my 4960X and it’s working quite well.

As for Looking Glass scaling, 1:1 is always the way to go if you don’t want it to look nasty. Borderless windowed fullscreen is what I would use, but the current implementation requires really fast memory copy speeds for 4K. At this point it’s better to use a hardware KVM if you only use one monitor. (A Level1 KVM would work really well here.)

As an alternative to Looking Glass, a direct preview of a Magewell capture card’s input with its low-latency mode on would perform better than Looking Glass, because it is not relying on DXGI for desktop capture. If that card is isolated on the same die as the PCIe lanes going to your host GPU, you’re not sending any memory operations over Infinity Fabric, which improves performance if you’ve pinned the VM to the single die where your guest GPU has its PCIe lanes, in UMA mode.
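The die pinning described here would look roughly like the following fragment of a libvirt domain XML. The vCPU count, CPU IDs and NUMA node number are illustrative for a two-die Threadripper, not taken from the OP’s system; check `numactl --hardware` or `lstopo` for the real topology.

```xml
<!-- Sketch: pin the guest's vCPUs and memory to the die that owns the
     guest GPU's PCIe lanes. All IDs below are made-up examples. -->
<vcpu placement='static'>8</vcpu>
<cputune>
  <vcpupin vcpu='0' cpuset='8'/>
  <vcpupin vcpu='1' cpuset='9'/>
  <!-- ... one vcpupin per vCPU, all on the same NUMA node ... -->
</cputune>
<numatune>
  <memory mode='strict' nodeset='1'/>
</numatune>
```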


I should be able to implement this as a shader in the client. Please open a feature request as a GitHub issue on the Looking Glass project, and when I find some time I will have a look into it.

Edit: done, try the latest master build in git

Upscaled from 320x240

Upscaled from 640x480


You are amazing! Thanks so much for your work! I haven’t had a chance to try it yet but I will. I hope it also gets implemented in the AMDGPU driver at some point.

That is exactly what I am looking at: 32", 4K, VA, specifically the LG 32UD59. May I ask which model your monitor is?


I hope you come back and tell us how this went, I’m really interested! :smiley:

ViewSonic VX3211-4K-MHD. It has some dark grey uniformity issues, but it takes IEC power rather than a DC barrel. So many of these monitors require DC barrels that it isn’t funny. Jack's Hardware - ViewSonic VX3211-4K-MHD (and NVIDIA FreeSync) review