Independent GPU for gaming in a concurrent RDP session

I have a PC with an R9 Fury X and an RX 580. My brother and I use a Windows 10 PC concurrently: one of us over an RDP session and the other seated directly at the machine. Yes, the RDP DLL is patched for now to test concurrent user sessions, which are normally only allowed on Windows Server. The problem: if we play two different games (the same game can't be run twice, since two instances of the same exe can't run on one machine), only the first GPU renders for both the directly seated user and the RDP user, which halves the FPS for each of us.

Now I found a trick to use the second GPU for the RDP user. First I log in to my account with my monitor connected to the RX 580 and disable the Fury X through Device Manager. The second user then connects to the PC via RDP and runs a game (Dota 2). While the second user is connected, I re-enable the Fury X in Device Manager and move my monitor to the Fury X. Now if I fire up any game, the Fury X renders my session. Voila! Two GPUs running independently for two users.

The performance in this case is still poor (worse than sharing one GPU between both users). That could be down to several other factors: PCIe lane bandwidth, a CPU bottleneck, RAM bandwidth. I suspect a CPU bottleneck, but I have doubts because CPU utilization is not pegged at 100%; it hovers around 70-90% while I play Apex Legends and my brother plays Dota 2.

There is one more thing I tried, with less nefarious tricks. I can assign the iGPU to render dota2.exe using Game Mode > Graphics settings in Windows 10, where Windows designates the iGPU as the power-saving GPU and the Fury X as the high-performance GPU. This combination (iGPU + Fury X) performs better than RX 580 + Fury X. If there were a way to force Windows to treat the RX 580 as the power-saving GPU, resources might be managed better, and I could theoretically get the maximum output of both GPUs, provided there is no CPU bottleneck.
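For what it's worth, the per-app assignments that the Graphics settings page writes can be scripted, since they end up in a registry key under HKCU. A hedged sketch of a .reg file that pins an exe to the power-saving GPU (the Dota 2 path below is a placeholder for your actual install path; note this only picks between the adapters Windows has already classified, and I'm not aware of a documented tweak that re-classifies which adapter counts as "power saving"):

```
Windows Registry Editor Version 5.00

; Per-app GPU preference store used by Settings > Graphics settings.
; Value name = full path to the exe (placeholder path below).
; GpuPreference=1 -> power-saving GPU, 2 -> high-performance GPU, 0 -> let Windows decide.
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Program Files (x86)\\Steam\\steamapps\\common\\dota 2 beta\\game\\bin\\win64\\dota2.exe"="GpuPreference=1;"
```

This is per-user (HKCU), so in theory each concurrent session could carry its own preferences, which is why it seemed relevant to this experiment.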

Please share anything you know about forcing Windows to choose the second GPU as the power-saving GPU via some kind of registry tweak, or about assigning the second GPU to the RDP session. And if any of you have a beefier system than mine, give these methods a try and share your results.

My system:
i7-4790K
16 GB 1600 MHz RAM
RX 580 + R9 Fury X
Windows 10, version 1903
Board: MSI Z97 Gaming 3


Please correct me if I'm wrong here:

it seems the goal is for you and your brother to play games at the same time on the same machine, using separate graphics cards.

I don't see how two Windows 10 VMs, each with one of the graphics cards, wouldn't accomplish this goal.

1st) I don't want to split my CPU cores in half. When one of us is playing a lightweight game, the other user can use more of the CPU, and vice versa. Sometimes I am using SolidWorks, where my CPU utilization is low, and the other user can use most of the CPU cores at the same time.
2nd) I don't want to split the small amount of RAM either. Both users can use RAM dynamically as their workloads require. With VMs, both users would load independent OSes, whereas in this setup the core OS modules are loaded only once, so less RAM is used than with two VMs.
3rd) I just wanted to experiment with this weird setup. I couldn't find anyone else doing it on the internet, except for multi-seat setups using software called ASTER, which achieves everything I said above and actually performs well on a Threadripper system I built for a friend. ASTER works better than a two-VM gaming setup since it is native and shares RAM and CPU dynamically.

A QEMU/KVM setup could be made to suit such dynamic needs, but I wouldn't recommend that approach due to how complicated it would be.

If you're willing to abandon Windows, a Linux system can be set up to force different GPUs for different users' graphical sessions.

Very limited game support and software availability in my use case.

Now that is cool, I did not know that was possible.

You might want to look at the people doing driver mods to make mining GPUs without display outputs work as high-performance GPUs. You would need the opposite kind of mod: making the 580 show up as a power-saving GPU. I don't know the details or where to find them, but that is probably the route you want to research.