Multi GPU / Monitors

Hi, I got a cheap GPU, weird right? Anyway, the downside is it's only got one DVI output, which doesn't work for me since I need multiple monitors. Can I just buy any GPU, throw it in, and connect the extra monitors through that card? Or maybe even connect all the monitors through the second GPU? I assume this might carry a latency penalty, if it's possible at all.
The primary GPU would be an AMD Vega 64 (Asus ROG Strix), and I thought I'd throw in something like an old Quadro or similar. I'm running Linux (Arch, latest kernel).
Is this possible and if so what do I have to watch out for?

Should be fine; I run an AMD W6800 + W6600 at my work office with no issues at all on my monitors! Mixing GPU vendors might break something, though I'm not 100% sure on that. You can always find an old FirePro/Radeon card cheap these days if all you need is display out, and that should work as long as both cards use the same driver, I think.
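If you want to sanity-check that both cards end up on the same kernel driver before buying anything, this should tell you (standard pciutils, nothing fancy):

```bash
# List GPUs and the kernel driver each one is bound to; both cards
# showing the same driver (e.g. amdgpu) is a good sign they'll coexist.
lspci -k | grep -EA3 'VGA|3D|Display'
```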

Ok…

Ok…

…and now I am confused.

The Vega 64 should have at least two DisplayPorts and an HDMI, or a couple of HDMI ports, as well as the DVI connector.

Is your issue that all your monitors only have DVI inputs? If so, you can get a passive adapter cable for a couple of dollars.

Or, if I have misunderstood your setup, please let me know.

The Vega 64 is a very nice card. Good luck.

Nah, it's a 'special edition' (maybe mining-targeted?). They saved a couple of bucks by not populating the HDMI and DisplayPort interfaces, which is why I could snatch it cheap… idiot me only noticed far too late, and I couldn't return it. At the same time I realized that adding a cheap GPU just for connectivity is still cheaper than returning this card and getting the 'proper' one, so why not go for that solution. That's what I was hoping, anyway.

Ah ok, makes more sense

A cheap USB to DVI/HDMI adapter should also work for your situation, and it won't take up a PCIe slot. Have a look for old docking stations with DisplayLink. Lenovo ones are usually under $20 and "just work".
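On Arch, getting DisplayLink going is roughly this (a sketch; the AUR package names are from memory, so double-check them):

```bash
# evdi kernel module + DisplayLink userspace driver from the AUR.
yay -S evdi displaylink
sudo systemctl enable --now displaylink.service
# The dock's outputs should then show up as an extra provider.
xrandr --listproviders
```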

The only hang-up with passive DVI to HDMI adapters is that you're limited to single-link speeds, which generally means 1080p@60. Not necessarily a deal breaker, but something to keep in mind.
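If you want to sanity-check a mode against the single-link cap, the stock cvt tool does the math for you; single-link DVI tops out around a 165 MHz pixel clock:

```bash
# cvt prints a modeline; the first number after the mode name is the
# pixel clock in MHz.
cvt -r 1920 1080 60    # reduced blanking, comes in well under 165 MHz
cvt -r 2560 1440 60    # well over 165 MHz, so dual-link territory
```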

Multiple GPUs in Windows has worked well for me, but my attempts in Linux have been… not quite as smooth, beyond using the native open-source drivers to get SOMETHING to display on the screen.

You have to watch out for all sorts of things. Sadly, most things GTK are now completely broken for multi-GPU/XScreen. If you do a Wayland/RandR setup (depending on which) you may see horrid, unacceptable performance, and you will have to hold a lot of packages back. I'm running an old XFCE because they were slowly breaking multi-GPU/XScreen during the 4.12 release; anything newer is unusable, as GTK will no longer allow XScreens to be enumerated. Even XScreenSaver (as of 6.02) is broken on multi-GPU/XScreen if you're running X.
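For reference, the multi-XScreen ("Zaphod") setup itself is still plain old xorg.conf; a rough skeleton looks like this (BusIDs and driver names are placeholders, pull the real ones from lspci):

```
# /etc/X11/xorg.conf (sketch) -- two cards, two independent XScreens.
Section "Device"
    Identifier "GPU0"
    Driver     "amdgpu"
    BusID      "PCI:3:0:0"
EndSection

Section "Device"
    Identifier "GPU1"
    Driver     "amdgpu"
    BusID      "PCI:4:0:0"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "GPU0"
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "GPU1"
EndSection

Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "Screen0"
    Screen 1 "Screen1" RightOf "Screen0"
EndSection
```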

Given the terrible issues with RandR across more than one GPU (and the vague docs for the recently added provider syntax), Wayland will be your only hope unless you go X and really curate the applications you use.
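The provider commands themselves are short; it's the behavior across drivers that's the gamble:

```bash
# Enumerate GPUs as RandR providers; indices come from this listing
# and can differ between boots.
xrandr --listproviders
# Render on provider 0 (the fast card) and scan out through provider 1's
# connectors ("reverse PRIME"):
xrandr --setprovideroutputsource 1 0
xrandr --auto
```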

Also note that if you go Wayland you're throwing a lot away. Wayland is the death of multi-GPU. Touted as the future… it's the grave. That's an oversimplification, but it's still the case: Wayland basically turns your GPUs into a master and slaves. The master GPU gets ALL the work dumped on it… ALL of it, while the slaves basically do nothing but wait for the master to hand them a framebuffer to output. Given that hierarchy, the WORST GPU in the array sets the ceiling. If, for example, you had an RTX 3090 and a Radeon R5 250… guess which one dictates what games you can play? If the Radeon ends up as the master, not only do you lose all the processing power of the 3090, the weakest GPU also has ALL the demand placed on it… talk about bass-ackwards. And if you have, say, four 3090s, only ONE does everything and the rest are just $5000 breakout boards to more screens… pure waste. You can't delegate workloads or launch on specific GPUs like you can with X.

With all that said, X is the best choice for multi-GPU… but the software ecosystem is trying to bully you away from it, breaking functionality that's worked for well over 20 years… because.
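For what it's worth, the per-GPU delegation I mentioned looks like this under X (glxgears/glxinfo just as stand-ins):

```bash
# Separate XScreens let you pick the GPU per launch via the screen
# number after the dot:
DISPLAY=:0.0 glxgears &    # first GPU/XScreen
DISPLAY=:0.1 glxgears &    # second GPU/XScreen
# On a single combined screen, PRIME render offload is per process:
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```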

As a side note, KDE is still kind of working, but it has some quirks I didn't care enough to see if you can work around. I've also been playing with going super old school with CTWM, or anything that adheres to choice rather than this "we're competing with Apple" BS that GNOME/GTK is pushing on everyone.

Thanks for the info. I'll be running i3 as the WM, though a lot of things will obviously still use GTK under the hood… I'd prefer to stay on X, but I guess I'll test with some other old cards first before grabbing one more.

IIRC i3 has some extra fun issues with multi-GPU, but I'm sure you can work it out. And yeah, anything GTK will sadly cause you some grief.

Another issue to note, which might not be obvious: applications that use GPU acceleration (which, comically, includes A LOT of terminals… and yet they still don't have any of that fancy 3D hacker crap the 1990s hacker movies showed ;)). The strong point of using X (yes, even on AMD/Radeon ;p) is that per-GPU task delegation, but if something uses specific GPU features for acceleration, you won't be able to have a window (or more) on each GPU unless it's an application that does proper instancing.

For example, Brave does this correctly, so I have 3 windows open: 2 on one GPU and a separate instance on another. Pale Moon I also run as 2 instances on 2 other separate XScreens. Sakura is my go-to terminal, along with SpaceFM; Geany I can also instance, and I always have instances running on each GPU…
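If you want to replicate the browser setup, the trick with Chromium-based browsers is a separate profile per instance, otherwise the second launch just re-parents to the first process (binary name and paths here are only examples):

```bash
# One real instance per XScreen/GPU via distinct profiles:
DISPLAY=:0.0 brave --user-data-dir="$HOME/.brave-s0" &
DISPLAY=:0.1 brave --user-data-dir="$HOME/.brave-s1" &
```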

The main takeaway is that there's a lot to watch out for, but it can be done. Just shoot me a message if you've got issues or questions and I'll help as best I can.
