Using two GPUs from different brands

Couldn’t figure out which category to put this under, but I figure I just need to change my config, so I put it here. Anyway:
I recently got some new parts and installed Linux, hoping to do PCI passthrough. My friend gave me his old GT 610 to use for Linux, so I could use my RX380 for Windows. My problem is:
If I leave my 380 (with which Linux was installed) in the “main” PCIe slot and put the 610 in the secondary one, the 610 isn’t detected; it’s not even listed by lspci.
If I put the 610 in the main slot and the 380 in the secondary one, I can’t even boot.
If I only put in the 610, I can boot and everything works, so I know the 610 works.

If I can get both cards to show up in lspci I can probably deal with the xorg config part; I just need help with that.
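For reference, this is roughly how I’ve been checking what actually gets enumerated (the grep pattern is just there to filter for display adapters):

# list only display adapters, with vendor/device IDs
lspci -nn | grep -iE "vga|3d|display"

With both cards installed, only the 380 shows up in that output.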

Before doing all this I made sure I had no xorg.conf file, so I am running on the default config.
I am running Arch Linux.
My hardware:
Ryzen 1700
Asus Prime B350-Plus
WD Black M.2 SSD
16GB RAM running @ 2400 (iirc)
CM GX 650W

TL;DR: Get an X370 board.


Which slot are you using as the secondary slot exactly?
As I see it you’re basically using too many PCIe lanes.

The B350 chipset can’t split the 16 PCIe lanes coming from the CPU across two slots (so no x8/x8), and only a handful of boards are wired so that the 4 M.2 lanes can go to a PCIe slot instead.

Even then, though, this wouldn’t work, because those 4 M.2 lanes are already taken by the WD Black.

I haven’t read the manual because… lazy, but here’s the deal. This board has:

  • 1x PCIe 3.0 x16 -> This goes to the primary GPU and cannot be split
  • 1x PCIe 2.0 x16 (x4 electrical) -> This could take a secondary GPU, provided you find one that runs on either 2 PCIe 3.0 lanes or 4 PCIe 2.0 lanes (same bandwidth, roughly 2 GB/s either way). As far as I know, Nvidia GPUs have required at least 4 PCIe 3.0 lanes to even initialise basically since the dawn of time, which would be the reason it isn’t being detected (there’s a quick lspci check for this after the list).
  • 2x PCIe 2.0 x1 -> not helpful for you in any way
  • 2x PCI -> OK.
  • 1x M.2/M-Key (PCIe 3.0 x4/SATA, 22110/2280/2260/2242) -> Those 4 lanes can technically be wired to another slot, but on this board they aren’t. You could use an M.2-to-PCIe-slot adapter to give those 4 lanes to the second GPU, but whether that works depends on the UEFI, and it’s somewhat fiddly to use that GPU as the host GPU (possible, though). Then again, that slot is already taken by your WD Black.
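By the way, if a card ever does get detected in that second slot, you can see both what the slot/card advertise and what link actually got negotiated with something like this (the 01:00.0 address is just a placeholder, take the real one from plain lspci):

# LnkCap = advertised link, LnkSta = what was actually negotiated
sudo lspci -vv -s 01:00.0 | grep -E "LnkCap|LnkSta"

That at least tells you whether you’re sitting at 2.0 x4 or something even lower.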

You won’t have any luck with that configuration unless you find a GPU that runs on 4 PCIe 2.0 lanes.


Thanks, I’ll get to finding another GPU, then. Thankfully I got this one for free.

Honestly it’s going to be rather hard to find a GPU that still works and is capable of running in PCIe 2.0 x4 mode.

But if you want to go for it, here’s a filter:
GPUs with PCIe 2.0 x1, x8 and x16
Most of the x8 and x16 GPUs “should” work in x4 mode, but no guarantee. The GT 610 is also PCIe 2.0 x16 and it apparently doesn’t work in x4 so… yeah.

To be honest though, if you still have the chance to return that mainboard, I would personally rather change the board: for the price you’d pay for a (new or used) low-end card after searching for days, you can already get some X370 boards that aren’t complete garbage. Depends on your local prices of course, but it’s your decision :slight_smile:

Most likely this is a matter of switching between driver sets.
Currently the Linux “gaming” scene is held back by Nvidia being absolutely hostile about it (they have their drivers down to an art, but everything is locked down as f…, and having worked with that stuff I can say it’s atrocious; they really want you to stay in their ecosystem once you’ve shelled out).
AMD, meanwhile, gives about as much thought to Linux as physicists give to anything below −273.15 °C (which is next to nothing), since the money is elsewhere.


Found this. Might go for it if I don’t find something used locally.
Returning the motherboard isn’t really an option, because I bought it in the US and I’m in Costa Rica, so the return shipping isn’t really worth it.

I managed to get the 610 running: in the BIOS settings I was able to limit the second PCIe slot to x2. However, I can only limit the second slot, so I won’t be able to run the host GPU in the “primary” slot.
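Both cards show up in lspci now. Since I’ll need to know which driver grabs each card for the passthrough part later, I’m checking that with something like:

# -k adds the "Kernel driver in use" / "Kernel modules" lines per device
lspci -nnk | grep -iE -A3 "vga|3d"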

Now for the next “problem”: when I generate an xorg.conf using

Xorg -configure

The “devices” seem to be on ports 33 and 37, but when I run lspci they’re on ports 21 and 25, and lspci actually only goes up to 27.
Should I use the port numbers on the generated xorg.conf, or the ones on lspci?
I’m leaning towards the ones on the xorg.conf since they seem to get the job done.

Also, my AMD card shows up twice in the xorg.conf file: Card0 on port 37:0:0 and Card1 on 37:0:1. Screen0 uses Card0 and Screen1 uses Card1. Is this normal?
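One thing I think I’ve figured out about the numbers, for anyone finding this later: if I’m reading xorg.conf(5) right, the BusID values in xorg.conf are decimal while lspci prints the bus numbers in hex, so 33 and 37 are the same devices lspci shows as 21 and 25 (0x21 = 33, 0x25 = 37). The generated entries boil down to something like this (the driver name here is just a guess for my card, not taken from the generated file):

Section "Device"
    Identifier "Card0"
    Driver     "amdgpu"      # or radeon, whichever this card actually uses
    BusID      "PCI:37:0:0"  # decimal; lspci shows the same device as 25:00.0
EndSection

So I’ll stick with the numbers from Xorg -configure.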

Can’t help you with the configuration there as I haven’t set up a system like this so far, sorry.