AMD Ryzen 7 5700G PCIe issues

Hello,

I am currently building my home NAS as well, and I ran into an issue that I need help with. My current specs are (mostly leftovers from older builds):

Motherboard - MSI B450I GAMING PLUS AC - BIOS version 7A40vAG with AGESA ComboAm4v2PI 1.2.0.7
CPU - AMD Ryzen 7 5700G
RAM - 16GB DDR4 3200
Storage - 4 SATA SSDs & 1 NVMe SSD

Today I purchased a Synology E10M20-T1 in order to get 10 Gbps networking plus additional storage via 2 NVMe SSDs. The card is brand new and factory sealed, purchased from a retailer, not eBay or similar. I can’t get it to work at all: the BIOS does not see the drives, the ethernet NIC does not send any packets, and neither Windows nor Ubuntu sees any of the hardware.
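In case it helps with debugging on the Ubuntu side, here’s a minimal sketch (plain Python over the standard sysfs PCI layout, no extra packages) of what I used to list everything the kernel enumerated; if nothing from the card shows up here, the failure is below the OS, at the BIOS/link-training level:

```python
#!/usr/bin/env python3
"""List every enumerated PCI function from sysfs.
Class codes: 0x0108.. = NVMe, 0x02.... = network controller,
0x0604.. = PCI-to-PCI bridge (a PCIe switch appears as several bridges)."""
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    vendor = (dev / "vendor").read_text().strip()
    device = (dev / "device").read_text().strip()
    cls = (dev / "class").read_text().strip()
    print(f"{dev.name}  vendor={vendor} device={device} class={cls}")
```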

I found this topic on this forum, but it’s locked and I can’t reply there. I am wondering if anyone else has run into this issue.

I tried the following:

  • PCIe slot on AUTO - x16
  • PCIe slot bifurcation - x8/x8 & x8/x4/x4
  • Graphics detection disabled & forced IGP only
  • Both CSM & UEFI modes
  • No other drives connected besides the 2 on the Synology card
  • Disabled other devices like WiFi and LAN
  • Checked initially with a Kingston drive and then with Samsung 970 drives

Nothing helped. Does anyone have any insight or solutions?

Thanks in advance!

What you want/need is an x4/x4/whatever mode. Sometimes this is hidden behind a special PCIe RAID mode option (to enable e.g. a 4x M.2 carrier card). Do you see anything like that?

This mini-ITX board only has x8/x8 or x8/x4/x4. I just tried enabling RAID as well, but I see no such option, nor did anything change in the PCIe options. The Synology card is x8, though. I saw Wendell use it recently in a video, but with an AM5 board.

NOTE: I just found this page on the AMD forums, and it seems that with APUs you get x8/x4/x4 while with CPUs you get x4/x4/x4/x4.

I’m sending my buddy an old ConnectX-2 card, as it’s the only x8 device I have lying around, to see if his PC recognizes it. He’s running a 5600G on a B550I AORUS PRO AX board, and it refuses to recognize the Synology card. I feel bad, as I recommended this build to him not knowing the CPU or motherboard BIOS would cause such issues. This will at least narrow it down to either the card throwing a fit with the APU chips or the motherboard not properly running the slot at x8. I’ll report back when I have more info.

@FunktasticLucky

Something is off about this situation:

  • The Synology E10M20-T1 is 100% not a passive PCIe Bifurcation Card like the various models you see for 2 or 4 M.2 NVMe SSDs. I come to that conclusion because with standard PCIe Bifurcation it is IMPOSSIBLE to access 3 separate PCIe devices (NVMe 1 x4, NVMe 2 x4 & 10 GbE Ethernet Adapter) with only 8 PCIe lanes (see the toy lane-budget sketch right after this list).

  • A unique scenario could be that the card requires special BIOS firmware support from Synology, meaning the 8 PCIe lanes from the host (their NAS) get bifurcated into x2 (NVMe 1), x2 (NVMe 2) and x4 (10 GbE Ethernet Adapter). In that case the Synology card wouldn’t be compatible with any standard third-party motherboard, since I have never seen this sort of PCIe Bifurcation capability on any “normal” motherboard.

  • The card itself likely has an active PCIe Gen3 Switch that takes 8 PCIe lanes from the Host and generates 12 (?) PCIe Lanes for its devices (NVMe 1 x4, NVMe 2 x4, 10 GbE Ethernet Adapter x4?).

  • This scenario might be the bad one: Ryzen 4000G and 5000G APUs can only address a maximum of 3 separate PCIe devices on the 16 main PCIe lanes that are generally routed to the two large x16/x8 CPU PCIe slots.

  • That’s why, when a Ryzen 4000G or 5000G APU is installed, the only available PCIe Bifurcation options are x8/x4_x4 or x4_x4/x8; x4_x4_x4_x4 (in the main x16 CPU PCIe slot) and x4_x4/x4_x4 (across the two CPU PCIe slots) are missing, even though Ryzen 4000G and 5000G have the same 16 + 4 + 4 PCIe lanes as Ryzen 3000 and 5000 CPUs (only Gen3 instead of Gen4, but that has nothing to do with PCIe Bifurcation).
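To make the first point above concrete, a toy lane-budget check; the per-endpoint widths are my assumption (x4 per NVMe slot, x4 for the 10 GbE controller):

```python
# Toy lane-budget check: passive bifurcation can only SPLIT the host lanes,
# never multiply them, so the endpoints' combined demand must fit within
# the upstream width. Per-endpoint widths below are assumptions.
HOST_LANES = 8  # the E10M20-T1 is a physical x8 card
endpoints = {"NVMe 1": 4, "NVMe 2": 4, "10 GbE NIC": 4}

needed = sum(endpoints.values())
print(f"endpoints need {needed} lanes, host slot provides {HOST_LANES}")
if needed > HOST_LANES:
    # 12 > 8: impossible with passive bifurcation; only an active PCIe
    # switch can fan 8 upstream lanes out to 12 downstream lanes.
    print("passive bifurcation ruled out -> the card must carry a switch")
```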

Loaded with 2 NVMe SSDs, the card presents 4 PCIe devices:

  1. PCIe Switch
  2. 10 GbE Ethernet Adapter
  3. NVMe 1
  4. NVMe 2
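If Linux sees the card at all, all four should show up, with the three endpoints sitting behind the switch’s downstream bridges. A small sketch to make that topology visible (assuming the usual sysfs layout, where each function’s resolved device path has its upstream bridge as the parent directory):

```python
from pathlib import Path

# /sys/bus/pci/devices entries are symlinks into the device tree, e.g.
# .../pci0000:00/0000:00:01.1/0000:01:00.0 -- the parent directory of the
# resolved path is the upstream bridge (or the root complex itself).
for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    parent = dev.resolve().parent.name
    print(f"{dev.name}  <- behind {parent}")
```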

Have you tried using only one of the two M.2 NVMe slots?

Don’t enable PCIe Bifurcation in the UEFI; leave it on Auto or x8 for that slot. If you have another PCIe device in the other CPU PCIe slot, then only two more PCIe devices can be operated; in that case, try not installing any NVMe drive on the Synology card and check whether the ethernet adapter is detected then.
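And if the switch’s upstream port does get detected, it’s worth checking what link it actually trained. A minimal sketch (Linux sysfs again; the link attributes only exist for PCIe devices, hence the except):

```python
from pathlib import Path

# Print negotiated vs. maximum link for every device, to spot a slot that
# trained narrower/slower than expected (e.g. the Synology card's upstream
# port coming up at x4 instead of x8).
for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        cur_w = (dev / "current_link_width").read_text().strip()
        max_w = (dev / "max_link_width").read_text().strip()
        cur_s = (dev / "current_link_speed").read_text().strip()
        max_s = (dev / "max_link_speed").read_text().strip()
    except OSError:
        continue  # legacy PCI device or attribute not exposed
    print(f"{dev.name}: width x{cur_w} (max x{max_w}), speed {cur_s} (max {max_s})")
```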

Also check whether these BIOS options are set the way shown in the screenshots (“Auto” doesn’t automatically mean Enabled); these settings “help” with handling multiple PCIe devices in a system. My personal record is 9 NVMe SSDs on X670E without an additional PCIe switch or Tri-Mode HBAs.

It sucks that the AMD APUs have weaker PCIe capabilities than the regular Ryzen CPUs, and the situation has become even worse with the new 8000G APUs :frowning: