A Neverending Story: PCIe 3.0/4.0/5.0 Bifurcation, Adapters, Switches, HBAs, Cables, NVMe Backplanes, Risers & Extensions - The Good, the Bad & the Ugly

I present… the Thunderbolt 4 MULTI-eGPU.
2 AMD Instinct MI60’s on a laptop? hell yeah




Hello everyone, I could use some assistance with a c_payne PCIe host adapter, and this seems like a good place to ask. I created a separate thread…

Any help is appreciated,

Thanks!

lmao love this mad scientist bs. Mind documenting the exact setup so I can give it a go at some point? :^)


The PCIe switch board just needs GND, 3.3V and 12V on the edge connector. I found these by measuring with a multimeter between the pads on the edge connector and the respective power pins on the PCIe slots, then soldered a SATA power cable to them (black = GND, orange = 3.3V, yellow = 12V). Then connect a riser cable between the computer and the switch board, plug in the GPUs, and it just works.

Then I hooked it up to a Thunderbolt M.2 SSD enclosure via an M.2 to PCIe x4 adapter, and that also works. So then you have 2 GPUs on a single Thunderbolt cable.

Because those are 32 GB VRAM AMD Instinct MI60 cards, I run LLMs on them using MLC-LLM with ROCm in tensor parallel. Even over Thunderbolt the performance is as good as (or better than) when the MI60s were plugged into my Z790 desktop, where each GPU gets PCIe 4.0 x4. I think that's because, while the cards are connected to the host over only Thunderbolt 3, they can DMA to each other at PCIe 4.0 x16 through the PCIe switch.
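
If you want to check what the cards actually negotiated behind the switch, something like this quick sysfs script should do it (Linux only; it assumes the MI60s enumerate as AMD display-class devices, so adjust the filter otherwise):

```python
#!/usr/bin/env python3
# Print each AMD GPU's negotiated PCIe link and the upstream (switch) port
# it hangs off. Linux sysfs only; adjust the vendor/class filter if your
# cards enumerate differently.
from pathlib import Path

AMD_VENDOR = "0x1002"

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        vendor = (dev / "vendor").read_text().strip()
        pci_class = (dev / "class").read_text().strip()
    except OSError:
        continue
    if vendor != AMD_VENDOR or not pci_class.startswith("0x03"):
        continue  # not an AMD display/compute device
    speed = (dev / "current_link_speed").read_text().strip()
    width = (dev / "current_link_width").read_text().strip()
    parent = dev.resolve().parent.name  # downstream switch port (or root bus)
    print(f"{dev.name}: x{width} @ {speed}, behind {parent}")
```

If the GPUs report x16 at 16 GT/s while the link back up to the Thunderbolt enclosure is much narrower, that's consistent with card-to-card DMA staying local to the switch.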

Now I've ordered another 2 of those PCIe switch boards to turn a single Thunderbolt port into 4 GPUs. And I'm really wondering whether those PCIe switches support bifurcation (e.g. can they do x8 to a GPU and use x4+x4 for some added SSDs per PCIe slot?).

Ultimately I want a 19" rack with GPUs (and SSDs?), connected with a single Thunderbolt cable to my computer or laptop, for running LLMs. With Thunderbolt I can just turn the 19" GPU rack on/off, mainly to save on idle power (these AMD Instincts still consume about 25 W when doing absolutely nothing, so I don't want them continuously powered on like they are when inside my computer).


Whoa, for real? :open_mouth:

Sounds fun lol. Out of curiosity, have you seen this?


Whoa, that’s a nice looking board. Anyone got their hands on one?

EDIT: I’d like to see the bottom of that board. I can’t find any documentation, and I don’t see any decoupling caps on the TX side (left) of the slots, so I wonder how the slots are actually wired electrically.

Do I need to buy one and find out? Unfortunately I'd have no way to power the thing, and I have the same problem with another NVMe backplane I'm trying to use. lol.

Side note: anyone know where I can find a CPU 8-pin EPS to 8-pin EPS female-to-female cable/adapter? I need one for the microhi cable.


I have to build a new workspace for future experiments; at present I don't have the space to hoard new parts.

I feel that lmao. My workbench and testbed is never done… :^(

I need to find me a tool rail for that workbench for less than an arm and a leg. Impossible rn. Might have to build one… Hmm.


Have been following this thread for a while now, but am still not sure how best to approach the thing I’m trying to do.
I would like to have multiple NVMe drives (probably 2) in a consumer chassis with a consumer motherboard (almost certainly gonna be AM5). The drives need to be readily physically swappable (and ideally hot-swap) so I’m thinking either U.2/U.3 drives or M.2 drives in a U.2/3 adapter, since M.2 connectors aren’t rated for lots of insertions.
I would rather not splash out £1000 on a fancy HBA.

Options for achieving this appear to include something like the Icy Dock products for holding U.2/U.3 drives (and maybe adapters/converters for M.2 to U.2), combined with an M.2 to SFF-8643 adapter and cable (or one of the many other NVMe cable/connector types?!), or perhaps PCIe cards with similar connectors.
(One problem I’ve hit is finding a full set of pieces that fit together.)

What would you folks recommend for this?


How many lanes do you want to allocate to this, what generation drives would you like it to support, and what form factor do you want the hot swap bays in? I’m assuming 5.25 inch bays.

If you really want icydock for these, it’s gonna cost you. They charge ludicrous amounts for some basic metal box with a fan and pcb. They’re nice, don’t get me wrong, but the cheapest u.2 cage I found from them is 220 bucks. Wtf. Maybe because they seem more “brand name” ig. If you’re willing to be more flexible with brand or speed or packaging, or even skip the hot swap capability, you can drop the price like a brick.

If you really want icydock, this is what I would recommend as a solution, if it works. Icydock pcie stuff has been known to occasionally be pretty picky about the adapters and cables you use, so this may not work. Sorry :frowning:

2x gen 4 U.2 drives, 8 lanes, needs bifurcation: $312

If you have other parameters for this and want gen 3 or 5 or want 4 drives in 8 lanes etc I’ll try to help you out. Gotta know what you want to help you out, though.


Many thanks for the reply @Rat !

Lanes & generation: 4x gen3 lanes would be the absolute minimum I reckon, but if I went with 2 M.2 slot converters & cables then I’d have 8 lanes at probably gen4 and that would be slightly more desirable. I guess that 2 drives is all I truly need to support for the time being.

I’m planning to buy a consumer X670/X870 (or X670E/X870E) board, fitted with a 7800X3D or more likely a 9800X3D, and I’m not yet sure how flexible they tend to be about splitting the lanes between the PCIe slots (on a first glance, a bunch of boards only offer weird stuff like 16x5.0/4x4.0/2x3.0 or similar). I could live with splitting the primary PCIe slot lanes 8x + 8x with a second slot on boards that offer that. Seems like most boards also share M.2 lanes with at least one of the PCIe slots.

I'm not seeing much info about bifurcation on AM5 boards, so that's probably an entirely separate rabbit hole… Buying an HBA (any cheap & decent ones?) or using a pair of M.2 converters is my baseline plan for avoiding that rabbit hole, but a known-good AM5 mobo recommendation would probably side-step the whole problem.

Form factor: yup, 5.25" bay is likely to be what I’ll go for, but am not stuck on any particular solution cos I simply have so little idea of what decent options exist.

Icy Dock: not even remotely stuck on the idea of using them; if there are better options I’ll gladly look at them. (Their on-site product documentation wasn’t brilliant, and they never replied to a query I sent them.)

I should probably clarify my main use case. It’s pretty simple, so maybe/hopefully I’m missing an obvious way to achieve it:

Summary: my basic need is simply 2 NVMe drives with easy physical swapping of drives, for a RAID1 pair

I have for many years used a pair of disks in RAID1 as the boot volume on my main PC. I swap one out each month (ish) and rotate between a set of 4 or 5 of them. Typical protocol: I shut down the box (which currently runs Windows), remove a drive, boot it back up again, and then connect the next drive (normally the least-recently removed). The hardware RAID controller (Intel Matrix/RST on the current machine) detects an old RAID volume on that disk, which I then delete, and I mark the drive as a spare… rebuild, all good. The reason I like the hot-swap capability is that it means I can boot with a degraded RAID volume (only 1 disk), then insert an old (unwiped) disk and trigger a rebuild. If I lose the hot-swap, I'd have to insert both disks before powering up, and then I'd be worried that bad things would happen if the machine chose the wrong volume to boot from :slight_smile: - I've never actually tried that, though! Clearly, if I were disciplined enough to always remember to wipe the old drive beforehand, it would be OK, but this seems like an accident waiting to happen. Hence I'd love hot swap to be possible.
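
(If I do eventually move this to native Linux with software RAID instead of Intel RST, I imagine the same rotate-and-rebuild protocol would look roughly like the sketch below; the mdadm commands are real, but the array and device names are just placeholders.)

```python
#!/usr/bin/env python3
# Sketch of the monthly rotate-and-rebuild with Linux mdadm instead of
# Intel RST. /dev/md0 and /dev/nvme1n1p2 are placeholders only.
import subprocess

ARRAY = "/dev/md0"            # the RAID1 volume
RETURNING = "/dev/nvme1n1p2"  # the member partition being rotated back in

def run(*cmd):
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Wipe the stale RAID metadata on the returning disk so the array can't
#    mistake the old copy for the current one.
run("mdadm", "--zero-superblock", RETURNING)

# 2. Add it back as a fresh member; mdadm starts rebuilding onto it.
run("mdadm", "--manage", ARRAY, "--add", RETURNING)

# 3. Watch the resync progress.
print(open("/proc/mdstat").read())
```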

And now the reason for looking for new hardware: the stupidly easy way I’ve done this for the last 15+ years (on a few different PCs) has been with SATA drives (I also have some M.2 drives but not the boot drive). Not sure how many insertions the SATA connectors are rated for, but they have never given me any trouble.
Naturally, I’d like the PC I’m soon going to build to have a boot volume using NVMe drives, given that SATA bandwidth is now pretty bad by comparison. I don’t like the idea of rotating M.2 drives in their native sockets, or even in a PCIe card, hence the desire to explore other means of connecting them (or indeed U.2/U.3 drives, but they’re more expensive than I expected).

Until looking for solutions, I had fully expected that it would be trivial to achieve this with M.2 drives - e.g. cheap HBA, some cables, some replaceable caddies (1 per drive so they never get removed from the caddies), bingo, job done. But apparently it’s not so simple. This does make me wonder if I’m swimming against the current somehow.

Oh, and after some of the posts earlier in this thread, I don't imagine anyone will be surprised to hear that the UK pricing for the $60 Supermicro HBA above is just bananas: £130 + shipping is typical for the few UK sources (~$170). It's actually cheaper to buy and ship a card from the US :rofl:


Haha, I feel you, it's so annoying seeing the prices of computer hardware in the USA vs the UK.

I have been eyeing this as an Icy Dock alternative if you don't mind the increase in size (I really don't get why Icy Dock products are so expensive, but I guess they have no real competition):

ZhenLoong 4 Bay U.2 NVMe SSD Hot Swap Storage


I'm 90% sure it supports PCIe 4.0, but I can't say for certain. I'm assuming it does based on it using SlimSAS (SFF-8654), and based on the tests in the description it looks like they achieved >5000 MB/s on a single drive, which is above PCIe 3.0 x4 speeds.
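
For anyone curious, the back-of-the-envelope math behind that inference (raw transfer rate x 4 lanes, less 128b/130b line coding, ignoring protocol overhead):

```python
# Approximate usable bandwidth of an x4 link per PCIe generation, to show
# why >5000 MB/s on one drive rules out a gen 3 x4 link.
for gen, gts in {"3.0": 8, "4.0": 16, "5.0": 32}.items():
    gb_per_s = gts * 4 * (128 / 130) / 8   # GT/s per lane -> GB/s over 4 lanes
    print(f"PCIe {gen} x4: ~{gb_per_s:.2f} GB/s")
# PCIe 3.0 x4: ~3.94 GB/s
# PCIe 4.0 x4: ~7.88 GB/s
# PCIe 5.0 x4: ~15.75 GB/s
```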

Edit: I asked the seller and they said it supports 4.0.

But yeah, it's 1/3 to 1/4 the price of an Icy Dock product; as you can see, though, it's pretty big at 3x 5.25" slots.

I was going to buy it eventually, once Intel W880 motherboards arrive, so I can build an ECC Arrow Lake server. I just haven't seen anyone review it on YouTube, Reddit, or forums like this one.

Also since you only need PCIe 3.0, you should look at getting a

PLX switch card that supports 8 NVMe’s

then all you need is cables.


Nice box - seems to be designed to optionally sit on a desk with external cables.
But wow, it really is BIG. I hadn’t been planning to have 3 unused 5.25" slots, but I might still consider that product if I can find a chassis that will fit the space under my desk and still let me fit in an optical drive with 3 more bays free.
I guess it’ll at least have much better ventilation than the Icy Dock single-slot version! :rofl:

I like that card, thanks! A sensibly-priced gen 3 card would do very nicely until the equivalent gen 4 products drop to sensible prices, and even if they never do, then no biggie (4 lanes at gen 3 isn’t exactly slow).

Do you happen to know if there are any special drivers required for PCIe switch products like that? (I might move to native Linux instead of just running VMs when I need Linux.)

Hmmm, that's a good question. I don't think PCIe switch cards require drivers, especially on Linux; I'm not sure about Windows.
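
On Linux, a PCIe switch just shows up as a set of standard PCI-to-PCI bridges that the generic pcieport driver claims, which you can confirm with a quick sysfs check like this (nothing vendor-specific assumed):

```python
#!/usr/bin/env python3
# List PCI-to-PCI bridge ports (class 0x0604xx) and the kernel driver bound
# to each; a PLX/Broadcom switch's upstream and downstream ports should all
# show "pcieport" with no vendor driver involved.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if not pci_class.startswith("0x0604"):
        continue  # not a bridge port
    driver_link = dev / "driver"
    driver = driver_link.resolve().name if driver_link.exists() else "(none)"
    vendor = (dev / "vendor").read_text().strip()
    print(f"{dev.name}  vendor={vendor}  driver={driver}")
```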

And yeah, like you said, PCIe Gen 3 isn't slow. I'm hoping for a cheaper Gen 4 card too, but I don't think they'll drop in price for a while (1-2 years).

Haha, I think that enclosure works best in DIY PC cases, like the IKEA-cabinet gaming PC / home lab builds you see on Reddit. It's rare to see PC cases with a lot of 5.25" cages these days, though you can find a 4U server case with up to 9x 5.25" bays.

Or I reckon you could put it on top of or next to your PC case, but it seems like the max cable length for SlimSAS is 1 m, so it'll be very tight.

Last night I was looking hard on AliExpress for U.2 hot-swap enclosures like the Icy Docks, and I found these 4 listings.

Aliexpress Link 1

I googled the model number in the picture and found this website:
N-29NVMO

It looks like an upcoming Icy Dock competitor & the prices are better. I even asked the seller if they sell a 4-bay version, and they showed me a picture of this:
N-49NVMO
for about 1849 CNY / £200 / $260, close to half the price of Icy Dock.

The next 3 are basically the same product, just different sellers (probably all resellers). I asked one seller and they said it supports PCIe 4.0, soooo wow, pretty cheap for £85 / $110 + shipping. It looks like it's powered by a Molex connector, the NVMe connections are SlimSAS x4, and it even looks like you can use it for SATA with the correct cable. Better yet, it's just a single 5.25" bay!

Aliexpress Link 2
Aliexpress Link 3
Aliexpress Link 4


