Any recommended SAS RAID/expander cards for an LSI MegaRAID SAS 9361-8i?

Can anyone recommend a SAS/RAID expander card for an LSI MegaRAID SAS 9361-8i?

I’m looking for a PCI-style card, as it will be installed in a double-width computer case, but I can make an integrated-style board work with some “modification”.

I’m looking for reliability + speed + compatibility and a large number of drives. The case I am using supports 20 HDDs by default and, with modifications, significantly more.

I’m currently using one of those HP expander cards, but it is slow and a little bit sketchy. Considering how much money I’ve sunk into the system at this point, it makes sense to me to spend $300 or so to complete the configuration.

OS, drive types (SAS/SATA), backplanes?

Windows 10 64-bit
WD Red 4 TB drives (quantity 10) in a RAID6 array (usable capacity worked out below)

No backplanes.
Just a double-width Lian Li case and old PC hardware (PCI Express cards).
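
For reference, RAID6 reserves two drives’ worth of space for parity, so the usable capacity of this array works out as follows (simple math, assuming equal-sized drives and ignoring filesystem overhead):

```python
# RAID6 usable capacity: two drives' worth of space goes to parity.
drives, size_tb = 10, 4             # ten WD Red 4 TB drives, as above
raw_tb = drives * size_tb           # 40 TB raw
usable_tb = (drives - 2) * size_tb  # 32 TB usable
print(raw_tb, usable_tb)
```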

Would a picture help?

Did you check if you have SMR drives?

They are on the older side, so I’m thinking they aren’t SMR.
How do I check?

http://issmrdrive.com/
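
If you’d rather check offline, something like this minimal sketch works, assuming you keep your own model-to-recording-tech table (the entries below are just the WD Red models I’m sure about; the site above is the authoritative list):

```python
# Minimal offline check: map WD Red model strings to recording technology.
# Hypothetical, hand-maintained table: the EFRX Reds are CMR, while the
# 2-6 TB EFAX Reds are the SMR ones; defer to the site above for anything else.

RECORDING_TECH = {
    "WD40EFRX": "CMR",  # the 4 TB drives in this thread
    "WD30EFRX": "CMR",
    "WD40EFAX": "SMR",  # listed for contrast; not in this array
}

def recording_tech(model: str) -> str:
    # MegaRAID reports IDs like "WDCWD40EFRX68W", so substring-match the model.
    for prefix, tech in RECORDING_TECH.items():
        if prefix in model:
            return tech
    return "unknown"

print(recording_tech("WDCWD40EFRX68W"))  # -> CMR
```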

If a board that can be placed “anywhere” in the case (instead of an AIC) would work for you, I can recommend the Intel RES3TV360 RAID expander.

Sadly, even though it has 36 ports in total (nine x4 connectors), it can only operate with up to 24 drives directly connected to it.

I think the intended connection configuration is:

HBA → two x4 uplinks to SAS3 expander A → one x4 uplink from expander A to a second SAS3 expander B

This way you can connect up to 48 drives.
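
To sanity-check those numbers, here is a quick sketch of the phy budget (assuming 36 phys per expander in x4 groups and the 24-drive firmware cap mentioned above):

```python
# Phy budget for the daisy-chained layout above.
# Assumptions: 36 phys per RES3TV360 (nine x4 connectors), and the
# firmware cap of 24 directly attached drives mentioned above.

PHYS_PER_EXPANDER = 36
FIRMWARE_DRIVE_CAP = 24

def drive_slots(uplink_phys: int, downlink_phys: int) -> int:
    """Phys left for drives after reserving uplink/downlink lanes, capped by firmware."""
    free = PHYS_PER_EXPANDER - uplink_phys - downlink_phys
    return min(free, FIRMWARE_DRIVE_CAP)

# Expander A: two x4 uplinks to the HBA, one x4 downlink to expander B.
a = drive_slots(uplink_phys=8, downlink_phys=4)  # 36 - 12 = 24 -> 24 drives
# Expander B: one x4 uplink to expander A, no further downlink.
b = drive_slots(uplink_phys=4, downlink_phys=0)  # 32 free, capped at 24 drives

print(a, b, a + b)  # 24 24 48
```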

Our data shows that the WD Red 4 TB drives (WD40EFRX) are NOT based on SMR technology; they are based on CMR.

Product IDs according to MegaRAID:
WDCWD40EFRX68W
WDCWD40EFRX68W
WDCWD40EFRX68N
WDCWD40EFRX68N
WDCWD40EFRX68N
WDCWD40EFRX68N
WDCWD40EFRX68W
WDCWD40EFRX68W
WDCWD40EFRX68N
WDCWD40EFRX68W

don’t know:
WD20EARS (I think this was the original 2TB drive WD made)
WD120EMFZ
WD60EFRX

not SMR:
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD40EFRX
WD30EFRX
WD30EFRX
WD30EFRX
WD80EFAX
WD80EMAZ
WD80EMAZ
WD30EFRX


Would the RES3FV288 also be a good option? It also caps at 24 drives.

However, it appears to be more expensive than the RES3TV360 (street price, not MSRP), and I would need to plug it into a PCIe slot for power instead of using Molex/4-pin power.

I wonder if there is a PCI adapter for the RES3TV360 so I can slot it into the chassis like a PCI card? Hmmm…

Any idea how tricky it is to update the firmware on that Intel SAS expander? In the documentation I only saw Intel RAID cards on the supported list… which doesn’t necessarily mean it won’t work with my LSI card, but…

Not very hard. Probably risky, since you can brick it, but it can’t be that bad.

By chance, I updated the firmware on one of mine today with a non-Intel retail Broadcom HBA, so it’s fairly easy, barely an inconvenience…

Just stick to the order in the readme files (update the firmware first, then update the CPLD) and watch for Intel’s typos, which they for some reason refuse to correct.

First, update the general firmware. You can copy and adapt the CMD command line from the readme file, BUT correct the .bat.bat to .bat.

After a power cycle, update the CPLD part the same way (I don’t know what that’s for), though you have to correct the readme’s command from Update_CPLD.bat to CPLD_Update.bat (or vice versa).

Again, a power cycle, and the expander is up to date.

Edit: Also, bundled with the SAS expander board is an EPS 4-pin 12 V to Molex Y-adapter.

So you can still connect the EPS plug to the motherboard, while a 12 V-only Molex connector for the expander sits on the second leg.

Why would they bother including an adapter like that? The RES3TV360 looks like it has a standard Molex/4-pin power connector to me.

When I updated the HP expander card, I had to use Linux and buy a couple of cheap old RAID cards to get the firmware update to work. Major pain in the @$$.

I suppose the majority of customers for these kinds of components are OEMs with pre-built systems where every PSU connector is spoken for.

But yes, it’s an ordinary Molex connector on the board; you can just use a suitable cable directly from your PSU.

That makes sense. The enterprise world is a weird one.

I finally pulled the trigger after two years and bought the RES3TV360.
It’s crazy that you can only connect 24 drives to it despite it having 9 ports.
It’s also crazy that it costs nearly as much as (and at most retailers more than) what I paid for my LSI RAID card!

I also purchased an Akasa AK-HDA-10BK 2.5″ SSD/HDD mounting bracket for a PCIe/PCI slot. The plan is to drill a couple of holes in it and “make it fit”. Basically, I want a PCI-style mount but with the freedom of Molex/4-pin power.

The alternative plan is to harvest a PCI mounting bracket from an old graphics or network card and see if I can find an appropriately sized metal plate at the hardware store, or “make it fit” with some metal shears, which I currently do not own.

Yes, I also find this quite unsatisfactory; see my thread regarding exactly this limitation:

Previously, I thought this expander would work exactly like a network switch, where it’s up to the user whether to use “link aggregation” with multiple connections to another comparable device or to use all available ports for “clients”.

I had seen something about the 24-HDD limitation on ServeTheHome or maybe some other site. The theory was that it’s an artificial firmware limitation, because the card was usually mated with some Intel disk shelf or server that only had a maximum capacity of 24 HDDs.

The maximum speed for SATA II is 3.0 Gb/s, or 300 MB/s after encoding overhead. That’s the maximum speed I am supposedly getting from my HDDs when connected to the HP expander. I don’t know if I will get a performance increase from upgrading to the Intel card and SATA III at 6.0 Gb/s, but I hope so.
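
For anyone wondering where the 300 MB/s figure comes from: SATA uses 8b/10b encoding, so usable throughput is the line rate times 8/10. A quick sanity-check sketch:

```python
# Usable SATA throughput: line rate (Gb/s) * 8/10 (8b/10b encoding), / 8 bits per byte.

def sata_throughput_mb_s(line_rate_gbps: float) -> float:
    """Approximate usable throughput in MB/s for a given SATA line rate."""
    return line_rate_gbps * 1000 * (8 / 10) / 8

print(sata_throughput_mb_s(1.5))  # SATA I:   150.0 MB/s
print(sata_throughput_mb_s(3.0))  # SATA II:  300.0 MB/s
print(sata_throughput_mb_s(6.0))  # SATA III: 600.0 MB/s
```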

It just doesn’t feel right using some janky used server card pulled from production servers over 9 years ago.

I had bought two of them off eBay two years ago, and for some unknown reason the HDDs were only connecting at SATA I (1.5 Gb/s) on one of the HP expander cards. That definitely caused a performance hit, even on 5400 RPM HDDs.
