Failing at 10 Gbps local network

I am trying to set up a 10 Gbps network with a few computers here at my office.

I have a file server running Unraid with an ASUS XG-C100C 10 Gbps RJ45 network card.

In one of my workstations I have the 10 Gbps network card that comes with the ASUS Zenith Extreme motherboard.

To connect it all up I'm using a Netgear XS505M-100EUS switch with five 10 Gbps ports.

On my Unraid server I configured the eth interface with a static IP, leaving everything else as default. On my workstation I set everything up with defaults as well.

I enabled Tunable (enable Direct IO) in Unraid.

I then connected via SMB, went to a share, grabbed a large 10 GB file, and tried to copy it.
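For reference, the same kind of share can also be mounted from a Linux command line, which takes the file manager out of the equation when testing. This is just a sketch; the hostname `tower` (Unraid's default), the share name, and the mount options are placeholders to adapt:

```shell
# Mount an SMB share from a Linux client (requires cifs-utils)
# //tower/share and /mnt/share are placeholders for your own paths
sudo mkdir -p /mnt/share
sudo mount -t cifs //tower/share /mnt/share -o guest,vers=3.0

# Then time a copy of a large file to measure real throughput
time cp /mnt/share/bigfile.bin /tmp/
```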

Here is where all my problems start: I am barely seeing 40 MB/s transfer speed. I assumed my IronWolf 4 TB drives were going to be a limitation, but 40 MB/s?

I also tried to copy a file from the NVMe cache, but it was still super slow (about 50 MB/s).

I assume I'm doing something wrong and there is more I need to configure.

Can anyone help me here?

I'm using Cat6A cable and the runs are super short, less than 3 meters each.

Try copying between SSDs on both hosts. Also check the link throughput using iperf.
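A minimal sketch of such a test, assuming iperf3 is installed on both machines (the address 10.0.0.2 is a placeholder for the server's 10 GbE IP):

```shell
# On the server: start a listener on the default port
iperf3 -s -p 5201

# On the client: run a 10-second TCP test against the server's
# 10 GbE address (replace 10.0.0.2 with your own)
iperf3 -c 10.0.0.2 -p 5201 -t 10

# Several parallel streams often get closer to line rate on 10 GbE
iperf3 -c 10.0.0.2 -p 5201 -P 4
```

This measures the raw network path, so it tells you whether the bottleneck is the link itself or the disks on either end.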

10G is quite a challenge on a home network, just my humble opinion.

A single spinning HDD should do about 150 MB/s reading and 80-100 MB/s writing. But Unraid does some magic in the background; I'm not well versed with it, however.
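For a rough sense of scale, converting those ballpark disk figures between units shows why a single HDD is nowhere near 10 GbE (these are the approximate numbers above, not measurements):

```shell
# ~150 MB/s sequential HDD read, expressed in megabits per second
echo $((150 * 8))        # 1200 Mbit/s, about 12% of a 10 GbE link

# 10 GbE line rate (10000 Mbit/s) expressed in megabytes per second
echo $((10000 / 8))      # 1250 MB/s
```

So a 10 GbE link is only saturated by fast NVMe storage (or several disks in parallel); a lone HDD tops out around the gigabit-and-a-bit range.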

Yeah, iperf was more confusing than helpful. Using my gigabit network IP:

iperf -c -p 5201
Client connecting to, TCP port 5201
TCP window size: 85.0 KByte (default)
[  3] local port 41196 connected with port 5201
write failed: Connection reset by peer
[ ID] Interval       Transfer     Bandwidth
[  3]  0.0- 7.9 sec   885 MBytes   943 Mbits/sec

Now using my 10 Gbps network:

iperf -c -p 5201
Client connecting to, TCP port 5201
TCP window size: 85.0 KByte (default)
[  3] local port 57298 connected with port 5201
[ ID] Interval       Transfer     Bandwidth
[  3]  0.0-10.0 sec   553 MBytes   463 Mbits/sec

The lights on the switch indicate they are both on 10 Gbps links.

I do see on the server that after the test ends I get this error:

iperf3: error - unable to receive parameters from client: Connection reset by peer

Do you have a crossover cable handy? Try cabling directly between hosts, just to rule out the switch.

Those connection reset messages are odd.

Yeah, I saw that while Unraid was using iperf3, my Pop!_OS machine is using iperf2, so it may be related to that.
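That mismatch alone can explain the resets: iperf2 and iperf3 speak different, incompatible protocols, so the version has to match on both ends. A sketch, assuming the Pop!_OS box has iperf3 in its repos (the IP is a placeholder):

```shell
# Install iperf3 on the Pop!_OS client (Debian/Ubuntu-based)
sudo apt install iperf3

# Server side (Unraid already ships iperf3)
iperf3 -s

# Client side: now both ends speak the same protocol
# (replace 10.0.0.2 with your server's 10 GbE address)
iperf3 -c 10.0.0.2
```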

I do not have a crossover cable handy, but I can make one when I get back home from the office. I will let you know.

Thanks for the info.

@gordonthree the crossover cable worked. It turns out my problem was that I was assuming eth3 was my PCIe Ethernet interface,

and that eth0/eth1 were the motherboard interfaces.

Totally wrong: eth0 was the 10 Gbps interface. I used ethtool to check the link capabilities of the interfaces.
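For anyone hitting the same thing, here is a quick sketch of mapping interface names to hardware with ethtool (interface names will differ per machine):

```shell
# Show the negotiated link speed of one interface
# (a 10 GbE link reports "Speed: 10000Mb/s")
ethtool eth0 | grep Speed

# Show which kernel driver backs each interface, which helps
# tell the onboard NICs apart from the PCIe card
for n in eth0 eth1 eth3; do
    echo "$n: $(ethtool -i "$n" | grep '^driver')"
done
```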

Now everything is working perfectly.


Glad to hear it was something simple!
