It also applies if you have 10 gigabit adapters in both client and server -- these network adapters are so fast that the data typically outpaces what a single CPU core can keep up with. In that case, multichannel still provides a benefit because it splits up the data stream so that multiple CPU cores help handle the workload of moving the data around.
All of the following transfers were done from SSD to SSD. Another thing to note is that the desktop and the NAS are on two different switches, which are linked with a 4x1Gbps LAG.
I have got SMB Multichannel to establish from my desktop to my NAS, as seen in the following PowerShell window.
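For anyone wanting to check the same thing, these are the standard PowerShell cmdlets for confirming multichannel is actually established (run on the client during a transfer; output will obviously depend on your NICs and server names):

```shell
# Confirm the SMB client has multichannel enabled
Get-SmbClientConfiguration | Select-Object EnableMultiChannel

# List the client NICs SMB considers usable (link speed, RSS/RDMA capability)
Get-SmbClientNetworkInterface

# While a transfer is running: multiple rows for the same server name
# means multichannel is established across those interfaces
Get-SmbMultichannelConnection

# On the server side, the equivalent view of current sessions
Get-SmbSession
```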
When it works, it works quite well, but when it doesn't, it's very hard, if not impossible, to fix, mainly because of the lack of specific documentation. I'm happy this video/post exists, because it's helping to remedy that issue. My config was set up a while ago, and I didn't know whether it would work at all using different NICs, over an LACP link, across different OSes. It's a really nice feature of Windows, but in my opinion it needs more information published about it before it can really come into its own. As for this post, I'm not really looking for a solution; I'm not too fussed about it at this stage, though if you have any ideas about how to fix it they are most welcome. I mainly made this post to share my configuration and my experience using SMB Multichannel.
Glad to see a Mikrotik router on your channel there @wendell. I've been using them at work for about a year now for many clients, and the amount of stuff you can do with them easily is fantastic (although I'm still trying to get my head around VLANs on them). I currently have the CRS-125-25G rack mount version for my home network.
I will try this out when I run network cables in my new house and see how I go... thankfully I think I'll have an easy time. My server runs MS Server 2012 R2 and all my computers (bar my media center PC) run Win 10.
I may be able to help with some basic config on the Mikrotik, as a lot of the material I have seen is by people who clearly speak English as a second or third language, so it's not always 100% clear what they are saying.
What the H-E-Double Hockey Sticks!!! Someone making YouTube content on a regular basis with more in-depth useful information, followed up with written articles and laced with witty geek humor? Please stop immediately and just make 7 min or less videos that are basically just extended commercials for products. (<<< satire)
For real though, Wendell/L1Techs videos are quickly becoming the best tech-tube content out there if you want more than an on-the-surface product review that doesn't tell you much more than what's written on a spec sheet. Please keep making more videos like this one. Kudos to you, good sir.
I had a 2012 R2 server on which I attempted to accomplish this (in 2014) but never could get it to work correctly. I tried putting each NIC on a separate VLAN on the same switch, then on different VLANs on different switches, but never got anywhere with it.
I'm not at that job anymore, but it was frustrating. I eventually set up independent NICs to independent VMs, though, so that accomplished somewhat of what I wanted anyway, I suppose.
This video reminds me: I'd really love a series of videos on setting up a nice, nerdy, and secure home network. For example, a small network with a few wired and wireless machines, a NAS, maybe some streaming devices, an HTPC, etc... Level1HomeNetwork, if you will.
Building your own router
Setting up a proper wireless network with WAPs
Properly utilizing a NAS, whether built from scratch or not. I have a NAS but I don't think I'm using it to its full potential. So proper scheduled backups.
Setting up a small web server for fiddling around
Setting up your own cloud {Like other videos Wendell has made}
Accessing a system remotely. So like remote desktop and such, but actually secure, without just opening up your PC to the wild wild west.
I'm a noob, and sometimes I get so overwhelmed with all you can do with a home network that I end up spinning my wheels. If someone else is interested in something similar, I can make a forum thread for it.
My NAS (WD MyCloud) has 2 Ethernet ports and I can bond them. My desktop has one GbE port and wireless of some kind. Could I bond those to get at least slightly better speeds to my NAS?
Yes, you are correct; however, the Linux side does look promising. It looks like SMB Multichannel support is starting development in Samba 4.4.0. I like the statement NOT RECOMMENDED TO USE MULTI-CHANNEL IN PRODUCTION; sounds like it's not for the faint of heart.
Hey Wendell and awesome techies on the forum. I love reading content here, but this is my first time posting. I know this is an old thread, but it helped me set up SMB3 Multichannel on multiple systems for clients, and now that I am finally setting it up for myself at my house, I am stumped. I have two 4-port HP 365T NICs, one in my “Server” running Windows 10, one in my workstation, also running Windows 10. I have used all the PowerShell tools learned in this post to ensure that I am indeed running in a multichannel configuration. However…
When transferring from my “Server” to my workstation, I am getting full speed; it is awesome seeing transfers happen at 400MB/s+. But when transferring back from my workstation to my “Server” I am only getting 110MB/s consistently. When I check usage in Task Manager, it is indeed using all 4 ports, but each port is utilized at exactly 25 percent.
Some more background info: the test is being conducted with 500GB Samsung 850 EVOs on both sides, and both are working properly and able to hit full speed when transferring locally. I have played with RSS quite a bit, and unfortunately when I step up the RSS threads on the server side, it only ends up slowing down the speed. Now here is the rub, which I know is most likely part of the problem, but I was hoping you guys could help me around it in your wisdom…
My “Server” is just my old retired workstation. It's an FX 8370 system that used to be my main rig before I upgraded to Ryzen earlier this year, with a modest stable OC to 4.5GHz and 32GB of 2400MHz RAM with tight timings, and it runs beautifully otherwise. I am using it as my PVR and NAS, and it does a great job at both. The fact that I can read files from it at 400MB/s is awesome; I just wish I could transfer to it at somewhere around the same speed. I was hoping you guys could help me configure something I maybe missed on the server side.
Maybe someone has experience setting this up with FX processors and has a trick up their sleeve to fix it. CPU utilization only seems to peg one or two cores during a transfer, so there seems to be headroom in this processor for a faster transfer over the NIC; I'm just not sure what I am doing wrong. Also, I have confirmed my NIC is slotted in a x4 slot and running at full speed; just to be sure, I even threw it in a x8 slot and got identical results. Any help would be appreciated. Here is a screenshot of what my Task Manager looks like during a transfer to the “Server”. Thank you guys!!
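Not a guaranteed fix, but one thing worth trying on the receiving side is spreading each port's RSS queues across different cores, since Windows can otherwise steer all four ports' receive work onto the same processor. A sketch with the standard cmdlets (the adapter names and processor numbers here are just examples for an 8-core box, not your actual config):

```shell
# Show current RSS settings per adapter (queues, processor range)
Get-NetAdapterRss

# Give each of the four ports its own pair of cores so their receive
# queues don't all land on core 0 (names/numbers are placeholders)
Set-NetAdapterRss -Name "Ethernet 1" -BaseProcessorNumber 0 -MaxProcessors 2
Set-NetAdapterRss -Name "Ethernet 2" -BaseProcessorNumber 2 -MaxProcessors 2
Set-NetAdapterRss -Name "Ethernet 3" -BaseProcessorNumber 4 -MaxProcessors 2
Set-NetAdapterRss -Name "Ethernet 4" -BaseProcessorNumber 6 -MaxProcessors 2
```

Then re-run the transfer and watch whether the per-core load in Task Manager actually spreads out.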
Did anyone work out, or have any luck with, doing this where a Linux box is the client rather than Windows? In my case I am trying to have Plex on Ubuntu be the client with a CIFS mount.
The only thing I could find in the cifs docs is that all you need is to mount with vers=3.0 or better against the server, but the max speed I can get is 108~109 MB/s with dd, and using mount -t cifs it only returns 1 address. @RageBone if you have a link for GVFS and SMB multichannel that would be awesome.
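For reference, the in-kernel cifs client did eventually gain multichannel support (around kernel 5.5, behind the multichannel and max_channels mount options), so vers=3.0 alone isn't enough on older kernels. On a kernel that has it, the mount looks roughly like this (server, share, mount point, and credentials file are placeholders):

```shell
# Requires a cifs.ko with multichannel support (roughly kernel 5.5+)
sudo mount -t cifs //nas/share /mnt/nas \
    -o vers=3.0,multichannel,max_channels=4,credentials=/etc/cifs-creds

# Check whether extra channels were actually established
# (exact wording of the channel info varies by kernel version)
grep -i channel /proc/fs/cifs/DebugData
```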
@thefathacker
Well, I used to use the file manager Thunar under Manjaro, which comes with GVFS. That used to work OK-ish until I needed to access the share from other applications.
GVFS doesn't mount the share into the system, and I don't know if you can configure it to do that.
So, if you have a Samba server somewhere, smbstatus -dX is your friend.
It can at least tell you which protocol is being used.
And then you watch out for your CPU usage.
I guess it's working for me, since both client and server show a very nicely spread CPU utilization.
As to my particular setup, I'm using Mellanox CX3 VPI cards set to 40GbE.
iperf is capable of 25 Gbit/s, limited by single-thread CPU performance on the E5 2628L V4 in the server.
SMB performance is pretty much limited by the two RAID1 WD Reds I'm using:
a consistent 400MB/s until some cache fills up and it stalls to flush it out to disk.
I had it at 800MB/s from some cache to the client.