MSI B350 Tomahawk Review & Linux Test | Level One Techs

Is it true that splitting the cards between the CPU and Southbridge chipset reduces performance?
I understand that Crossfire on X370 runs both cards from the CPU and at x8 speeds, is that correct?

Possibly and yes

Thanks!
I'm looking at a new RX 580 and seeing how far I can bring my 480 up to a comparable speed.

Basically, regarding Crossfire you'd have to look at each board's manual. Narrow down your choice with other features you want (and possibly the aesthetics) and then look at the respective manuals. Manufacturers do different things on different boards.

I do believe all the current X370 boards use an x8/x8 config in Crossfire, but that might change in the future. That's the problem with so few PCIe lanes, but since x8/x8 doesn't make a real-world difference it's fine I guess.

Well, here is the problem:

I almost bought this product based on "Crossfire support" in the description.
I disagree that x8 doesn't make a difference, based on the info I've read. It makes a huge difference in benchmarks.
Had I done so, and later found out what I now know to be true, I'd have been salty!
@wendell you might want to amend your review just a tad!
As it turns out, I wound up getting a great deal on an X370 board that cost less than this Tomahawk.
I owe a debt of gratitude to the enthusiast who clued me in on this.

Well, AMD only specifies that at least 4 lanes be used for the second (or additional) card(s), so technically that isn't wrong. x4 probably will make a difference; x8, however, does not.

Not sure which benchmarks you saw, but there are a few videos that show it doesn't make a difference currently (I think even Linus did a few over the years).
If it changed with the GeForce 10 or RX 400 series I'd like to see that too, though.

/edit

This was with a 1080; the "differences" are within the margin of error.

OK, I will take a deeper look, and if I can find where those figures were posted I'll reply elsewhere, as I don't want to take this review any further off track. This is an interesting discussion!

I had this RAM too; it was pretty wonky. Just a friendly heads-up.

It really makes no difference. 1-2% max, if even that. This has been fairly well established.

Remember, PCIe 3.0 x8 is the same bandwidth as PCIe 2.0 x16. GPUs were running fine on that, as they were on 2.0 x8. So even running a modern GPU on a 3.0 x4 link is really not an issue.
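
For anyone who wants to sanity-check that claim, here's a minimal back-of-envelope sketch in Python (not from the review; the function and names are purely illustrative). The per-lane figures come from the PCIe 2.0/3.0 specs: 5 GT/s with 8b/10b encoding versus 8 GT/s with 128b/130b encoding.

```python
# Rough one-direction PCIe link bandwidth, per the 2.0/3.0 specs:
#   PCIe 2.0: 5 GT/s per lane, 8b/10b encoding   ->  0.5   GB/s per lane
#   PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane
TRANSFER_RATE_GT = {"2.0": 5.0, "3.0": 8.0}        # giga-transfers/s per lane
ENCODING_EFFICIENCY = {"2.0": 8 / 10, "3.0": 128 / 130}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Usable bandwidth in GB/s: rate * efficiency gives Gbit/s, /8 for bytes."""
    return TRANSFER_RATE_GT[gen] * ENCODING_EFFICIENCY[gen] * lanes / 8

for gen, lanes in [("2.0", 16), ("3.0", 8), ("3.0", 4)]:
    print(f"PCIe {gen} x{lanes}: {link_bandwidth_gbs(gen, lanes):.2f} GB/s")

# PCIe 2.0 x16: 8.00 GB/s
# PCIe 3.0 x8:  7.88 GB/s  <- essentially the same as 2.0 x16
# PCIe 3.0 x4:  3.94 GB/s
```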

Perhaps if the lanes came from the chipset you might see more variance, but Crossfire is so problematic and variable anyway that I doubt you'd notice.

Tbh, if you are thinking about Crossfire I would avoid it entirely. Just buy a single better card. You will be much happier, with no need to mess about with PCIe lanes on lower-end chipsets.


What would you suggest? I need 32GB and don't have a ton of money.

Dual-rank 16GB sticks would be better.


Was just about to type that lol.

Although Ryzen only "supports" dual-rank in dual-channel at 2400MHz, unofficially it will work at higher speeds. I had some old dual-rank HyperX Predator that ran at 2933 without issue.

OK, we'll continue the discussion here if nobody objects:

@mihawk90
That video you linked doesn't address Crossfire whatsoever, so let's not get it twisted.

While I respect the opinions of the community, this whole "avoid multi-GPU, go single-card solution" meme was created by Nvidia when they decided to sell their 1080 series at $700 apiece and thwarted SLI on the 1060, where it made the most sense (as in cents)! Couple this with statements saying Nvidia will no longer support SLI, then flip-flopping on 3-way and 4-way special keys, etc. What is also sad is how the Tech Tubers just accepted this nonsense, even those who traditionally ran SLI themselves!

Not only is AMD Crossfire a completely different process, AMD has actually advertised it as they played hardball in the sub-$300 GPU market, where buying two cards saves a lot of money.
RX 480 x 2 =/> GTX 1080

Stand by, I'm still searching for where I saw those results.

Tech Deals is working on some SLI testing, but he said game support is super spotty, which sucks.

This is because Nvidia's market share has scared the devs off from even trying to optimize games for it.
Really, it's the acquiescence of the Tech Tubers and enthusiasts that has allowed this to happen, as they are more apt to accept the new, higher price structure.

Umm no.

Look, I know you're really clearly anti-Nvidia (I don't like their business practices either), but this conspiracy shit is nonsense. While their limiting SLI on the 1060, possibly to boost 1070 sales, is shitty and true, it has nothing to do with multi-GPU setups as a whole or with Crossfire on the AMD side. Nor is it a new phenomenon.

Multi-GPU, from the early days of Voodoo SLI, has always been extremely problematic. Getting it to work is often a challenge, and when it does, scaling is often poor. While there are a few titles that buck this trend, the vast majority do not. The fact is most people do not buy multiple GPUs anyway, so there is little motivation for a developer to optimize for them. Furthermore, the extra power consumption and heat produced, as well as the additional space required, make it a difficult option for many, especially when scaling is not guaranteed. Many times the second GPU will be doing nothing or working very little, all while producing stuttering that hurts game performance.

It honestly doesn't even make much financial sense either. Two 1060s or two 480s are theoretically capable of being faster than a 1070/1080, but the fact is they often are not, and for the $450-500 you're spending on two GPUs you could just buy a 1070/80 and have a better experience.

I have run two 4850s, two 660s, two 7870s, and two 290s in the past, and I am currently running two 390s. Hell, I even tri-fired 390s once because I wanted to punish my PSU for being a very naughty boy.... I have first-hand experience. This isn't some made-up "meme" or "conspiracy", as you put it. The problems are real and always have been. I wouldn't recommend it to anyone unless you're already at the top of the stack and need even more power. It just isn't a great experience, and it never has been. It is not some new "meme."

They advertised it so hard because AMD doesn't have a single GPU that comes even close to a 1070/80. That was their only option. It's a poor recommendation and is often worse than a 1070/80. It also, as previously mentioned, doesn't really save you anything.

Perhaps in the future, once DX12 multi-adapter and Vulkan multi-GPU support (which, btw, Vulkan only just gained; it didn't work for a while) mature, it will be more viable. For now, though, and the foreseeable future? No.


While the "Flounders Edition" 1080 was selling for $700, 2 x AIB RX 480s were $560, so I just don't understand how this doesn't save me money. Now, if Vega comes in at $500...

Dude, this thread is about a motherboard review

Take your derail to a new thread


Can we get a block diagram?
@wendell


that motherboard has a nice aesthetic
